4.1.27 Information Return Case Selection

Manual Transmittal

August 07, 2018

Purpose

(1) This transmits the new Internal Revenue Manual 4.1.27, Information Return Case Selection.

Scope

This IRM describes the workload identification method the IRS uses to identify and select tax returns for the Automated Underreporter (AUR) and Business Underreporter (BUR) programs.

Material Changes

(1) This is a new IRM.

Effect on Other Documents

This IRM incorporates Interim Guidance Memorandum SBSE-04-0817-0052, Return Selection Documentation, Approval, and Review Requirements, dated August 31, 2017.

Audience

SB/SE, Exam Case Selection, Information Return Case Selection (IRCS).

Effective Date

(08-07-2018)


Richard Tierney
Director, Exam Case Selection
SB:S:E:HQ:ECS

Program Scope and Objectives

  1. Purpose: This Internal Revenue Manual (IRM) provides guidance to Information Return Case Selection (IRCS) personnel responsible for identifying Automated Underreporter (AUR) and Business Underreporter (BUR) workload and developing criteria for return selection.

  2. Audience: These procedures apply to IRCS employees assigned to the various programs discussed in this IRM.

  3. Policy Owner: IRCS is a function within Examination Operations.

  4. Program Owner: IRCS, Exam Case Selection, is responsible for the content of this IRM.

  5. SB/SE Program Level Objective: Ensure examinations are initiated based on indicators of non-compliance or on other criteria (such as selection for the National Research Program) identified in the Internal Revenue Manual. In addition, ensure that decisions to survey a return (i.e., not initiate an examination) are reviewed, based upon factors outlined in the Internal Revenue Manual, and approved by an appropriate level of management.

    Note:

    While AUR and BUR notices are not audits but are considered taxpayer contacts per Rev. Proc. 2005-32, the overall selection principles addressed above apply.

  6. IRCS Program Objective: The objective of IRCS is to identify, select and deliver inventory for the AUR and BUR document matching programs within the campuses.

  7. Primary Stakeholders: SB/SE

Background

  1. Underreporter cases are built from two primary sources:

    • The Master File (MF), which contains information reported to the IRS by taxpayers, including current entity information, tax account data, and filed tax returns.

    • The Information Return Master File (IRMF), which contains information submitted by payers.

  2. The Individual Master File (IMF) contains information reported on:

    • Form 1040, U.S. Individual Income Tax Return

    • Form 1040A, U.S. Individual Income Tax Return

    • Form 1040EZ, Income Tax Return for Single or Joint Filers with No Dependents

  3. The Business Master File (BMF) contains information reported on:

    • Form 1120, U.S. Corporation Income Tax Return

    • Form 1041, U.S. Income Tax Return for Estates and Trusts

  4. The IRMF information is matched against the IMF and BMF tax returns to verify that certain income, deductions, and credits that can be supported by information returns are properly reported on the tax return. An AUR or BUR case is identified when a discrepancy is detected between the two data sources. Examples (not all inclusive) of the information returns in the IRMF are:

    • Form W-2, Wage and Tax Statement

    • Form 1099-MISC, Miscellaneous Income

    • Form 1099-PATR, Taxable Distribution Received From Cooperatives

    • Schedule K-1, Shareholder’s Share of Income, Deductions, Credits, etc.

    • Form 1099-INT, Interest Income

    • Form 1099-K, Payment Card and Third Party Network Transactions
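The matching described above can be illustrated with a minimal sketch: total the payer-reported amounts per taxpayer, compare them with what the filed return reports, and flag any meaningful gap. The field names, record shapes, and the $100 threshold are invented for illustration; they are not actual MF/IRMF record layouts or tolerances.

```python
# Toy sketch of IRMF-to-MF document matching. Field names and the
# threshold are illustrative assumptions, not actual IRS values.

def find_discrepancies(filed_returns, info_returns, threshold=100):
    """Compare income reported on filed returns against payer-reported
    totals and flag taxpayers whose amounts differ by more than threshold."""
    # Total payer-reported income per taxpayer identification number (TIN)
    reported_by_payers = {}
    for ir in info_returns:
        reported_by_payers[ir["tin"]] = reported_by_payers.get(ir["tin"], 0) + ir["amount"]

    cases = []
    for ret in filed_returns:
        payer_total = reported_by_payers.get(ret["tin"], 0)
        discrepancy = payer_total - ret["income_reported"]
        if abs(discrepancy) > threshold:
            cases.append({"tin": ret["tin"], "discrepancy": discrepancy})
    return cases

returns = [{"tin": "001", "income_reported": 50000},
           {"tin": "002", "income_reported": 30000}]
irs_docs = [{"tin": "001", "amount": 50000},
            {"tin": "002", "amount": 42000}]
print(find_discrepancies(returns, irs_docs))
```

Here only the second taxpayer is flagged, because the payer-reported total exceeds the filed amount by more than the threshold.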

Authority

  1. Internal Revenue Code (IRC) Section 61 states that "except as otherwise provided in this subtitle, gross income means all income from whatever source derived." Discovering all income received by a taxpayer is the starting point for determining which items of income are includable in gross income and subject to Federal income tax.

  2. IRC Sections 6031-6060 and regulations thereunder, contain the requirements for the filing of information returns for reporting purposes.

  3. Revenue Procedure 2005-32 states taxpayer contacts initiated to verify a discrepancy between the taxpayer’s tax return and an information return, are “contacts and other actions not considered an examination, inspection or reopening”.

Roles and Responsibilities

  1. The Director of Exam Case Selection (ECS) is responsible for providing policy guidance on the selection of cases and delivery of inventory for SB/SE Examination, including AUR and BUR.

  2. The IRCS Program Manager is responsible for:

    1. Establishing internal controls relating to each program or process.

    2. Ensuring that instructions are communicated to and carried out by the assigned employees.

    3. Setting policy, establishing procedures and guidelines, and ensuring they are applied consistently.

    4. Revising policies as required and redesigning processes as necessary in response to legislative changes.

    5. Performing managerial reviews of selection decisions during each phase of the selection and delivery process.

    6. Reviewing and approving the business requirements written for case selection and Uniform Work Requests (UWRs) annually.

    7. Providing input into the AUR and BUR work plans.

    8. Providing input to the SB/SE Division Strategic Plan.

  3. IRCS analysts are responsible for:

    1. Selecting and delivering AUR and BUR inventory multiple times per year.

    2. Monitoring results and performing analytics to improve selection.

    3. Providing campus support and guidance on workload selection and delivery-related issues.

    4. Working collaboratively with Planning and Performance Analysis (PPA) to develop annual workload plans.

Program Management and Review

  1. The Director of ECS (or designee) ensures that the IRCS Program Manager's reviews adhere to case selection policy.

  2. Managerial reviews of selection decisions occur during each phase of the selection and delivery process.

Terms/Definitions/Acronyms

  1. See Exhibit 4.1.27-1, Glossary and Acronyms below.

  2. See IRM Exhibit 4.19.3-1, Abbreviations, and IRM Exhibit 4.19.3-2, Glossary, for a list of abbreviations and definitions used in AUR processing.

  3. See IRM Exhibit 4.119.4-1, Acronyms, and IRM Exhibit 4.119.4-2, Glossary, for a list of abbreviations and definitions used in BUR processing.

Related Resources

  1. The following IRMs are used by the AUR campuses.

    • IRM 1.4.19, Automated Underreporter Technical and Clerical Managers and Coordinators Guide

    • IRM 4.19.2, IMF Automated Underreporter (AUR) Control

    • IRM 4.19.3, IMF Automated Underreporter Program

    • IRM 4.19.7, IMF Automated Underreporter (AUR) Technical System Procedures

  2. The following IRMs are used by the BUR campus.

    • IRM 4.119.1, BMF Underreporter (BUR) Control

    • IRM 4.119.3, BMF Underreporter (BUR) Manager and Coordinator Handbook

    • IRM 4.119.4, BMF Underreporter (BUR) Program

Selection Principles for Information Return Document Matching

  1. Planning and Performance Analysis (PPA) determines the workplan volumes by program and location.

  2. To achieve the workplan volumes, IRCS analysts consider the following factors during the case selection process:

    • Issue coverage

    • Revenue protection

    • Taxpayer burden

    • Influence on taxpayer behavior (repeaters)

    • Prior document matching case results (including "agreed" and "no change" rates)

    • Fairness and integrity

    • Available resources

    • Level of automation

    • Reliability of data sources

    • Taxpayer Bill of Rights

  3. SB/SE supports administration of tax law by selecting returns to audit. The primary objective in selecting returns for examination is to promote the highest degree of voluntary compliance on the part of taxpayers while making the most efficient use of finite examination staffing and other resources. Employees must exercise their professional judgment, not personal opinions, when making return selection decisions. As explained in Policy Statement 1-236, IRS employees are expected to carry out our duties with integrity and fairness.

    • To ensure fairness to the taxpaying public, our Examination Workplan provides a balanced approach for return delivery and allocation of resources to address areas of the Tax Gap by taking into account factors such as income levels, geographic locations, and return types.

    • To ensure an equitable process for all taxpayers, return selection decisions are made utilizing available experience and/or statistics indicating the probability of substantial error. No one individual can control the examination selection decision-making process. We limit involvement to only those employees whose duties require them to be included.

    • To ensure fairness to each taxpayer whose return is selected, individual return selection decisions are based on the information contained on the taxpayer’s return and/or the underlying relevant tax law. Managerial as well as quality reviews of selection decisions occur during each phase of the selection and assignment process.

    Note:

    While AUR and BUR notices are not audits but are considered taxpayer contacts per Rev. Proc. 2005-32, the overall selection principles addressed above apply.

Workload Identification for AUR and BUR

  1. Case selection for AUR and BUR begins with a set of business requirements that are used to define when a case is brought into correlation for potential selection. These business requirements define aspects of the tax returns to be matched to the respective information returns received. When discrepancies arise, the case is created for potential selection.

    • Requirements for correlation are reviewed by the program analysts on an annual basis to ensure effectiveness and relevance.

    • New tax forms, legislation and line changes are identified and updated to reflect the specific tax year.

    • Drop criteria (data elements of specific entity, tax return, and information return records) are incorporated in the correlation requirements and are reviewed annually to determine if modifications are needed.

    • Requirements are reviewed and Uniform Work Requests (UWRs) are approved by the Program Manager of IRCS annually.

  2. ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡

    • ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡

    • ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡

    • ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡

    • ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡

Case Segmentation

  1. Each case is stratified based on the main underreported issue and identifying traits. To segment the population, cases are assigned the following three definers:

    • Category: A two-digit numeric code that describes the primary discrepant issue on the taxpayer’s return. The code is set based on established criteria, such as the percentage of the discrepancy that must be reached for the case to be designated in a particular category. Each case is assigned one category code. See Exhibit 4.1.27-2 and Exhibit 4.1.27-6 for a list of AUR and BUR categories.

    • Subfile: A one-character alphanumeric code that describes reporting attributes of a case or taxpayer. The subfile designation is typically a compliance attribute; examples include information types, specific return attributes, or compliance behavior such as a repeater. The code is assigned in a priority order, so each case is assigned only one subfile code. See Exhibit 4.1.27-3 and Exhibit 4.1.27-7 for a list of AUR and BUR subfiles.

    • Subcategory: A one-character alpha code that describes the range of the potential tax change attributed to the discrepancy. If the tax change falls within a specified range, the subcategory is set. The code is determined from a discrete range, so each case is assigned only one subcategory code. See Exhibit 4.1.27-4 and Exhibit 4.1.27-8 for a list of AUR and BUR subcategories.
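The three-definer segmentation above can be sketched with a toy example. The codes, predicates, and dollar ranges below are invented for illustration (the real criteria live in the exhibits cited above); what the sketch preserves is the structural rule the text describes: subfile rules fire in priority order so only one code is assigned, and subcategories come from discrete, non-overlapping tax-change ranges.

```python
# Illustrative segmentation sketch; codes, predicates, and ranges are
# assumptions, not actual AUR/BUR criteria.

def assign_subfile(case):
    """Subfile rules are checked in priority order; the first match wins,
    so each case receives exactly one subfile code."""
    rules = [
        ("E", lambda c: c["eic_present"] and c["underreported"] > 0),
        ("D", lambda c: c["underreported"] > 10000),
        ("T", lambda c: c["agi"] < 50000),
    ]
    for code, predicate in rules:
        if predicate(case):
            return code
    return None

def assign_subcategory(tax_change):
    """Subcategory comes from discrete, non-overlapping ranges of the
    potential tax change, so each case receives exactly one code."""
    ranges = [("A", 0, 500), ("B", 500, 2000), ("C", 2000, float("inf"))]
    for code, low, high in ranges:
        if low <= tax_change < high:
            return code
    return None

case = {"eic_present": False, "underreported": 12500, "agi": 64000}
print(assign_subfile(case), assign_subcategory(1200))
```

In this toy run the case matches the second subfile rule (underreported income over the invented $10,000 line) and the middle tax-change range.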

Information Return Document Matching Systems

  1. Implementation of the Information Return Document Matching (IRDM) Program encompasses multiple systems:

    1. Data Assimilation: Assimilation identifies the link between tax forms and information returns filed for the same taxpayer.

    2. Data Correlation: Correlation compares tax return and information return data and applies business rules to identify potential underreporter cases. After case selection, data correlation builds a complete case record to be worked by a tax examiner.

    3. Two analytic systems provide IRCS analysts with the ability to define and execute logic for the intelligent selection of inventory to ensure effective case selection.

    • Case Identification, Selection and Analysis (CISA)

    • Business Master File Analytics (BMFA)

Case Identification, Selection and Analysis (CISA)

  1. IRCS uses the Information Return Document Matching Case Inventory Selection and Analysis (IRDMCISA) tool to select cases for AUR to work. The tool’s web-based user interface provides easy access to features and functionality and requires no technical programming knowledge to operate from the interface. The tool includes many user-requested features and capabilities, and it allows users to import, score, analyze, and select cases to work.

  2. CISA provides IRCS analysts with the ability to define and execute logic for the intelligent selection of individual taxpayer case inventory. By comparing cases in the current correlation to similar cases from past correlations, the tool’s capabilities include the following:

    1. Assign each case an Estimated Potential Assessment (EPA) score.

    2. ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡

  3. IRCS analysts conduct case selection, choosing the optimal mix of underreporting cases to pursue. An optimal mix includes cases that address the following. See IRM 4.1.27.2 for the full list.

    • Yield the highest assessments or money recovered.

    • Address repeat offenders.

    • Ensure fair coverage of all taxpayer segments.

  4. The selection of cases is limited to three times per year. These periods of intense selection activities are known as correlation cycles and are based on tax reporting deadlines.

  5. IRCS staff analyzes results, performs program reviews, and monitors rule-based applications used to select inventory. The results are used to make data-driven decisions to improve program quality, improve case selection, and ensure the integrity of the selection methods.

  6. Selection is the process of identifying the optimal mix of cases to be worked in the AUR Program. The process occurs in eight phases described below:

    1. Data Import: Importing the data into the CISA Tool.

    2. Data Checks: Methods for ensuring the data was received and calculated correctly, including volume, calculation data quality, and field population checks.

      Note:

      The Data Integrity Failure Report is located in the Miscellaneous Reporting section of the tool.

    3. Scoring: The tool scores the cases by assigning an EPA and a repeater code to each case. Users update the assessment rate table before the first correlation to ensure that the scoring is as accurate as possible.

    4. Business Rules: The user builds business rules, runs the selection tool and reviews the selected inventory. After business rules are complete, cases are generally selected by the highest EPA amount. Rules apply treatment codes to individual cases. See Exhibit 4.1.27-5 Treatment Code, for selection and non-selection of cases. The treatment also identifies if the case will be available in subsequent correlations for selection consideration.

    5. Optimization: The cases are optimized by subfile using one of three methods: No Moving, Balance by BOD, or Corporate Balance. The optimization function allows the workload to be balanced among campuses using the highest EPA in each subfile.

    6. Add/Move: The process of moving higher-value cases from one campus to another in exchange for lower-value cases, to ensure that the best cases are worked by the AUR program.

    7. Export: The user exports cases into cycle extracts by AUR campus.

    8. Reports: The user can view reports that show, for each AUR campus, the total EPA for cases selected, the average EPA for cases selected, the number of cases selected, and other statistics. These reports can be viewed after correlation.

Business Master File Analytics (BMFA)

  1. The Business Master File Analytics (BMFA) system is the tool used for BUR case selection. BMFA is a web-based application that allows a user to drill-down from parent tabs through multiple subtabs. Users have the option to test the integrity of data, build rules to select cases, export data, and view various reports.

  2. BMFA provides IRCS analysts with the ability to define and execute logic for the intelligent selection of business taxpayer case inventory.

  3. IRCS analysts conduct case selection, a process of selecting the optimal mix of underreporting cases to pursue. An optimal mix includes cases that address the following. See IRM 4.1.27.2 for the full list.

    • Yield the highest assessments or money recovered.

    • Address repeat offenders.

    • Ensure fair coverage of all taxpayer segments.

  4. The selection of cases is limited to two or three times per year, depending on the workplan. These periods of intense selection activity are called correlation cycles and are based on tax reporting deadlines.

  5. IRCS staff analyze results, perform program reviews, and monitor rule-based applications used to select inventory. The results are used to make data-based decisions to improve program quality, improve case selection, and to ensure the integrity of the selection methods.

  6. Selection is the act of identifying the optimal mix of cases to be delivered to the BUR campus. The process occurs in six phases:

    1. Data Import: Importing the data into the Case Selection (BMFA) Tool.

    2. Data Checks: Data Checks are methods for ensuring the data was received correctly, including volume, calculation data quality, data field, and field population checks.

    3. Analysis: Review prior-year results manually and adjust business rules based on those results.

    4. Business Rules: Build business rules, run the selection tool and view the selected inventory.

    5. Reports: View reports that reflect volumes and average EPA for selected cases, and other statistics.

    6. Export: Export cases to the campus. Inventory is assigned a cycle utilizing a schedule that ensures it will meet the start plan date to conform to the annual work plan.
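The Data Checks phase above (volume, data quality, and field-population checks on an imported extract) can be sketched like this. The required field names, the tolerance, and the record shapes are invented for the example; they are not actual BMFA check definitions.

```python
# Hedged sketch of volume and field-population checks on an extract.
# Required fields and tolerance are illustrative assumptions.

REQUIRED_FIELDS = {"tin", "tax_year", "epa", "category"}

def run_data_checks(records, expected_volume, tolerance=0.02):
    """Return a list of failure messages; an empty list means the
    extract passed its volume and field-population checks."""
    failures = []
    # Volume check: record count should be within tolerance of the plan
    if abs(len(records) - expected_volume) > expected_volume * tolerance:
        failures.append(f"volume check failed: got {len(records)}, "
                        f"expected ~{expected_volume}")
    # Field-population check: required fields must be present and non-empty
    for i, rec in enumerate(records):
        missing = [f for f in REQUIRED_FIELDS if rec.get(f) in (None, "")]
        if missing:
            failures.append(f"record {i}: missing {sorted(missing)}")
    return failures

good = [{"tin": "001", "tax_year": 2018, "epa": 350, "category": "07"}]
bad = [{"tin": "002", "tax_year": 2018, "epa": None, "category": "10"}]
print(run_data_checks(good, expected_volume=1))
print(run_data_checks(good + bad, expected_volume=10))
```

The first call passes cleanly; the second reports both a volume shortfall and an unpopulated field, the two failure modes the phase is meant to catch before business rules run.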

Selection of AUR and BUR Inventory

  1. Generally, IRCS analysts select AUR and BUR inventory three times a year during the correlation process.

  2. ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡

  3. After each correlation, a Summary Report is prepared by IRCS case selection analysts. This manual report documents the selection activities for the respective underreporter program, including executed business rules for each correlation. The summary addresses:

    • The correlation activities, including case volumes impacted by selection, non-selection and global non-select business rules.

    • Business justification for each rule, the SQL language or formula written for the rule, any number limitations applied to the rule, the last date the rule was run, and the total cases impacted by each rule. This includes details on selected inventory, including volumes by category, subfile, and subcategory.

    • Documentation used and any analysis performed during the selection process.

  4. The IRCS program manager is required to review and approve the selection summary after each correlation prior to the export process.

Delivery Process

  1. Most of the workload delivery is systemic. Cases identified for referral are manually extracted and delivered in the file format preferred by the recipient.

AUR Workload Delivery

  1. All of the workload delivery is systemic. Inventory which has been selected for opening in each campus operation will be created on the AUR system.

  2. AUR cases are worked by seven campuses. Each campus’s inventory is composed of electronic (ELF) and paper cases.

    1. The capacity at each campus is established based on the campus work plan prepared by PPA. The CISA tool determines the minimum EPA inventory required to meet the campus combined capacity (i.e., corporate capacity).

    2. ELF cases with higher EPA than the capacity to be worked are moved to another campus with the capacity to work the cases.

    3. Paper and ELF cases are distributed equally between the campuses.

  3. Optimization is the movement of cases to ensure each campus has the optimal volume and quality of cases. There are three types of optimization that the CISA tool can produce within each individual subfile:

    • No Moving: Cases are already assigned an AUR campus based on the geographic location of the taxpayer. The No Moving function allows the tool user to designate a minimum EPA threshold.

    • Balance by BOD: Previously used to balance inventory between the legacy W&I and SBSE campuses.

    • Corporate Balance: The capacity at each campus is established and the tool determines the minimum overall EPA required to meet the campuses’ combined capacity (i.e., corporate capacity).

  4. Cases selected utilizing the CISA tool are allocated to each campus using the Export function.

    1. Once the cases are exported, notice is sent to IT requesting that the cases be transferred to the AUR Case Management system via Enterprise File Transfer Utility (EFTU).

    2. The notice provides IT the extract cycle to place the cases in the AUR Case Management system.

  5. Once cases are exported, the selected inventory is controlled by each AUR campus.
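The Corporate Balance idea described above (establish each campus's capacity, let the tool find the minimum EPA needed to fill the campuses' combined capacity, then distribute qualifying cases) can be illustrated with a toy example. The round-robin distribution, the campus names, and all figures are assumptions for illustration; the CISA tool's actual allocation logic is not reproduced here.

```python
# Toy sketch of a corporate-balance allocation: rank by EPA, cut at
# combined capacity, deal cases to campuses with room. All values invented.

def corporate_balance(cases, campus_capacities):
    """Select the top cases by EPA up to the combined (corporate)
    capacity, then deal them round-robin to campuses with room left."""
    corporate_capacity = sum(campus_capacities.values())
    ranked = sorted(cases, key=lambda c: c["epa"], reverse=True)
    chosen = ranked[:corporate_capacity]
    min_epa = chosen[-1]["epa"] if chosen else None  # effective EPA cutoff

    assignments = {campus: [] for campus in campus_capacities}
    campuses = list(campus_capacities)
    i = 0
    for case in chosen:
        # Skip any campus that is already at capacity
        while len(assignments[campuses[i % len(campuses)]]) >= \
                campus_capacities[campuses[i % len(campuses)]]:
            i += 1
        assignments[campuses[i % len(campuses)]].append(case["epa"])
        i += 1
    return min_epa, assignments

cases = [{"epa": e} for e in (900, 700, 650, 400, 300, 120)]
cutoff, plan = corporate_balance(cases, {"Austin": 2, "Ogden": 2})
print(cutoff, plan)
```

With a combined capacity of four, the cutoff lands at the fourth-highest EPA; everything below it stays unselected, mirroring the minimum-EPA-to-meet-capacity behavior the text describes.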

BUR Workload Delivery

  1. For BUR inventory selected through BMFA, upon finalization of the correlation cycle a file is generated for the Information Return Document Matching Data Correlation (IRDMDC) group. The IRDMDC analyst transfers the needed files containing case details (typically Excel spreadsheets) to the shared server via the Enterprise File Transfer Utility (EFTU).

    1. IRCS analysts manually load files to the production database on the shared server.

    2. The campus generates IDRS controls via the GII tool and initiates the posting of TC 925 with Process Code 4030 to the Master File.

Referrals

  1. During AUR case selection, segments of cases can be identified for referral to other compliance programs based on the criteria requested. Rules are written to mark cases with the “EXAM” treatment code based on the referral criteria. The cases identified for referral are excluded from the AUR case selection process and not considered in the selection for regular inventory. IRDMCISA generates a TIN listing of the referral cases with other necessary case data, which is provided to the requestor of the information.

    Example:

    An example of a common AUR referral request involves a Form 1040NR. These cases are identified and excluded from general AUR case selection. The Form 1040NR TIN listing is sent to Large Business and International (LB&I).

Monitoring and Reporting

  1. SB/SE Examination PPA monitors program results of the SB/SE campuses. See IRM 1.1.16.3.4, Planning and Performance Analysis.

Glossary and Acronyms

The table below defines some key terms used in Document Matching.

Word/Acronym Definition
AC Action Code
AGI Adjusted Gross Income
ASED Assessment Statute Expiration Date
Assessments A change to the amount of tax on the taxpayer's account; generates a bill or a refund, a new DLN, and/or releases payment and/or freeze code
Audit Information Management System (AIMS) An IDRS control system used by Examination
AUR (Automated Underreporter) Inventory control system used in IMF Underreporter
Auto-Generated Notice (AGN) Cases systemically screened and the CP 2000 and 2501 Notices issued with no Tax Examiner (TE) or clerical handling
BMF Business Master File
BOD Business Operating Division
BOE Business Objects Environment
Business Underreporter (BUR) BMF Underreporter
Category Code A Category is a two-digit numeric code that describes the primary issue on the taxpayer’s return
CI Criminal Investigation
CISA Case Inventory, Selection and Analytics
Correspondence Production Services (CPS) AUR notices are printed and mailed from one of two CPS. CPS-East is in Detroit and prints/mails for Andover, Atlanta, Brookhaven and Philadelphia. CPS-West is in Ogden and prints/mails for Austin, Fresno and Ogden.
CRN Credit Reference Number
CRL Case Record Layout
CSN Case Sequence Number
DCI Data Collection Instrument
Document Locator Number (DLN) The number assigned to all returns and documents input to the IRS computer system
DPAD Domestic Production Activity Deduction
Drop Criteria Characteristics that indicate a case should be dropped from the correlation or selection process
ECC-MEM Enterprise Computing Center at Memphis
ECC-MTB Enterprise Computing Center at Martinsburg
EITC Earned Income Tax Credit
Employer Identification Number (EIN) Nine-digit number formatted xx-xxxxxxx used to identify business taxpayers
EPA The EPA is an estimation of the assessment that each case will yield based on the behavior and characteristics of cases from past AUR correlations
Extract A group of SSNs selected from the inventory of cases identified with possible discrepancies
Federal Emergency Management Agency (FEMA) The agency that helps with disaster relief
Federal Record Center (FRC) A place where tax returns are stored outside the campuses
FICA Federal Insurance Contribution Act
FMV Fair Market Value
FOIA Freedom of Information Act
FTE Full-Time Equivalent
FTF Failure to File Penalty
FTP Failure to Pay Penalty
Global Non-Select These are universal rules designed to avoid certain types of cases across the entire inventory and exclude them from all other treatments
HQ Headquarters
HSA Health Savings Account
IDRS Integrated Data Retrieval System
IMF Individual Master File
IND Indicator
Internal Process Code (IPC) A numeric/alpha code used for tracking cases within the AUR and BUR programs (does not upload to IDRS)
Integrated Submission and Remittance Processing (ISRP) The automated system that converts all paper documents to electronic form, including payments
IR Information Return
IRA Individual Retirement Account
IRC Internal Revenue Code
IRDM Information Reporting and Document Matching
IRM Internal Revenue Manual
IRMF Information Return Master File
IRPCA Information Returns Program Case Analysis
IRS Internal Revenue Service
IT Information Technology
LB&I Large Business and International
Modernized e-File (MeF) The system used to view electronically filed returns
MFT Master File Tax
NEC Non-Employee Compensation
NIIT Net Investment Income Tax
NOL Net Operating Loss
Non-Select Cases are non-selected to actively remove them from consideration for AUR/BUR casework
OCA Office of Compliance Analytics
O/D Over deducted
OIC Offer in Compromise
POC Point of Contact
PPA Planning and Performance Analysis
PHC Personal Holding Company
Process Codes (PC) Two or four digit numbers used to identify the action taken on a case
PRP Programming Requirements Package
PTC Premium Tax Credit
PTIN Preparer Tax Identification Number
FY Fiscal Year
QPSC Qualified Personal Service Corporation
QTP Qualified Tuition Program
Referral A case sent to another area for technical determination
Research Request for additional information needed to continue processing
RPS The Remittance Processing System within ISRP
RRB Railroad Retirement Board
RSED Refund Statute Expiration Date
SBSE Small Business Self Employed
SC Service Center
Screening A technical review of information returns compared against the tax return. The Screening phase of the Underreporter Program is also referred to as Analysis
SCRIPS Service Center Recognition/ Image Processing System
SE Self Employed
SERP Servicewide Electronic Research Portal
SLID Student Loan Interest Deduction
Standard Employee Identifier (SEID) A five-character alphanumeric code that identifies an IRS employee
SSA Social Security Administration
Social Security Number (SSN) A nine-digit number formatted xxx-xx-xxxx, used to identify taxpayer accounts
SST Social Security Tax
Subcategory Code A Subcategory is a one-character alphabetical code (from A through G). The subcategory describes a computer estimate of the tax change based on the under reported amount
Subfile Code A Subfile is a one-character alphanumeric code that identifies the high-level characteristic of the case
TAS Taxpayer Advocate Service
Taxpayer Delinquent Account (TDA) A Collection status
Taxpayer Information File (TIF) Individual Master File data from ECC containing tax account and tax transaction information
Taxpayer Identification Number (TIN) A nine-digit number used to identify taxpayer accounts.
TE Tax Examiner
TE/GE Tax Exempt and Government Entities
TP Taxpayer
TPI Total Positive Income
TR Tax Return
Transaction Code (TC) An information marker generated through IDRS to describe actions taken
TXI Taxable Income
TY Tax Year
UC Underclaimed
UR Underreported
UWR Uniform Work Request
W/H Withholding
W&I Wage & Investment
WRMS Work Request Management System

AUR Category Code

The following table is a list of Category Codes used in the AUR selection process.

CATEGORY CODE DESCRIPTION
01 100% Mortgage and/or points paid
02 IRAs Over-deducted on Form 5498
04 NEC
05 50% Gross Receipts w/NEC/Fishing income/Bartering
06 Education Credit 80% (Form 1098-T present)
07 Wages
08 SSB/RRB
09 100% Interest or Dividends
10 Interest
11 Dividends
12 Pure Wages/Interest/Dividends/Combination
13 Pensions and Annuities (1099-R) Taxable
14 Pensions and Annuities (1099-R) Gross
15 Interest/Dividends/Pension/Annuity/Combination
16 Withholding, over/underclaimed
17 Fishing Income
18 Tuition and Fees (Over-deducted)
19 Rents and Royalties
20 Farm Income
21 Medical Payments
22 Distributive Share - Form 1065 and Form 1120-S
23 Distributive Share - Form 1041
24 Gambling
25 Taxable Grants
26 Reserved
27 Other Income
28 Payments in Lieu of Dividends
29 Cancellation of Debt
30 Qualified Tuition Program Earnings
31 Securities Sales - 100 or fewer IRs
32 Early Withdrawal Penalty
33 Unemployment Compensation
34 INT Combination
35 State Income Tax Refund
36 Mortgage Interest Deduction
37 SEP Contribution 80%
38 Reemployment Trade Adjustment Assistance (RTAA) - Form 1099-G
39 Securities Sales > 100 IRs
40 Gross Capital Gain
41 Combination (not Categories 02, 31)
42 Early Distribution 10% Tax ($150 or more)
43 Bartering Income Discrepancy
44 Dependent Care Benefits
45 IRA Overdeduction AGI dollar limitation
46 Excess FICA
48 Reserved
49 Reserved
50 Self-Employment Income Discrepancy
51 80% Taxable Pension and Annuity Distribution Discrepancy
52 Reserved
53 Reserved
54 Reserved
55 Statutory Wages
56 Reserved
57 NEC on Schedule F and/or Form 4835
58 Medical Savings Account Distribution
59 Medical Savings Account Contribution
60 Gross Long-Term Care Benefit
61 Securities Sales, 100 or fewer Stock IRs, No Sch D
62 Student Loan Interest Deduction (SLID) (O/D)
63 Health Saving Account Distributions
64 Health Saving Account Contributions
65 Real Estate Income Discrepancy
66 Taxable Pension and Early Distribution 10% Tax
67 100% Mortgage and/or points paid - no Form 1098
68 Tuition and Fees (Over-deducted) - no Form 1098-T
69 Education Credits (over claimed) - no Form 1098-T
70 SSB/RRB plus Taxable Pension discrepancy is at least 60%
71 Wages plus withholding discrepancy is at least 60%
72 Payment Card
79 Securities Sales > 100 Stock IRs, No Sch D
99 When case cannot be assigned to Categories 01 - 79

AUR Subfile Code

The following table is a list of Subfile Codes used in the AUR selection process.

SUBFILE CODE DESCRIPTION
1 CTR Cases
2 IRS Employees
6 AGI per return is $50,000 - $124,999
8 Cases with a Form 1099-K discrepancy of $100 or more
9 Returns filed with ITIN
B Credit Tax Change (W/H)
C Stock, Bond or Real Estate Discrepancy
D UR Income Greater Than $10,000
E UR and EIC Present
F Mortgage Interest Over-deduction
G UR Income Greater than $10,000/Repeaters
H Form 1040NR (International with ITIN)
I Potential Unproductive Repeaters
J Auto-Generated Notice
K Discrepancy w/EIN Document
L Multi-year Repeater
M Cases with K1 Discrepancy of $100 or more
N Reserved
P Identity Theft cases with posted TC 971 w/AC 501, 504, 505, 506, 522, 523, 524 and/or 525
S Form 1040NR (International with SSN)
T AGI per return is less than $50,000
U AGI per return is $125,000 or more

AUR Subcategory Code

The following table is a list of Subcategory Codes used in the AUR selection process.

SUBCATEGORY CODE POTENTIAL TAX CHANGE
≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡
≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡
  ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡
≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡
≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡
  ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡
≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡

AUR Treatment Code

The following table is a list of Treatment Codes used in the AUR selection process.

CODE DESCRIPTION AVAILABLE IN FUTURE CORRELATIONS
GNS Global Non-Select Yes
HUB Hub Select No
SFN Soft Notice No
CTR Control Group 1 No
CT2 Control Group 2 No
NON AUR Non-Select Yes
SEL AUR Select No
EXM Exam No
3CO 3rd Correlation No
OCA Office of Compliance Analytics No
UNT Untreated Yes

BUR Category Code

The following table is a list of Category Codes used in the BUR selection process. Category codes are listed in priority order.

CATEGORY NAME DESCRIPTION
04 Non-Employee Compensation Gross Receipts is at least 50% of the discrepancy, and there is a Non-Employee Compensation IR with no other IR Amounts related to Gross Receipts.
12 Interest and Dividends Interest and Dividends are at least 50% of the amount of the total case discrepancy.
10 Interest Interest is at least 50% of the amount of the total case discrepancy.
11 Dividends Dividends is at least 50% of the amount of the total case discrepancy.
40 Capital Gains Capital Gains is the largest amount of the total case discrepancy.
31 Securities Security Sales is the largest amount of the total case discrepancy.
17 Fishing Gross Receipts is at least 50% of the discrepancy, and there is a Fishing IR with no other IR Amounts related to Gross Receipts.
18 Rents Rents is at least 50% of the amount of the total case discrepancy.
19 Royalties Royalties is at least 50% of the amount of the total case discrepancy.
20 Farming Farming Income is at least 50% of the amount of the discrepancy.
21 Medical Payments Gross Receipts is at least 50% of the discrepancy and there is a Medical Payments IR with no other IR Amounts related to Gross Receipts.
22 Positive Distributive Share Income Positive Distributive Share Income is at least 50% of the amount of the discrepancy.
27 Misc Other Income Other Income is at least 50% of the discrepancy and Misc Other Income is the only IR for that line.
43 Bartering Gross Receipts is at least 50% of the discrepancy, and there is a Bartering IR with no other IR Amounts related to Gross Receipts.
48 Payment Card Transactions Payment Card Transactions is the only case discrepancy.
57 Gross Receipts/Other Income Combo Gross Receipts/Other Income types are at least 50% of the discrepancy, and there is more than one IR type present on the case for that issue.
34 Interest Combo Cases where interest is discrepant along with other discrepancies and no one income type meets the 50% criteria on the case.
32 Withholding Cases where withholding is the largest amount of the Total Case Discrepancy.
99 Other Cases not assigned to the above categories.
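Because BUR category codes are evaluated in the priority order listed above, a case that satisfies more than one description receives the first matching code, with 99 as the fallback. The following is a minimal illustrative sketch of that first-match logic; the `Case` structure, the issue names, and the abbreviated rule list are hypothetical and are not IRS code, and only a few of the table's rules are shown.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

@dataclass
class Case:
    """Hypothetical stand-in for a BUR case: total discrepancy plus per-issue amounts."""
    total: float
    amounts: Dict[str, float] = field(default_factory=dict)

def share(case: Case, *issues: str) -> float:
    # Combined discrepant amount for the named issues.
    return sum(case.amounts.get(i, 0.0) for i in issues)

# (code, predicate) pairs in the table's priority order; abbreviated to a few
# of the 50%-of-total-discrepancy rules for illustration.
RULES: List[Tuple[str, Callable[[Case], bool]]] = [
    ("12", lambda c: share(c, "interest", "dividends") >= 0.5 * c.total),
    ("10", lambda c: share(c, "interest") >= 0.5 * c.total),
    ("11", lambda c: share(c, "dividends") >= 0.5 * c.total),
    ("18", lambda c: share(c, "rents") >= 0.5 * c.total),
]

def assign_category(case: Case) -> str:
    """Return the first matching category code, or the fallback 99."""
    for code, matches in RULES:
        if matches(case):
            return code
    return "99"
```

For example, a case with a $10,000 total discrepancy of which $7,000 is rents falls through the interest and dividend rules and lands on category 18; a case matching no rule receives 99.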

BUR Subfile Code

The following table is a list of Subfile Codes used in the BUR selection process. Subfile codes are listed in priority order.

SUBFILE DEFINITION DESCRIPTION
B LB&I Cases with BOD code indicating LB&I case.
N NOL Recovery NOL deduction claimed and a prior year BUR adjustment.
Y High Underreporter and Multi-year High Underreporter and Multi-Year Repeater.
R Multi-Year Repeater Multi-Year Repeater.
U Potential Unproductive Repeater Multi-Year Repeater unproductive in one of the three preceding years.
P Medical Payments Medical payment discrepancy of $1,000 or more.
E Exam The tax form is 1120S or 1065 with over $500K in gross receipts and there is a mismatch of at least 50% of gross receipts, OR there is a TC 420/424 on the tax module.
X Personal Service Corp Tax Verification NAICS codes: 541110, 541190, 541211 - 541219, 541310 - 541380, 541400, 541511 - 541990, 621111 - 621498, 711100 - 711510 and Form 1120, Sch J, line 2 is < 35% of Form 1120 line 30.
L Low Dollar Yield Unreported Income of less than $3,000.
H High Under Reporter High Underreporter ($100,000 or more).
S Securities Cases with a Securities discrepancy $1,000 or more.
M Payment Card Transactions Cases with a Payment Card Transactions discrepancy over $1,000.
K K-1 Schedule K-1 discrepancy of $1,000 or more.
C Schedule O Controlled Group (Sch O) is filed.
W Withholding Withholding of $350 or more.
O Other Other (no previous subfile applies).

BUR Subcategory Code

The following table is a list of Subcategory Codes used in the BUR selection process.

SUBCATEGORY POTENTIAL TAX CHANGE
≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡
  ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡
≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡
≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡
≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡
≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡
≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡
≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡

BUR Treatment Code

The following table is a list of Treatment Codes used in the BUR selection process.

CODE DESCRIPTION AVAILABLE IN LATER CORRELATIONS
GNS Global Non-Select Yes
HUB Hub Select No
OCA Office of Compliance Analytics No
SFN Soft Notice No
CTR Control Group 1 No
CT2 Control Group 2 No
NON Non-Select Yes
SEL Select No
EXM Exam No
3CO Control Group 3 No
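The "available in later correlations" column determines whether a treated case can be picked up again in a subsequent correlation run: per the table above, only GNS and NON leave the case available. A minimal sketch of that lookup, using a hypothetical table and function name (this is illustrative, not IRS code):

```python
# BUR treatment codes mapped to the table's "available in later
# correlations" column: True only for GNS and NON.
BUR_TREATMENTS = {
    "GNS": True,   # Global Non-Select
    "HUB": False,  # Hub Select
    "OCA": False,  # Office of Compliance Analytics
    "SFN": False,  # Soft Notice
    "CTR": False,  # Control Group 1
    "CT2": False,  # Control Group 2
    "NON": True,   # Non-Select
    "SEL": False,  # Select
    "EXM": False,  # Exam
    "3CO": False,  # Control Group 3
}

def available_for_later_correlation(treatment_code: str) -> bool:
    """True when a case with this treatment remains available for a later run."""
    # Unknown codes are treated conservatively as unavailable.
    return BUR_TREATMENTS.get(treatment_code, False)
```

A later correlation would thus filter its candidate pool to cases whose prior treatment returns True here.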

Underreporter Correlation and Selection Flowchart

[Image 70767001.gif: Underreporter Correlation and Selection Flowchart]