- 4.1.27 Information Return Case Selection
- 4.1.27.1 Program Scope and Objectives
- 4.1.27.2 Selection Principles for Information Return Document Matching
- 4.1.27.3 Workload Identification for AUR and BUR
- 4.1.27.4 Case Segmentation
- 4.1.27.5 Information Return Document Matching Systems
- 4.1.27.5.1 Case Identification, Selection and Analysis (CISA)
- 4.1.27.5.2 Business Master File Analytics (BMFA)
- 4.1.27.5.3 Selection of AUR and BUR Inventory
- 4.1.27.6 Delivery Process
- 4.1.27.7 Monitoring and Reporting
- Exhibit 4.1.27-1 Glossary and Acronyms
- Exhibit 4.1.27-2 AUR Category Code
- Exhibit 4.1.27-3 AUR Subfile Code
- Exhibit 4.1.27-4 AUR Subcategory Code
- Exhibit 4.1.27-5 AUR Treatment Code
- Exhibit 4.1.27-6 BUR Category Code
- Exhibit 4.1.27-7 BUR Subfile Code
- Exhibit 4.1.27-8 BUR Subcategory Code
- Exhibit 4.1.27-9 BUR Treatment Code
- Exhibit 4.1.27-10 Underreporter Correlation and Selection Flowchart
Part 4. Examining Process
Chapter 1. Planning and Special Programs
Section 27. Information Return Case Selection
August 07, 2018
(1) This transmits the new Internal Revenue Manual 4.1.27, Information Return Case Selection.
This IRM contains specific information on the workload identification method used by the IRS to identify and select tax returns for Automated Underreporter (AUR) and Business Underreporter (BUR).
(1) This is a new IRM. It incorporates interim guidance on Return Selection Documentation, Approval, and Review Requirements, dated August 31, 2017.
Director, Exam Case Selection
Purpose: This Internal Revenue Manual (IRM) provides guidance to Information Return Case Selection (IRCS) personnel responsible for identifying Automated Underreporter (AUR) and Business Underreporter (BUR) workload and developing criteria for return selection.
Audience: These procedures apply to IRCS employees assigned to the various programs discussed in this IRM.
Policy Owner: IRCS is a function within Examination Operations.
Program Owner: IRCS, Exam Case Selection, is responsible for the content of this IRM.
SB/SE Program Level Objective: Ensure examinations are initiated based on indicators of non-compliance or on other criteria (such as selection for the National Research Program) identified in the Internal Revenue Manual. In addition, ensure that a review of any decision to survey a return (i.e., not initiate an examination) is based upon factors outlined in the Internal Revenue Manual and approved by an appropriate level of management.
IRCS Program Objective: The objective of IRCS is to identify, select and deliver inventory for the AUR and BUR document matching programs within the campuses.
Primary Stakeholders: SB/SE
Underreporter cases are built from two primary sources:
The Master File (MF) which contains information reported to the IRS by taxpayers. This includes current entity information, tax account and filed tax returns.
The Information Return Master File (IRMF) which contains information submitted by payers.
The Individual Master File (IMF) contains information reported on:
Form 1040, U.S. Individual Income Tax Return
Form 1040A, U.S. Individual Income Tax Return
Form 1040EZ, Income Tax Return for Single or Joint Filers with No Dependents
The Business Master File (BMF) contains information reported on:
Form 1120, U.S. Corporation Income Tax Return
Form 1041, U.S. Income Tax Return for Estates and Trusts
The IRMF information is matched against the IMF and BMF tax returns to verify that certain income, deductions, and credits that can be supported by information returns are properly reported on the tax return. An AUR or BUR case is identified when a discrepancy is detected between the two data sources. Examples (not all inclusive) of the information returns in the IRMF are:
Form W-2, Wage and Tax Statement
Form 1099-MISC, Miscellaneous Income
Form 1099-PATR, Taxable Distribution Received From Cooperatives
Schedule K-1, Shareholder’s Share of Income, Deductions, Credits, etc.
Form 1099-INT, Interest Income
Form 1099-K, Payment Card and Third Party Network Transactions
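The matching concept behind these examples can be sketched in a few lines. This is an illustrative sketch only; the field names, income types, and zero-dollar threshold are assumptions for demonstration, not actual IRDM correlation logic.

```python
# Illustrative sketch of information return document matching.
# Field names and the zero-dollar threshold are hypothetical; actual
# correlation rules are defined in annual business requirements.

def find_discrepancies(return_amounts, information_returns, threshold=0):
    """Compare amounts reported on the tax return against totals from
    payer-submitted information returns, by income type."""
    # Sum payer-reported amounts by income type (e.g., wages, interest).
    payer_totals = {}
    for ir in information_returns:
        payer_totals[ir["type"]] = payer_totals.get(ir["type"], 0) + ir["amount"]

    # A potential underreporter case exists when payers reported more
    # than the taxpayer did for any income type.
    discrepancies = {}
    for income_type, payer_total in payer_totals.items():
        reported = return_amounts.get(income_type, 0)
        if payer_total - reported > threshold:
            discrepancies[income_type] = payer_total - reported
    return discrepancies

case = find_discrepancies(
    {"wages": 45000, "interest": 200},
    [{"type": "wages", "amount": 45000},
     {"type": "interest", "amount": 950},
     {"type": "dividends", "amount": 300}],
)
# case -> {"interest": 750, "dividends": 300}
```

In this toy example, wages match exactly, while payers reported $750 more interest and $300 of dividends the return omitted, so the case carries two discrepant issues.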
Internal Revenue Code (IRC) Section 61 states that "except as otherwise provided in this subtitle, gross income means all income from whatever source derived." Discovering all income received by a taxpayer is the starting point for determining which items of income are includable in gross income and subject to Federal income tax.
IRC Sections 6031 through 6060, and the regulations thereunder, contain the requirements for the filing of information returns for reporting purposes.
Revenue Procedure 2005-32 states that taxpayer contacts initiated to verify a discrepancy between the taxpayer's tax return and an information return are "contacts and other actions not considered an examination, inspection or reopening."
The Director of Exam Case Selection (ECS) is responsible for providing policy guidance on the selection of cases and delivery of inventory for SB/SE Examination, including AUR and BUR.
The IRCS Program Manager is responsible for:
Establishing internal controls relating to each program or process.
Ensuring that instructions are communicated to and carried out by the assigned employees.
Setting policy, establishing procedures and guidelines, and ensuring they are applied consistently.
Revising policies as required and redesigning processes as necessary to reflect legislative changes.
Performing managerial reviews of selection decisions during each phase of the selection and delivery process.
Reviewing and approving the business requirements written for case selection and Uniform Work Requests (UWRs) annually.
Providing input into the AUR and BUR work plans.
Providing input to the SB/SE Division Strategic Plan.
IRCS analysts are responsible for:
Selecting and delivering AUR and BUR inventory multiple times per year.
Monitoring results and performing analytics to improve selection.
Providing campus support and guidance on workload selection and delivery-related issues.
Working collaboratively with Planning and Performance Analysis (PPA) to develop annual workload plans.
The Director of ECS (or designee) ensures that the IRCS Program Manager reviews adhere to case selection policy.
Managerial reviews of selection decisions occur during each phase of the selection and delivery process.
See Exhibit 4.1.27-1, Glossary and Acronyms.
See IRM Exhibit 4.19.3-1, Abbreviations, and IRM Exhibit 4.19.3-2, Glossary, for a list of abbreviations and definitions used in AUR processing.
See IRM Exhibit 4.119.4-1, Acronyms, and IRM Exhibit 4.119.4-2, Glossary, for a list of abbreviations and definitions used in BUR processing.
The following IRMs are used by the AUR campuses.
IRM 1.4.19, Automated Underreporter Technical and Clerical Managers and Coordinators Guide
IRM 4.19.2, IMF Automated Underreporter (AUR) Control
IRM 4.19.3, IMF Automated Underreporter Program
IRM 4.19.7, IMF Automated Underreporter (AUR) Technical System Procedures
The following IRMs are used by the BUR campus.
IRM 4.119.1, BMF Underreporter (BUR) Control
IRM 4.119.3, BMF Underreporter (BUR) Manager and Coordinator Handbook
IRM 4.119.4, BMF Underreporter (BUR) Program
Planning and Performance Analysis (PPA) determines the workplan volumes by program and location.
To achieve the workplan volumes, IRCS analysts consider the following factors during the case selection process:
Influence on taxpayer behavior (repeaters)
Prior document matching case results (including "agreed" and "no change" rates)
Fairness and integrity
Level of automation
Reliability of data sources
Taxpayer Bill of Rights
SB/SE supports administration of tax law by selecting returns to audit. The primary objective in selecting returns for examination is to promote the highest degree of voluntary compliance on the part of taxpayers while making the most efficient use of finite examination staffing and other resources. Employees must exercise their professional judgment, not personal opinions, when making return selection decisions. As explained in Policy Statement 1-236, IRS employees are expected to carry out our duties with integrity and fairness.
To ensure fairness to the taxpaying public, our Examination Workplan provides a balanced approach for return delivery and allocation of resources to address areas of the Tax Gap by taking into account factors such as income levels, geographic locations, and return types.
To ensure an equitable process for all taxpayers, return selection decisions are made utilizing available experience and/or statistics indicating the probability of substantial error. No one individual can control the examination selection decision-making process. We limit involvement to only those employees whose duties require them to be included.
To ensure fairness to each taxpayer whose return is selected, individual return selection decisions are based on the information contained on the taxpayer’s return and/or the underlying relevant tax law. Managerial as well as quality reviews of selection decisions occur during each phase of the selection and assignment process.
Case selection for AUR and BUR begins with a set of business requirements that are used to define when a case is brought into correlation for potential selection. These business requirements define aspects of the tax returns to be matched to the respective information returns received. When discrepancies arise, the case is created for potential selection.
Requirements for correlation are reviewed by the program analysts on an annual basis to ensure effectiveness and relevance.
New tax forms, legislation and line changes are identified and updated to reflect the specific tax year.
Drop criteria, data elements of specific entity, tax returns, and information returns, are incorporated in the correlation requirements and are reviewed annually to determine if modifications are needed.
Requirements are reviewed and Uniform Work Requests (UWRs) are approved by the Program Manager of IRCS annually.
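As a rough illustration of how drop criteria might be applied before correlation, the sketch below filters candidate cases with a list of predicate rules. The rule conditions shown (amended return, open examination) are hypothetical examples, not actual drop criteria, which are reviewed annually and partly redacted.

```python
# Hypothetical sketch of applying drop criteria before correlation.
# The rule conditions shown are illustrative only; actual drop
# criteria are reviewed annually and partly redacted in this IRM.

def apply_drop_criteria(cases, drop_rules):
    """Partition candidate cases into kept and dropped lists; a case
    is dropped if any drop rule matches it."""
    kept, dropped = [], []
    for case in cases:
        if any(rule(case) for rule in drop_rules):
            dropped.append(case)
        else:
            kept.append(case)
    return kept, dropped

# Example drop rules (hypothetical): amended return, open examination.
drop_rules = [
    lambda c: c.get("amended_return", False),
    lambda c: c.get("open_exam", False),
]

kept, dropped = apply_drop_criteria(
    [{"tin": "A", "amended_return": True}, {"tin": "B"}],
    drop_rules,
)
# kept contains only case "B"; case "A" is dropped
```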
≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡
≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡
≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡
≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡
≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡
Each case is stratified based on the main underreported issue and identifying traits. To segment the population, cases are assigned the following three definers:
Category: This is a two-digit numeric code that describes the primary discrepant issue on the taxpayer’s return. The code is set based on established criteria such as the percentage of the discrepancy that must be achieved in order for the case to be designated in a particular category. Each case is assigned one category code. See Exhibit 4.1.27-2 and Exhibit 4.1.27-6 for a list of AUR and BUR categories.
Subfile: This is a one-character alphanumeric code that describes reporting attributes of a case or taxpayer. The subfile designation is typically a compliance attribute. Examples include information types, specific return attributes, or compliance behavior such as a repeater. The code is assigned in a priority order, so each case is only assigned one subfile code. See Exhibit 4.1.27-3 and Exhibit 4.1.27-7 for a list of AUR and BUR subfiles.
Subcategory: A one-character alpha code that describes the range of the potential tax change attributed to the discrepancy. If the tax change falls within a specified range, then the subcategory is set. The code is determined based on a discrete range, so each case is only assigned one subcategory code. See Exhibit 4.1.27-4 and Exhibit 4.1.27-8 for a list of AUR and BUR subcategories.
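The three definers above can be illustrated with a short sketch: the subfile is the first match in a priority-ordered rule list, and the subcategory is set by the discrete range the potential tax change falls into. The codes, priority order, and dollar ranges used here are hypothetical stand-ins; the actual criteria are defined (and partly redacted) elsewhere in this IRM.

```python
# Illustrative sketch of two of the three case definers. Codes,
# priority order, and dollar ranges below are hypothetical stand-ins.

def assign_subfile(case, subfile_rules):
    """Subfile rules are evaluated in priority order; the first rule
    that matches determines the single subfile code."""
    for code, predicate in subfile_rules:
        if predicate(case):
            return code
    return None

def assign_subcategory(tax_change, ranges):
    """Subcategory is set by the discrete range the potential tax
    change falls into, so each case gets exactly one code."""
    for code, low, high in ranges:
        if low <= tax_change < high:
            return code
    return None

# Hypothetical priority-ordered subfile rules and tax-change ranges.
subfile_rules = [
    ("E", lambda c: c["eic_present"]),        # highest priority
    ("D", lambda c: c["ur_income"] > 10000),
    ("T", lambda c: c["agi"] < 50000),
]
ranges = [("A", 0, 1000), ("B", 1000, 5000), ("C", 5000, float("inf"))]

case = {"eic_present": False, "ur_income": 12000, "agi": 40000}
subfile = assign_subfile(case, subfile_rules)    # "D": first match wins
subcategory = assign_subcategory(2500, ranges)   # "B": 2500 falls in the middle range
```

Note the case also satisfies the "T" rule, but because rules run in priority order it receives only the single code "D".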
Implementation of the Information Return Document Matching (IRDM) Program encompasses multiple systems:
Data Assimilation: Assimilation identifies the link between tax forms and information returns filed for the same taxpayer.
Data Correlation: Correlation compares tax return and information return data and applies business rules to identify potential underreporter cases. After case selection, data correlation builds a complete case record to be worked by a tax examiner.
Two analytic systems provide IRCS analysts with the ability to define and execute logic for the intelligent selection of inventory to ensure effective case selection.
Case Identification, Selection and Analysis (CISA)
Business Master File Analytics (BMFA)
IRCS uses the Information Return Document Matching Case Inventory Selection and Analysis (IRDMCISA) tool to select cases for AUR to work. The tool's web-based user interface provides easy access to its features and requires no programming knowledge to operate. It allows users to import, score, analyze, and select cases to work.
CISA provides IRCS analysts with the ability to define and execute logic for the intelligent selection of individual taxpayer case inventory. By comparing cases in the current correlation to similar cases from past correlations, the tool’s capabilities include the following:
Assign each case an Estimated Potential Assessment (EPA) score.
≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡
IRCS analysts conduct case selection by choosing the optimal mix of underreporting cases to pursue. An optimal mix includes cases that address the following (see IRM 4.1.27.3, Workload Identification for AUR and BUR, for the full list):
Yield the highest assessments or money recovered.
Ensure fair coverage of all taxpayer segments.
The selection of cases is limited to three times per year. These periods of intense selection activities are known as correlation cycles and are based on tax reporting deadlines.
IRCS staff analyzes results, performs program reviews, and monitors rule-based applications used to select inventory. The results are used to make data-driven decisions to improve program quality, improve case selection, and ensure the integrity of the selection methods.
Selection is the process of identifying the optimal mix of cases to be worked in the AUR Program. The process occurs in eight phases described below:
Data Import: Importing the data into the CISA Tool.
Data Checks: Methods for ensuring the data was received and calculated correctly, including volume, calculation data quality, and field population checks.
Scoring: The tool scores the cases by assigning an EPA and a repeater code to each case. Users update the assessment rate table before the first correlation to ensure that the scoring is as accurate as possible.
Business Rules: The user builds business rules, runs the selection tool, and reviews the selected inventory. After business rules are complete, cases are generally selected by the highest EPA amount. Rules apply treatment codes to individual cases; see Exhibit 4.1.27-5, AUR Treatment Code, for selection and non-selection codes. The treatment code also identifies whether the case will be available in subsequent correlations for selection consideration.
Optimization: The cases are optimized by subfile using one of three methods: No Moving, Balance by BOD, and Corporate Balance. The optimization function allows the workload to be balanced among campuses using the highest EPA in each subfile.
Add/Move: A process of moving higher-value cases from one campus to another in exchange for lower-value cases, to ensure that the best cases are worked by the AUR program.
Export: The user exports cases into cycle extracts by AUR campus.
Reports: The user can view reports that show, for each AUR campus, the total EPA for cases selected, the average EPA for cases selected, the number of cases selected, and other statistics. These reports can be viewed after correlation.
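The scoring and business-rule phases can be sketched as follows: cases matching a non-select rule are marked with a treatment code, and the remaining cases are selected by highest EPA until campus capacity is reached. The treatment-code strings, EPA values, and rule logic are illustrative assumptions only.

```python
# Sketch of the scoring/business-rule/selection steps. Treatment
# codes, EPA values, and rule logic here are illustrative only.

def select_inventory(cases, capacity, non_select_rules):
    """Mark non-selected cases, then choose the highest-EPA cases
    up to the campus capacity."""
    eligible = []
    for case in cases:
        if any(rule(case) for rule in non_select_rules):
            case["treatment"] = "NON-SELECT"   # hypothetical code
        else:
            eligible.append(case)
    # After business rules run, cases are generally selected by
    # highest EPA amount until capacity is reached.
    eligible.sort(key=lambda c: c["epa"], reverse=True)
    for case in eligible[:capacity]:
        case["treatment"] = "SELECT"           # hypothetical code
    return [c for c in cases if c.get("treatment") == "SELECT"]

cases = [{"tin": t, "epa": e} for t, e in
         [("A", 4200), ("B", 900), ("C", 2700), ("D", 150)]]
picked = select_inventory(cases, capacity=2, non_select_rules=[])
# picked -> cases "A" and "C", the two highest EPA scores
```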
The Business Master File Analytics (BMFA) system is the tool used for BUR case selection. BMFA is a web-based application that allows a user to drill-down from parent tabs through multiple subtabs. Users have the option to test the integrity of data, build rules to select cases, export data, and view various reports.
BMFA provides IRCS analysts with the ability to define and execute logic for the intelligent selection of business taxpayer case inventory.
IRCS analysts conduct case selection, a process of choosing the optimal mix of underreporting cases to pursue. An optimal mix includes cases that address the following (see IRM 4.1.27.3, Workload Identification for AUR and BUR, for the full list):
Yield the highest assessments or money recovered.
Ensure fair coverage of all taxpayer segments.
The selection of cases is limited to two or three times per year dependent on the workplan. These periods of intense selection activities are called correlation cycles and are based on tax reporting deadlines.
IRCS staff analyze results, perform program reviews, and monitor rule-based applications used to select inventory. The results are used to make data-based decisions to improve program quality, improve case selection, and to ensure the integrity of the selection methods.
Selection is the act of identifying the optimal mix of cases to be delivered to the BUR campus. The process occurs in six phases:
Data Import: Importing the data into the Case Selection (BMFA) Tool.
Data Checks: Data Checks are methods for ensuring the data was received correctly, including volume, calculation data quality, data field, and field population checks.
Analysis: Manually review prior-year results and adjust business rules based on those results.
Business Rules: Build business rules, run the selection tool and view the selected inventory.
Reports: View reports that reflect volumes and average EPA for selected cases, and other statistics.
Export: Export cases to the campus. Inventory is assigned a cycle utilizing a schedule that ensures it will meet the start plan date to conform to the annual work plan.
Generally, IRCS analysts select AUR and BUR inventory three times a year during the correlation process.
≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡
After each correlation, a Summary Report is prepared by IRCS case selection analysts. This manual report documents the selection activities for the respective underreporter program, including executed business rules for each correlation. The summary addresses:
The correlation activities, including case volumes impacted by selection, non-selection and global non-select business rules.
The business justification for each rule, the SQL statement or formula written for the rule, any numeric limitations applied to the rule, the last date the rule was run, and the total cases impacted by each rule, along with details of the selected inventory, including volumes by category, subfile, and subcategory.
Documentation used and any analysis performed during the selection process.
The IRCS program manager is required to review and approve the selection summary after each correlation prior to the export process.
Most of the workload delivery is systemic. Cases identified for referral are manually extracted and delivered in the file format preferred by the recipient.
For AUR, all of the workload delivery is systemic. Inventory that has been selected for opening in each campus operation will be created on the AUR system.
AUR cases are worked by seven campuses. Each campus’s inventory is composed of electronic (ELF) and paper cases.
The capacity at each campus is established based on the campus work plan prepared by PPA. The CISA tool determines the minimum EPA inventory required to meet the campus combined capacity (i.e., corporate capacity).
ELF cases with higher EPA than the capacity to be worked are moved to another campus with the capacity to work the cases.
Paper and ELF cases are distributed equally between the campuses.
Optimization is the movement of cases to ensure each campus has the optimal volume and quality of cases. There are three types of optimization that the CISA tool can produce within each individual subfile:
No Moving: Cases are already assigned an AUR campus based on the geographic location of the taxpayer. The No Moving function allows the tool user to designate a minimum EPA threshold.
Balance by BOD: Previously used to balance inventory between the legacy W&I and SBSE campuses.
Corporate Balance: The capacity at each campus is established and the tool determines the minimum overall EPA required to meet the campuses’ combined capacity (i.e., corporate capacity).
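The Corporate Balance concept can be sketched as finding the EPA floor implied by combined campus capacity: rank all cases by EPA and take the score of the last case that fits within corporate capacity. The campus names, capacities, and EPA values below are illustrative assumptions.

```python
# Sketch of the Corporate Balance concept: given each campus's
# capacity, determine the minimum EPA a case must carry for the
# selected inventory to fill the campuses' combined (corporate)
# capacity. Campus names, capacities, and EPA values are illustrative.

def corporate_epa_floor(epa_scores, campus_capacities):
    """Return the minimum EPA among the top-N scored cases, where N
    is the corporate (combined) capacity."""
    corporate_capacity = sum(campus_capacities.values())
    ranked = sorted(epa_scores, reverse=True)
    selected = ranked[:corporate_capacity]
    return selected[-1] if selected else None

floor = corporate_epa_floor(
    [5000, 300, 1200, 4100, 800, 2500],
    {"Ogden": 2, "Fresno": 2},
)
# floor -> 1200: the four highest EPA scores are 5000, 4100, 2500, 1200
```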
Cases selected utilizing the CISA tool are allocated to each campus using the Export function.
Once the cases are exported, notice is sent to IT requesting that the cases be transferred to the AUR Case Management system via Enterprise File Transfer Utility (EFTU).
The notice provides IT the extract cycle to place the cases in the AUR Case Management system.
Once cases are exported, the selected inventory is controlled by each AUR campus.
When the correlation cycle is finalized, a file of the BUR inventory selected through BMFA is generated for the Information Return Document Matching Data Correlation (IRDMDC) group. The IRDMDC analyst transfers the needed files containing case details (typically Excel spreadsheets) to the shared server via the Enterprise File Transfer Utility (EFTU).
IRCS analysts manually load files to the production database on the shared server.
The campus generates IDRS controls via the GII tool and initiates the posting of TC 925 with Process Code 4030 to Master File.
During AUR case selection, segments of cases can be identified for referral to other compliance programs based on the criteria requested. Rules are written to mark cases with the “EXAM” treatment code based on the referral criteria. The cases identified for referral are excluded from the AUR case selection process and not considered in the selection for regular inventory. IRDMCISA generates a TIN listing of the referral cases with other necessary case data, which is provided to the requestor of the information.
SB/SE Examination PPA monitors program results of the SB/SE campuses. See IRM 126.96.36.199.4, Planning and Performance Analysis.
The table below defines some key terms used in Document Matching.
|AGI||Adjusted Gross Income|
|ASED||Assessment Statute Expiration Date|
|Assessments||A change to the amount of tax on the taxpayer's account; generates a bill or a refund, a new DLN, and/or releases payment and/or freeze code|
|Audit Information Management System (AIMS)||An IDRS control system used by Examination|
|AUR (Automated Underreporter)||Inventory control system used in IMF Underreporter|
|Auto-Generated Notice (AGN)||Cases systemically screened and the CP 2000 and 2501 Notices issued with no Tax Examiner (TE) or clerical handling|
|BMF||Business Master File|
|BOD||Business Operating Division|
|BOE||Business Objects Environment|
|Business Underreporter (BUR)||BMF Underreporter|
|Category Code||A Category is a two-digit numeric code that describes the primary issue on the taxpayer’s return|
|CISA||Case Inventory, Selection and Analytics|
|Correspondence Production Services (CPS)||AUR notices are printed and mailed from one of two CPS. CPS-East is in Detroit and prints/mails for Andover, Atlanta, Brookhaven and Philadelphia. CPS-West is in Ogden and prints/mails for Austin, Fresno and Ogden.|
|CRN||Credit Reference Number|
|CRL||Case Record Layout|
|CSN||Case Sequence Number|
|DCI||Data Collection Instrument|
|Document Locator Number (DLN)||The number assigned to all returns and documents input to the IRS computer system|
|DPAD||Domestic Production Activity Deduction|
|Drop Criteria||Characteristics that indicate a case should be dropped from the correlation or selection process|
|ECC-MEM||Enterprise Computing Center at Memphis|
|ECC- MTB||Enterprise Computing Center at Martinsburg|
|EITC||Earned Income Tax Credit|
|Employer Identification Number (EIN)||A nine-digit number formatted xx-xxxxxxx, used to identify business taxpayers|
|EPA||The EPA is an estimation of the assessment that each case will yield based on the behavior and characteristics of cases from past AUR correlations|
|Extract||A group of SSNs selected from the inventory of cases identified with possible discrepancies|
|Federal Emergency Management Agency (FEMA)||The agency that helps with disaster relief|
|Federal Record Center (FRC)||A place where tax returns are stored outside the campuses|
|FICA||Federal Insurance Contribution Act|
|FMV||Fair Market Value|
|FOIA||Freedom of Information Act|
|FTE||Full-Time Equivalent|
|FTF||Failure to File Penalty|
|FTP||Failure to Pay Penalty|
|Global Non-Select||These are universal rules designed to avoid certain types of cases across the entire inventory and exclude them from all other treatments|
|HSA||Health Savings Account|
|IDRS||Integrated Data Retrieval System|
|IMF||Individual Master File|
|Internal Process Code (IPC)||A numeric/alpha code used for tracking cases within the AUR and BUR programs (does not upload to IDRS)|
|Integrated Submission and Remittance Processing (ISRP)||The automated system that converts all paper documents to electronic form, including payments|
|IRA||Individual Retirement Account|
|IRC||Internal Revenue Code|
|IRDM||Information Reporting and Document Matching|
|IRM||Internal Revenue Manual|
|IRMF||Information Return Master File|
|IRPCA||Information Returns Program Case Analysis|
|IRS||Internal Revenue Service|
|LB&I||Large Business and International|
|Modernized e-File (MeF)||The system used to view electronically filed returns|
|MFT||Master File Tax|
|NIIT||Net Investment Income Tax|
|NOL||Net Operating Loss|
|Non-Select||Cases are non-selected to actively remove them from consideration for AUR/BUR casework|
|OCA||Office of Compliance Analytics|
|OIC||Offer in Compromise|
|POC||Point of Contact|
|PPA||Planning and Performance Analysis|
|PHC||Personal Holding Company|
|Process Codes (PC)||Two- or four-digit numbers used to identify the action taken on a case|
|PRP||Programming Requirements Package|
|PTC||Premium Tax Credit|
|PTIN||Preparer Tax Identification Number|
|QPSC||Qualified Personal Service Corporation|
|QTP||Qualified Tuition Program|
|Referral||A case sent to another area for technical determination|
|Research||Request for additional information needed to continue processing|
|RPS||The Remittance Processing System within ISRP|
|RRB||Railroad Retirement Board|
|RSED||Refund Statute Expiration Date|
|SBSE||Small Business Self Employed|
|Screening||A technical review of information returns compared against the tax return. The Screening phase of the Underreporter Program is also referred to as Analysis|
|SCRIPS||Service Center Recognition/ Image Processing System|
|SERP||Servicewide Electronic Research Portal|
|SLID||Student Loan Interest Deduction|
|Standard Employee Identifier (SEID)||A five-character alphanumeric code that identifies an IRS employee|
|SSA||Social Security Administration|
|Social Security Number (SSN)||A nine-digit number formatted xxx-xx-xxxx, used to identify taxpayer accounts|
|SST||Social Security Tax|
|Subcategory Code||A Subcategory is a one-character alphabetical code (from A through G). The subcategory describes a computer estimate of the tax change based on the under reported amount|
|Subfile Code||A Subfile is a one-character alphanumeric code that identifies the high-level characteristic of the case|
|TAS||Taxpayer Advocate Service|
|Taxpayer Delinquent Account (TDA)||A collection status|
|Taxpayer Information File (TIF)||Individual Master File data from ECC containing tax account and tax transaction information|
|Taxpayer Identification Number (TIN)||A nine-digit number used to identify taxpayer accounts|
|TE/GE||Tax Exempt and Government Entities|
|TPI||Total Positive Income|
|Transaction Code (TC)||An information marker generated through IDRS to describe actions taken|
|UWR||Uniform Work Request|
|W&I||Wage & Investment|
|WRMS||Work Request Management System|
The following table is a list of Category Codes used in the AUR selection process.
|01||100% Mortgage and/or points paid|
|02||IRAs Over-deducted on Form 5498|
|05||50% Gross Receipts w/NEC/Fishing income/Bartering|
|06||Education Credit 80% (Form 1098-T present)|
|09||100% Interest or Dividends|
|13||Pensions and Annuities (1099-R) Taxable|
|14||Pensions and Annuities (1099-R) Gross|
|18||Tuition and Fees (Over-deducted)|
|19||Rents and Royalties|
|22||Distributive Share - Form 1065 and Form 1120-S|
|23||Distributive Share - Form 1041|
|28||Payments in Lieu of Dividends|
|29||Cancellation of Debt|
|30||Qualified Tuition Program Earnings|
|31||Securities Sales - 100 or fewer IRs|
|32||Early Withdrawal Penalty|
|35||State Income Tax Refund|
|36||Mortgage Interest Deduction|
|37||SEP Contribution 80%|
|38||Reemployment Trade Adjustment Assistance (RTAA) - Form 1099-G|
|39||Securities Sales > 100 IRs|
|40||Gross Capital Gain|
|41||Combination (not Categories 02, 31)|
|42||Early Distribution 10% Tax ($150 or more)|
|43||Bartering Income Discrepancy|
|44||Dependent Care Benefits|
|45||IRA Overdeduction AGI dollar limitation|
|50||Self-Employment Income Discrepancy|
|51||80% Taxable Pension and Annuity Distribution Discrepancy|
|57||NEC on Schedule F and/or Form 4835|
|58||Medical Savings Account Distribution|
|59||Medical Savings Account Contribution|
|60||Gross Long-Term Care Benefit|
|61||Securities Sales, 100 or fewer Stock IRs, No Sch D|
|62||Student Loan Interest Deduction (SLID) (O/D)|
|63||Health Saving Account Distributions|
|64||Health Saving Account Contributions|
|65||Real Estate Income Discrepancy|
|66||Taxable Pension and Early Distribution 10% Tax|
|67||100% Mortgage and/or points paid - no Form 1098|
|68||Tuition and Fees (Over-deducted) - no Form 1098-T|
|69||Education Credits (over claimed) - no Form 1098-T|
|70||SSB/RRB plus Taxable Pension discrepancy is at least 60%|
|71||Wages plus withholding discrepancy is at least 60%|
|79||Securities Sales > 100 Stock IRs, No Sch D|
|99||When case cannot be assigned to Categories 01 - 79|
The following table is a list of Subfile Codes used in the AUR selection process.
|6||AGI per return is $50,000 - $124,999|
|8||Cases with a Form 1099-K discrepancy of $100 or more|
|9||Returns filed with ITIN|
|B||Credit Tax Change (W/H)|
|C||Stock, Bond or Real Estate Discrepancy|
|D||UR Income Greater Than $10,000|
|E||UR and EIC Present|
|F||Mortgage Interest Over-deduction|
|G||UR Income Greater than $10,000/Repeaters|
|H||Form 1040NR (International with ITIN)|
|I||Potential Unproductive Repeaters|
|K||Discrepancy w/EIN Document|
|M||Cases with K1 Discrepancy of $100 or more|
|P||Identity Theft cases with posted TC 971 w/AC 501, 504, 505, 506, 522, 523, 524 and/or 525|
|S||Form 1040NR (International with SSN)|
|T||AGI per return is less than $50,000|
|U||AGI per return is $125,000 or more|
The following table is a list of Subcategory Codes used in the AUR selection process.
|SUBCATEGORY CODE||POTENTIAL TAX CHANGE|
|≡ ≡ ≡||≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡|
|≡ ≡ ≡||≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡|
|≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡|
|≡||≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡|
|≡ ≡ ≡||≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡|
|≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡|
|≡||≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡|
The following table is a list of Treatment Codes used in the AUR selection process.
|CODE||DESCRIPTION||AVAILABLE IN FUTURE CORRELATIONS|
|CTR||Control Group 1||No|
|CT2||Control Group 2||No|
|OCA||Office of Compliance Analytics||No|
The following table is a list of Category Codes used in the BUR selection process. Category codes are listed in priority order.
|CODE||CATEGORY||DESCRIPTION|
|04||Non-Employee Compensation||Gross Receipts is at least 50% of the discrepancy, and there is a Non-Employee Compensation IR with no other IR Amounts related to Gross Receipts.|
|12||Interest and Dividends||Interest and Dividends are at least 50% of the amount of the total case discrepancy.|
|10||Interest||Interest is at least 50% of the amount of the total case discrepancy.|
|11||Dividends||Dividends are at least 50% of the amount of the total case discrepancy.|
|40||Capital Gains||Capital Gains is the largest amount of the total case discrepancy.|
|31||Securities||Security Sales is the largest amount of the total case discrepancy.|
|17||Fishing||Gross Receipts is at least 50% of the discrepancy, and there is a Fishing IR with no other IR Amounts related to Gross Receipts.|
|18||Rents||Rents is at least 50% of the amount of the total case discrepancy.|
|19||Royalties||Royalties is at least 50% of the amount of the total case discrepancy.|
|20||Farming||Farming Income is at least 50% of the amount of the discrepancy.|
|21||Medical Payments||Gross Receipts is at least 50% of the discrepancy and there is a Medical Payments IR with no other IR Amounts related to Gross Receipts.|
|22||Positive Distributive Share Income||Positive Distributive Share Income is at least 50% of the amount of the discrepancy.|
|27||Misc Other Income||Other Income is at least 50% of the discrepancy and Misc Other Income is the only IR for that line.|
|43||Bartering||Gross Receipts is at least 50% of the discrepancy, and there is a Bartering IR with no other IR Amounts related to Gross Receipts.|
|48||Payment Card Transactions||Payment Card Transactions is the only case discrepancy.|
|57||Gross Receipts/Other Income Combo||Gross Receipts/Other Income types are at least 50% of the discrepancy, and there is more than one IR type present on the case for that issue.|
|34||Interest Combo||Cases where interest is discrepant along with other discrepancies and no one income type meets the 50% criteria on the case.|
|32||Withholding||Cases where withholding is the largest amount of the Total Case Discrepancy.|
|99||Other||Cases not assigned to the above categories.|
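Because category codes are evaluated in priority order, a case receives the first category whose rule it satisfies. The sketch below illustrates that first-match logic for a few of the 50%-of-discrepancy tests above; the function name, the dictionary field names, and the reading of the combined Interest-and-Dividends rule (both income types present) are illustrative assumptions, not the actual selection system.

```python
def assign_bur_category(case):
    """Return the first matching BUR category code, checked in priority order.

    Only a handful of the categories listed above are sketched here.
    Each predicate compares an income type against the total case
    discrepancy, per the table's 50% tests.
    """
    total = case["total_discrepancy"]
    interest = case.get("interest", 0)
    dividends = case.get("dividends", 0)
    # Category 12: Interest and Dividends together are at least 50% of the
    # discrepancy (assumed here to require both types on the case).
    if interest + dividends >= 0.5 * total and interest > 0 and dividends > 0:
        return "12"
    if interest >= 0.5 * total:
        return "10"   # Interest
    if dividends >= 0.5 * total:
        return "11"   # Dividends
    if case.get("rents", 0) >= 0.5 * total:
        return "18"   # Rents
    return "99"       # Other: no listed rule matched
```

A case with a $30 interest and $30 dividend discrepancy against a $100 total falls to category 12, while $60 of interest alone falls to category 10.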
The following table is a list of Subfile Codes used in the BUR selection process. Subfile codes are listed in priority order.
|CODE||SUBFILE||DESCRIPTION|
|B||LB&I||Cases with BOD code indicating LB&I case.|
|N||NOL Recovery||NOL deduction claimed and a prior year BUR adjustment.|
|Y||High Underreporter and Multi-year||High Underreporter and Multi-Year Repeater.|
|R||Multi-Year Repeater||Multi-Year Repeater.|
|U||Potential Unproductive Repeater||Multi-Year Repeater Unproductive in one of the three preceding years.|
|P||Medical Payments||Medical payment discrepancy of $1,000 or more.|
|E||Exam||The tax form is 1120S or 1065 with over $500K in Gross Receipts and there is a mismatch of at least 50% of Gross Receipts, OR there is a TC 420/424 on the tax module.|
|X||Personal Service Corp Tax Verification||NAICS codes: 541110, 541190, 541211 - 541219, 541310 - 541380, 541400, 541511 - 541990, 621111 - 621498, 711100 - 711510 and Form 1120, Sch J, line 2 is < 35% of Form 1120 line 30.|
|L||Low Dollar Yield||Unreported Income of less than $3,000.|
|H||High Under Reporter||High Underreporter ($100,000 or more).|
|S||Securities||Cases with a Securities discrepancy $1,000 or more.|
|M||Payment Card Transactions||Cases with a Payment Card Transactions discrepancy over $1,000.|
|K||K-1||Schedule K-1 discrepancy of $1,000 or more.|
|C||Schedule O||Controlled Group (Sch O) is filed.|
|W||Withholding||Withholding $350 or more.|
|O||Other||Other (no previous subfile applies).|
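Subfile assignment works the same way as category assignment: the first subfile whose rule matches, in the priority order above, wins (with O as the fallback). A sketch of a few rows follows; the field names and predicates are illustrative assumptions only, and most rows are omitted.

```python
def assign_bur_subfile(case):
    """Return the first matching BUR subfile code, checked in priority order.

    Sketches rows B, Y, R, L, W, and the fallback O from the table above;
    the intervening rows (N, U, P, E, X, H, S, M, K, C) are omitted.
    """
    if case.get("bod_code") == "LB&I":
        return "B"   # LB&I case
    repeater = case.get("multi_year_repeater", False)
    if repeater and case.get("high_underreporter", False):
        return "Y"   # High Underreporter and Multi-Year Repeater
    if repeater:
        return "R"   # Multi-Year Repeater
    # Row L precedes row W in the table's priority order.
    if case.get("unreported_income", 0) < 3_000:
        return "L"   # Low Dollar Yield
    if case.get("withholding", 0) >= 350:
        return "W"   # Withholding $350 or more
    return "O"       # Other: no previous subfile applies
```

Note the ordering effect: a repeater case with $400 of withholding still receives subfile R, because R outranks W in the priority list.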
The following table is a list of Subcategory Codes used in the BUR selection process.
|SUBCATEGORY||POTENTIAL TAX CHANGE|
|≡ ≡ ≡||≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡|
|≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡|
|≡ ≡||≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡|
|≡||≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡|
|≡ ≡ ≡||≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡|
|≡||≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡|
|≡||≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡|
|≡ ≡||≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡|
The following table is a list of Treatment Codes used in the BUR selection process.
|CODE||DESCRIPTION||AVAILABLE IN FUTURE CORRELATIONS|
|OCA||Office of Compliance Analytics||No|
|CTR||Control Group 1||No|
|CT2||Control Group 2||No|
|3CO||Control Group 3||No|