4.1.26 Campus Exam Return Selection, Delivery and Monitoring

Manual Transmittal

December 13, 2023

Purpose

(1) This transmits revised IRM 4.1.26, Campus Exam Return Selection, Delivery and Monitoring.

Material Changes

(1) IRM 4.1.26.1.6, added acronyms.

(2) IRM 4.1.26.2, under (2)a) added sentences to the threshold determination paragraph.

(3) IRM 4.1.26.3.1, added information relevant to Non-EITC credits.

(4) IRM 4.1.26.7, added Review of Program Operation and Performance.

(5) Reviewed and updated IRM references and links and made editorial changes throughout this IRM.

Effect on Other Documents

IRM 4.1.26, Campus Exam Return Selection, Delivery and Monitoring, dated 12/07/2022, effective date 01/01/2023 is superseded.

Audience

This IRM is intended for use by all stakeholders involved in the campus examination workload identification process, including Wage & Investment (W&I) Refundable Credits Program Management (RCPM) and Small Business/Self-Employed (SB/SE) Campus Examination.

Effective Date

(01-01-2024)

Ishmael P. Alejo
Director, Refundable Credits Program Management (RCPM)
Wage and Investment Division (W&I)

Program Scope and Objectives

  1. Purpose: This IRM provides guidance to personnel responsible for developing criteria for audit return selection for the Refundable Credits Examination Operation (RCEO) and Small Business/Self-Employed (SB/SE) Campus operations and to ensure that internal controls exist for the workload identification processes. The IRS Mission statement includes enforcing the tax law with integrity and fairness to all. As Examination employees, we must provide the best possible service to the public and are expected to perform our duties with integrity and fairness to all. See IRM 1.2.1.2.36.

  2. Audience: This IRM is intended for use by all stakeholders involved in the campus examination workload identification process, including Wage & Investment (W&I) Refundable Credits Program Management (RCPM) and Small Business/Self-Employed (SB/SE) Campus Examination.

  3. Policy Owner: The Director of Refundable Credits Program Management (RCPM) owns the policy information contained in this IRM.

  4. Program Owner: W&I and SB/SE Headquarters analysts are responsible for the administration of and updates to the content.

  5. Primary Stakeholders:

    • W&I

    • SB/SE

  6. Contact Information: To recommend changes or make any other suggestions for this IRM section, send an e-mail to the IRM author or use the Servicewide Electronic Research Program (SERP) Feedback Application.

Background

  1. This IRM was created to provide guidance on workload selection criteria for W&I and SB/SE Campus Examination and to ensure the organization is meeting the IRS Mission Statement to, "Provide America’s taxpayers top-quality service by helping them understand and meet their tax responsibilities and enforce the law with integrity and fairness to all."

Authority

  1. Exam uses the Internal Revenue Code, Regulations, Policy Statements, and Correspondence Examination Policy and Procedures. The IRM has links to the appropriate sources, as necessary.

Responsibilities

  1. W&I’s Correspondence Examination audit inventory is primarily selected systemically using risk-based scoring criteria. W&I’s Refundable Credits Program Management (RCPM) Operation has primary responsibility for developing and maintaining these criteria. They also have primary responsibility for developing the W&I Examination work plan, which outlines the volumes and timeframes of audit initiations based on existing resources and using the available scored returns. The work plan is meant to be used as a guide which can be modified based on existing priorities. The actual audits are conducted by W&I Refundable Credits Examination Operations (RCEO), which consists of five campus operations located in Andover, MA; Atlanta, GA; Austin, TX; Kansas City, MO; and Fresno, CA.

  2. SB/SE Campus Case Selection (CCS) is responsible for the case selection and delivery of SB/SE correspondence examinations. CCS resides under Examination Headquarters and the staff consists of two teams: Campus Workload Identification (CWI) and Campus Workload Delivery (CWD).

  3. RCEO and CCS work collaboratively with their respective workload planning staffs, Refundable Credits Program Management: Program Management (RCPM:PM) for RCEO and Performance Planning & Analysis Examination for CCS, to develop annual audit start plans. Start plans (also known as work plans) outline the estimated volumes by issue and the proposed timeframes for initiation.

  4. Potential available inventories are filtered, prioritized, and introduced into the campus work stream based on the start plan.

  5. Headquarter analysts in both W&I and SB/SE provide campus support and guidance on workload selection and delivery-related issues.

  6. The primary responsibility of all stakeholders in the workload identification and selection processes is to ensure that the organization is meeting the IRS Mission statement to provide America’s taxpayers top-quality service by helping them understand and meet their tax responsibilities and enforce the law with integrity and fairness to all.

Program Management and Review

  1. Headquarters analyzes audit results, performs program reviews, and monitors the rule-based applications used to select inventory. The audit results are used to make data-based decisions that improve program quality, improve case selection, and ensure the integrity of the selection methods.

    • W&I Program Management (PM) is responsible for providing oversight and monitoring the performance results of the EITC and Non-EITC programs at the Refundable Credits Examination Operations (RCEO) campuses.

    • SB/SE Examination Performance Planning and Analysis provides support and monitors program results of the SB/SE campuses.

Program Controls

  1. W&I and SB/SE Headquarters analysts provide program support to analyze audit results, perform program reviews, and monitor the rule-based applications used to select inventory. The selected inventory results are used to make data-based decisions to improve program quality and case selection and to ensure the integrity of the selection methods.

Acronyms

  1. Most of the acronyms used by Examination can be located on the acronym database at http://rnet.web.irs.gov/Resources/Acronymdb.aspx

  2. See table below for common acronyms used in this IRM.

    Acronym - Definition
    BLP - Batch Leveraging Processing
    CCS - Campus Case Selection
    CWD - Campus Workload Delivery
    CWI - Campus Workload Identification
    CWPA - Campus Workload Planning and Analysis
    DDB - Dependent Database
    DEBR - Discretionary Exam Business Rules
    EQTS - Exam Quality and Technical Support
    RCEO - Refundable Credits Examination Operation
    RCPM - Refundable Credits Program Management
    RME - Rules Maintenance Engine
    RPT - Revenue Protection Technology
    UCP - Unattended Case Processing

Related Resources

  1. Examination employees are responsible for researching and utilizing information contained in all reference materials. Other IRM chapters provide information on single topics that pertain to more than one functional group. The following table provides links to some of the most commonly used research resources. For information on other IRMs, refer to http://publish.no.irs.gov/pubsys/irm/numind.html.

    Reference Link
    IRM Part 3, Submission Processing http://publish.no.irs.gov/pubsys/irm/indp03.htm
    IRM Part 4, Examining Process http://publish.no.irs.gov/pubsys/irm/indp04.htm
    IRM Part 21, Customer Account Services http://publish.no.irs.gov/pubsys/irm/indp21.htm

Cases Selected for Examination

  1. The determination of the volume of inventory selected for audit by program should consider a variety of factors. Among those factors are:

    • Issue coverage

    • Revenue protection

    • Taxpayer burden

    • Prior audit results (including "agreed" and "no change" rates)

    • Fairness and integrity (see IRM 1.2.1.2.36)

    • Available resources

    • Program specific funding

    • Level of automation

    • Reliability of data sources

    • Taxpayer Bill of Rights

  2. There are two key factors that should be considered when creating selection criteria: how the selected inventory will assist in meeting the pre-defined program level objectives of Service and Enforcement, and whether inventory is selected without bias, ensuring "fairness" in the inventory selection processes.

    1. One component of meeting program level objectives is ensuring that resources are used effectively and efficiently. ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ The document is approved by the SB/SE CCS Program Manager and is stored in the SB/SE CCS SharePoint site.

    2. The W&I RCPM Director with primary responsibility for developing and maintaining the selection criteria for the Refundable Credit programs will conduct an annual certification to determine that the process used for Correspondence Exam case selection is impartial, fair and administered with the highest degree of integrity using the Form 15047, Audit Selection Internal Control Certification. The certification form will be completed and signed by the RCPM Director by September 30th each year. The certification form outlines the audit selection objectives and contains a series of questions to measure to what extent those objectives were met. If a “no” is noted for any of the requirements, an explanation is required in the “Comments” section. An approved risk assessment is mandatory for each requirement with a “no” response. The formal assessment will outline the risk(s). The signed risk assessment will be attached to the signed Form 15047, Audit Selection Internal Control Certification, and kept electronically by Program Management (PM) for a period of five years along with the DDB Meeting(s) minutes, Unified Work Requests (UWR), and signed Form 14747, Workload Identification Change Request Modification Approval.

  3. There are three parts to the fairness aspect when developing selection criteria for audit:

    • Fairness to the taxpaying public by pursuing those who fail to voluntarily comply or otherwise meet their tax obligations

    • An equitable process that selects returns for examination based on the likelihood of reporting errors across all areas of potential noncompliance and

    • Fairness to the individual taxpayers who are being examined by respecting and adhering to their rights.

W&I Correspondence Examination Guidance for Fairness in Case Selection

  1. To ensure fairness to the taxpaying public, those responsible for workload selection should consider the responsibilities and obligations that all taxpayers share, and pursue those individuals and businesses who do not comply with their tax obligations based purely on eligibility using data provided by the taxpayer or from reliable third parties. In this way, we are being fair to those who are compliant and that, in turn, helps promote public confidence in our tax system for all taxpayers.

  2. To ensure an equitable process for all taxpayers, fairness and integrity are built into the foundation of our return selection process, which is designed to select returns across those submitted with the highest likelihood of noncompliance by relying on a combination of tools. The entire process operates under a comprehensive set of checks, balances, and safeguards, all aimed at delivering and ensuring a process that is fair by design. No one individual can control the examination selection decision-making process, and we limit involvement to only those employees whose duties require participation. This creates a process that is impartial and applied consistently to each taxpayer return.

  3. To ensure fairness to each taxpayer whose return is examined, those responsible for workload identification and selection will work with a focus on taxpayer rights, a responsibility that is a priority for all IRS employees in their work every day. These taxpayer rights are embodied in the Taxpayer Bill of Rights, which outlines the 10 fundamental rights taxpayers have when working with the IRS. Managers and examiners adhere to administrative and legislative procedures, including managerial and quality reviews. Also, IRS employees are managed and evaluated on how well we provide fair and equitable treatment to taxpayers as required by the Restructuring and Reform Act of 1998. Finally, taxpayers may administratively appeal most IRS decisions, including the assessment of additional tax or penalties or the denial of a refund claim. An employee in the Independent Office of Appeals, an independent and impartial function within IRS, will contact the taxpayer, hear the case, and decide whether to sustain the results of the examination. Most taxpayers can also petition the U.S. Tax Court for a pre-assessment review of any proposed additional tax or seek a refund in other federal courts.

SB/SE Campus Examination Guidance for Fairness in Case Selection

  1. SB/SE supports administration of tax law by selecting returns to audit. The primary objective in selecting returns for examination is to promote the highest degree of voluntary compliance on the part of taxpayers while making the most efficient use of finite examination staffing and other resources.

  2. SB/SE Examination’s program-level objective addressing fairness in return selection is as follows: Ensure examinations are initiated based on indicators of noncompliance or on other criteria (such as selection for the National Research Program) identified in the Internal Revenue Manual. In addition, ensure that decisions to survey a return (i.e., not initiate an examination) are based upon factors outlined in the Internal Revenue Manual and approved by an appropriate level of management.

  3. SB/SE employees must exercise their professional judgment, not personal opinions, when making return selection decisions. As explained in Policy Statement P-1-236, IRS employees are expected to carry out their duties with integrity and fairness.

    • To ensure fairness to the taxpaying public, our Examination Workplan provides a balanced approach for return delivery and allocation of resources to address areas of the Tax Gap by considering factors such as income levels and return types.

    • To ensure an equitable process for all taxpayers, return selection decisions are made utilizing available experience and/or statistics indicating the probability of substantial error. No one individual can control the examination selection decision-making process. We limit involvement to only those employees whose duties require them to be included.

    • To ensure fairness to each taxpayer whose return is selected, individual return selection decisions are based on the information contained on the taxpayer’s return and/or the underlying relevant tax law. Managerial as well as quality reviews of selection decisions occur during each phase of the selection and assignment process.

Sources of Potential Examinations

  1. Examination Headquarters identifies potential casework from many different sources. When new sources of potential examinations are identified, Headquarters will use IRS applications to select and deliver the inventory to the campus Exam Operations. The main sources of potential examination workload are described in the following sections. An important distinction is made between inventory referral criteria and inventory selection criteria. Examination Headquarters functions responsible for inventory selection are not responsible for the criteria used to identify cases to be referred to Campus Examination from outside areas, even though they may be asked for input. Examples of these types of referrals are Criminal Investigation (CI), Questionable Refund Program (QRP) and Exam Quality and Technical Support (EQTS) referrals.

  2. ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡≡ ≡ ≡ ≡ ≡ ≡≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡

Dependent Database (DDB)

  1. ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡

    1. The DDB identifies potential non-compliance relevant to the EITC and other tax benefits related to the dependency exemptions, based on the relationship and residency of children. EITC returns selected for audit by the DDB application must comply with the tax laws for claiming EITC, as well as other tax issues, such as dependents, filing status, Child and Dependent Care Credit, Child Tax Credit, education benefits and other refundable credits.

    2. The DDB identifies potential non-compliance issues relevant to Non-EITC refundable credits, including Fuel Tax Credit, Claim of Right Credit, Education Credit and Child Tax Credit.

    3. The DDB identifies potential non-compliance issues relevant to PTC and other tax benefits, which includes Advanced PTC. The DDB selects returns with discrepancies related to eligibility, participation, premiums, and Second Lowest Cost Silver Plan.
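
    The DDB selection logic described above is, in essence, a rule engine: each return's data is evaluated against a set of eligibility rules, and the rules that fire indicate potential non-compliance for possible selection. The following is a minimal illustrative sketch of that general pattern only; the return fields, rule names, and conditions are hypothetical placeholders and do not represent actual DDB rules, scores, or tolerances.

      # Minimal illustrative sketch of rule-based return screening.
      # All rule names, return fields, and conditions are hypothetical.
      from dataclasses import dataclass
      from typing import Callable, Dict, List

      @dataclass
      class Rule:
          name: str                            # hypothetical rule identifier
          issue: str                           # credit or issue the rule addresses
          condition: Callable[[Dict], bool]    # returns True when the rule "fires"

      RULES: List[Rule] = [
          Rule("R001", "EITC", lambda r: r["claims_eitc"] and not r["child_residency_supported"]),
          Rule("R002", "CTC",  lambda r: r["claims_ctc"] and not r["child_relationship_supported"]),
          Rule("R003", "AOTC", lambda r: r["claims_aotc"] and not r["education_institution_match"]),
      ]

      def evaluate_return(ret: Dict) -> List[str]:
          """Return the names of all rules that fire for a single return."""
          return [rule.name for rule in RULES if rule.condition(ret)]

      if __name__ == "__main__":
          sample_return = {
              "claims_eitc": True,
              "claims_ctc": False,
              "claims_aotc": True,
              "child_residency_supported": False,
              "child_relationship_supported": True,
              "education_institution_match": True,
          }
          print(evaluate_return(sample_return))  # ['R001']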

Process to Create/Modify DDB Rules
  1. These rules may be modified or created through a Unified Work Request (UWR). Annual rule modifications are accomplished through a collaborative effort with IT, W&I and SB/SE Business Analysts, and senior management utilizing prior audit results, program review results, feedback from tax examiners, and legislative changes. Stakeholders involved in this yearly collaboration include representatives from Return Integrity and Compliance Services (RICS), Small Business/Self Employed Division (SB/SE) Exam, and Information Technology (IT).

DDB Approval Path
  1. Recommendations for modifying existing rules and creating new rules in the DDB process are discussed with senior managers and Headquarters analysts of the impacted stakeholders to reach a consensus. Agreed upon recommendations are vetted with IT prior to submission. Approved revisions are entered into the Work Request Management System (WRMS) by the business unit working with IT. The WRMS is initiated by RCPM with input from SB/SE, RICS, and IT. Both the Dependent Database owner and IT programmers must concur with the changes prior to implementation.

Amendments to the Approved Unified Work Request Process
  1. When it is necessary to revise or correct selection requirements outside of the normal Unified Work Request (UWR) timeframes, the analyst must secure management approval using Form 14747, Change Request Modification Approval. The form must provide the following information and have executive approval prior to implementation:

    • The type of modification requested (i.e., correction, enhancement, or TIGTA/GAO Mandate)

    • A description of the change requested, and

    • The reason for the modification requested

  2. Upon approval, the signed Form 14747 must be forwarded to the Senior Analyst of the RCPM PM group to be maintained.

Verification of Rule Function for DDB
  1. Verification of Rule Function - As each processing year commences and tax returns begin to fire DDB rules, RCPM conducts sample reviews of selections to ensure the rules are functioning correctly. For each program reviewed, documentation is prepared that shows the project code reviewed, the results of the review, description of defects found, and steps taken to resolve or mitigate defects found. The information is shared with management for approval and/or resolution.

Discretionary Exam Business Rules (DEBR)

  1. The Discretionary Exam Business Rules (DEBR) are a subset of the workload selection rules built into the Dependent Database (DDB) programming.

  2. ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡

  3. The rule filters are programmed by IT and applied against all returns as they are filed. Each processing cycle, IT provides the business unit with an analysis, via Business Objects, of the number of returns meeting the business rules. As inventory is needed to meet the work plan, Exam CWI selects the volume and type of cases to deliver to the campuses. IT runs these through standard screening filters and then establishes the cases on the Audit Information Management System (AIMS). Certain types of cases require additional screening by a Headquarters Analyst before being assigned to a campus. These are specified in the annual rules package. Any cases researched/screened in this manner can be established on AIMS by the analyst themselves, submitted to IT through an ad hoc AIMS request, or sent to CWD.

Process to Create/Modify DEBR Rules
  1. Process for criteria change - SB/SE CCS holds an annual meeting with the IT DDB staff representatives and other impacted stakeholders to discuss the effectiveness of the rules and to finalize recommended revisions to the selection criteria and processing for the upcoming filing season. In addition to providing valuable feedback to the businesses, IT’s primary role in these workshops is to ensure the rules requested can be carried out within the limitations of the systems available. Consensus of all impacted stakeholders is needed to submit the final revised recommendations of the selection criteria for SB/SE Senior Management approval. The SB/SE CCS DEBR analyst creates a DEBR IT program package and a summary of proposed rule changes for the upcoming processing year as a result of this meeting.

DEBR Approval Path
  1. All rule modifications and additions, agreed upon by the business stakeholders and vetted by IT during the annual review process, are submitted to the SB/SE CCS Program Manager for approval by the SB/SE CCS DEBR analyst.

Verification of Rule Function for DEBR
  1. Verification of Rule Function - As each processing year commences and tax returns begin to fire DEBR rules, SB/SE CCS analysts conduct sample reviews to ensure the rules are functioning correctly. Identified defects are brought to the attention of the responsible IT representative. The reviewing SB/SE analyst coordinates corrective action with IT. For each rule reviewed, a document is created that shows the rule reviewed, the results of the review, a description of defects found, and steps taken to resolve or mitigate defects found. Additionally, this document contains the TINs reviewed and how they tested against each specific rule condition. The final document is reviewed and signed by the SB/SE CCS Program Manager.
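
    The verification described above amounts to re-testing a sample of selected cases against each condition of the rule that selected them and documenting any mismatches as defects. The sketch below illustrates that general pattern under assumed data structures; the rule conditions, tolerance, and case records are hypothetical, and actual review documents follow CCS procedures.

      # Minimal illustrative sketch of rule-function verification on a sample of cases.
      # The rule conditions, tolerance, and case data are hypothetical.
      from typing import Callable, Dict, List

      # Hypothetical rule expressed as named conditions that must all be true to fire.
      RULE_CONDITIONS: Dict[str, Callable[[Dict], bool]] = {
          "claims_credit": lambda c: c["claims_credit"],
          "amount_over_tolerance": lambda c: c["credit_amount"] > 1000,   # assumed tolerance
          "no_supporting_income_doc": lambda c: not c["income_documented"],
      }

      def review_sample(sample: List[Dict]) -> List[Dict]:
          """Re-test each sampled case against every rule condition and flag defects."""
          results = []
          for case in sample:
              condition_results = {name: cond(case) for name, cond in RULE_CONDITIONS.items()}
              results.append({
                  "tin": case["tin"],
                  "conditions": condition_results,
                  # Defect: the case was selected even though some condition did not hold.
                  "defect": not all(condition_results.values()),
              })
          return results

      if __name__ == "__main__":
          sampled_cases = [
              {"tin": "XXX-XX-0001", "claims_credit": True, "credit_amount": 2500, "income_documented": False},
              {"tin": "XXX-XX-0002", "claims_credit": True, "credit_amount": 400, "income_documented": False},
          ]
          for row in review_sample(sampled_cases):
              print(row["tin"], "defect" if row["defect"] else "ok", row["conditions"])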

Compliance Data Environment (CDE)

  1. Compliance Data Environment (CDE) is a workload identification, planning and delivery system that operates in a web-based environment. It can be used to filter, order, classify, and deliver returns for examination. In CCS, CDE is primarily used to analyze return data meeting specified criteria (rules) to produce a population of returns having audit potential. SB/SE CCS analysts gather CDE output and perform further classification and filtering to make the final case selections. More information on CDE functionality may be found in IRM 4.103, Compliance Data Environment (CDE).

Process to Create/Modify CDE Rules
  1. SB/SE CCS analysts are responsible for monitoring and determining necessary changes to the rules/filters for the project codes that they are responsible for delivering. Project code reviews, directed by CCS management, are conducted by the assigned analysts. Considering the findings of the reviews, the performance of the project, analysis of the statistics, and feedback from the campus, each analyst will determine necessary rule changes before pulling the bulk of their inventory each year in CDE. The assigned CWI analyst will sign into CDE, select the rule, and make changes as needed before running the rule. For any CDE rule changes beyond annual updates of tax year and cycle, the analyst must secure approval from the CCS Program Manager before using the rule for selecting cases.

CDE Approval Path
  1. All rule modifications and additions agreed upon by the CWI analyst, CWI manager, and CCS Program Manager are placed in the updated Project Code Procedure Documents on the CCS SharePoint, and then used to select inventory.

Referrals

  1. Referrals are cases identified by functions other than Examination that have audit potential. Cases are referred from the Lead Development Center (LDC) for Criminal Investigation (CI), Treasury Inspector General for Tax Administration (TIGTA), Collections, Return Integrity and Verification Operations (RIVO), Submission Processing (SP), and other areas.

Unallowables
  1. The Unallowable (UA) Code Program is a Compliance initiative used to identify potential audit inventory during the processing of the original tax return. The Unallowable program identifies issues on a return that are unallowable by law to prevent erroneous tax refunds on both IMF (Individual Master File) and BMF (Business Master File) tax returns. The program requires coordination between W&I, SB/SE, Large Business and International (LB&I) Examination, and Submission Processing (SP) Headquarters to ensure the correct types of Unallowable Conditions are being identified and to provide guidance to impacted employees. Returns identified for audit potential are referred to Exam.

Process to Create/Modify Unallowables Criteria
  1. Submission Processing has the primary oversight of the Unallowable program; however, the criteria for Exam referrals are developed in conjunction with Exam.

Unallowables Approval Path
  1. Referral criteria are outlined in SP IRM 3.11.3, Individual Income Tax Returns. The procedures for working the examination are outlined in the Correspondence Exam IRM 4.19.14.18, Unallowable Code (UA) Program. These IRMs are reviewed by Senior Management and are signed off by executives.

    Note:

    ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡

Math Error
  1. Math (or clerical) errors are defined by IRC 6213(g)(2). This legislation permits assessment of additional tax resulting from math or clerical errors. Submission Processing contacts the taxpayer for supporting documentation. If the taxpayer disputes the request and examination criteria are met, the case is forwarded to Exam for review and validation.

  2. Accounts Management (AM) systemically verifies and scores EITC, CTC/ACTC, and AOTC math errors using Command Code (CC) DDBCK. The criterion for classification is similar to regular Exam DDB classification. If selected, AM will send the case to Exam.

  3. ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡

Process to Create/Modify Math Error Criteria
  1. Submission Processing (SP) has the primary oversight of the Math Error program; however, SP collaborates with W&I and SB/SE Exam to develop referral criteria. Referral criteria are outlined in SP IRM 3.12.3, Individual Income Tax Returns. The procedures for working the examination are outlined in the Correspondence Exam IRM 4.19.14.10.1, Math Error Referrals to Examination, and IRM 4.19.15.10, Math/Clerical Error. When the Exam, AM, or SP functions identify the need for a possible change to a specific issue or tolerance, or when another function such as TAS elevates an issue to Exam, AM, or SP Headquarters for consideration, the functions meet to discuss the business reasons and customer impact for the potential change.

Math Error Approval Path
  1. If Exam Headquarters analysts from both W&I and SB/SE agree that a change is warranted, the Exam analysts discuss the proposed changes and alternatives with their respective executives and secure written approval or advice. The coordinating analyst will then advise the responsible Customer Account Services (CAS) Headquarters analyst of the changes to be made to the Math Error criteria. After receiving concurrence, the CAS Headquarters analyst posts the IRM Procedural Update (IPU) to SERP. These IRMs are reviewed by senior management and are signed off by executives.

Questionable Refund Program (QRP)
  1. QRP cases are referred to Examination by the Return Integrity and Verification Operations (RIVO). The RIVO utilizes the Electronic Fraud Detection System (EFDS) to screen paper and electronically filed returns to verify the accuracy of taxpayers’ wages and withholding. Cases with refundable credits above the referral threshold, in addition to false/inflated income and withholding, are referred to Examination.

Process to Create/Modify QRP Criteria
  1. Return Integrity and Compliance Services, Return Integrity and Verification Operations (RIVO) has ownership of the verification process. RCPM assists RIVO with developing and revising criteria for cases referred to Exam.

QRP Approval Path
  1. ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡

QRP Model Updates
  1. Early in the filing season, the DM Team will develop Requirements, Models and Fraud Definitions. These findings are presented to stakeholders and impacted organizations (RICS, CI, SB/SE, EXAM, MDD, DM, Office of Compliance Analytics (OCA) and Application Development) during DM Decision Point sessions. Any consolidation, deletion, or introduction of models is presented to the Revenue Protection Technology (RPT) Governance Board for approval.

Alternative Minimum Tax (AMT)
  1. Alternative Minimum Tax (AMT) program cases are identified by Submission Processing. The AMT program identifies taxpayers who are liable for the AMT but have not completed or attached Form 6251, Alternative Minimum Tax - Individuals.

  2. Submission Processing notifies taxpayers that they appear to be liable for the AMT by issuing Letter 12-C. If the taxpayer does not respond timely to a request for substantiation, or disagrees, the return is coded as an Audit Code P. This establishes the case on AIMS and the case is sent to be processed through the Unattended Case Processing (UCP) system. Inventory is filtered and classified by CWD and the workable inventory is delivered to the Batch Leveraging Process (BLP) team.

Process to Create/Modify AMT Criteria
  1. Submission Processing has the primary oversight of the AMT program; however, the criteria for Exam referrals are developed in conjunction with Exam.

AMT Approval Path
  1. Referral criteria are outlined in SP IRM 3.12.3, Individual Income Tax Returns. The IRM is reviewed by senior management and signed off by executives.

Informant Referrals
  1. The Informant Referrals are taxpayer-generated. Taxpayers submit Form 3949-A, Information Referral, or informant letters to the IRS to report suspected/perceived tax law violations by other taxpayers. These referrals are received in Exam and are classified by either tax compliance officers (TCOs) or tax analysts for a determination of audit potential.

Process to Create/Modify Informant Referrals Criteria
  1. W&I RCPM Headquarters, SB/SE CCS, and SP Headquarters analysts coordinate to ensure the Informant Referral Program is working as intended. Bi-monthly conference calls are held to discuss any issues that may arise. The RCEO Planning and Analysis (P&A) Staff performs a product review of classified referrals to ensure quality. In addition, procedures are evaluated to identify training needs. The results are sent to Examination Headquarters quarterly. Written feedback for W&I campus examination is shared with the Examination Operations Manager. In addition, feedback is provided to SP as appropriate to assist in improving the screening process. Consensus of all impacted stakeholders is needed prior to submitting any revisions to the selection criteria to the executives for approval.

Informant Referrals Approval Path
  1. All modifications are recommended with the agreement of senior managers and Headquarters analysts for both W&I and SB/SE Campus Exam and are vetted with the Submission Processing analysts prior to recommendations being submitted for executive approval.

Refundable Credits Return Preparer Strategy (RCRPS)
  1. RCRPS returns are referred to Examination as part of the EITC Return Preparer Program. W&I Operations Support (WIOS), in conjunction with Refundable Credits Administration (RCA), selects for audit client returns of those preparers who file high volumes of potentially erroneous EITC returns. These audits are conducted in a post-refund environment. Returns selected for audit have already been run through DDB rules and scored. The cases are assigned tracking code 6472 to monitor the outcome of the audits.

Process to Create/Modify RCRPS Selection Criteria
  1. The selection criteria used for this inventory are embedded in the DDB programming detailed in IRM 4.1.26.3.1, Dependent Database (DDB). Changes made to the DDB criteria will affect this inventory.

RCRPS Approval Path
  1. Changes to the selection criteria are approved as part of the DDB rules process.

Erroneous Refund Referrals
  1. The Erroneous Refund Program in Correspondence Examination involves cases where incorrect refunds were issued to taxpayers for a variety of reasons. IRM 21.4.5, Erroneous Refunds, contains the referral process for employees within a co-located campus function. The potential for erroneous refunds may occur in the following situations:

    • misapplied payments

    • incorrect tax adjustments/assessments

    • incorrect credit refunds

    • taxpayers filing fraudulent returns

    • taxpayers using incorrect TINs

  2. Erroneous refunds are generally classified as either:

    • Assessable - Requires a recalculation of tax liability.

    • Un-assessable - No requirement for a recalculation of tax liability.

  3. The correspondence examination program involves assessable erroneous refunds. An erroneous refund is defined as "the receipt of any money from the Service to which the recipient is not entitled." This definition includes all erroneous refunds regardless of taxpayer intent or whether the error that caused the erroneous refund was made by the IRS, the taxpayer, or a third party. The adjustment to the tax liability or recapture of a refundable credit will require Statutory Notice of Deficiency procedures requiring Exam involvement.

Claims

  1. A taxpayer can make a change to an originally filed Form 1040, U.S. Individual Income Tax Return, using Form 1040-X, Amended U.S. Individual Income Tax Return. A taxpayer can change income, exemptions, deductions, credits, filing status, etc., reported on the original tax return, including claiming tax credits and deductions that were not previously claimed. Under IRC 6511, Limitations on Credit or Refund, the general rule is that a claim for refund must be filed within three years from the time the original tax return was filed or two years from the time the tax was paid, whichever is later.
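
    As a rough illustration of the general IRC 6511 timing rule cited above, the claim deadline is the later of three years from the date the original return was filed or two years from the date the tax was paid. The sketch below computes that date for simple inputs only; it ignores extensions, lookback limitations, and the statutory exceptions, so it is not a substitute for the statute or related IRM procedures.

      # Minimal sketch of the general IRC 6511 claim-filing window:
      # later of (3 years from filing) or (2 years from payment).
      # Ignores extensions, lookback rules, and statutory exceptions.
      from datetime import date

      def add_years(d: date, years: int) -> date:
          """Add whole years, rolling Feb 29 to Feb 28 when needed."""
          try:
              return d.replace(year=d.year + years)
          except ValueError:
              return d.replace(year=d.year + years, day=28)

      def general_claim_deadline(return_filed: date, tax_paid: date) -> date:
          return max(add_years(return_filed, 3), add_years(tax_paid, 2))

      if __name__ == "__main__":
          print(general_claim_deadline(date(2021, 4, 15), date(2021, 4, 15)))  # 2024-04-15
          print(general_claim_deadline(date(2021, 4, 15), date(2023, 6, 1)))   # 2025-06-01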

Category A (CAT A)
  1. In general, CAT A claim criteria can be set and modified by W&I, SB/SE, LB&I, or Customer Account Services (CAS), but the work on the referrals is handled in the campus environment and is coordinated by Accounts Management (AM). The criteria are changed periodically. When claims come into CAS and meet CAT A criteria, AM refers the claims electronically to the Exam Classification teams within the campuses to have the claims evaluated using established criteria. Each claim is either accepted as filed, rejected (not CAT A), selected for office or field exam, selected for Correspondence Exam, or non-considered, which requires the taxpayer to provide further information for the claim to be adequately evaluated.

Process to Create/Modify CAT A Criteria
  1. When the Exam, AM or SP functions identify the need for a possible change to a specific issue or tolerance, or when another function such as TAS elevates an issue to Exam, AM or SP Headquarters for consideration, the Headquarters functions meet to discuss the business reasons and customer impact for the potential change.

CAT A Approval Path
  1. If Exam Headquarters analysts find that a change to the criteria appears warranted, a discussion and consultation with all affected stakeholders will be held. If the nature of the change requires approval from Exam leadership, the Exam analysts discuss the proposed changes and alternatives with their respective executives and secure written approval or advice. If approved by Exam leadership from both W&I and SB/SE, Exam Headquarters analysts advise the responsible AM Headquarters analyst what changes should be made to the Exam CAT A criteria. The AM Headquarters analyst prepares an IRM Procedural Update (IPU) and requests the concurrence of analysts from both W&I and SB/SE Exam Headquarters. After receiving concurrence, the AM Headquarters analyst posts the IPU to SERP and updates the SERP IRM.

DDBCK
  1. ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡ ≡

Process to Create/Modify DDBCK Criteria
  1. W&I RICS Headquarters (HQ) analysts and AM HQ analysts coordinate to ensure the program is working as intended. When the Exam, AM or SP functions identify the need for a possible change to a specific issue or tolerance, or when another function such as TAS elevates an issue to Exam, AM or SP Headquarters for consideration, the functions meet to discuss the business reasons and customer impact for the potential change.

DDBCK Approval Path
  1. All modifications are recommended with agreement of senior managers and analysts for both W&I and SB/SE Campus Exam and vetted with the Accounts Management analysts prior to recommendations being submitted for executive approval.

Duplicate Dependent Taxpayer Identification Number (DupTIN)

  1. The DupTIN program identifies returns where a TIN is used more than once for a dependent exemption in the same filing year. During the filing of the original tax return, an entry is recorded on the DupTIN database every time a TIN is used as a Primary, Secondary, Dependent, EIC qualifying child, etc. The first use of the TIN is not flagged. Second and subsequent uses of the TIN are flagged. Treatment logic is then applied to identify the duplicated taxpayers.

  2. Generally, taxpayers that have duplicated a TIN for more than one year are considered for audit. Identified DupTIN audit inventory is sorted by tax changes and business rules. Exam issues a notice to the taxpayer requesting documentation to verify the qualifying dependent(s). If, after the audit process, a determination is made to no-change the case, the related taxpayer who also duplicated the use of the dependent TIN is opened for audit.
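
    The core DupTIN identification step described above is a duplicate-detection pass over dependent TIN usage within a filing year: the first use of a TIN is not flagged, while second and subsequent uses are. The sketch below illustrates only that flagging logic; the record layout is hypothetical, and the treatment logic actually applied to duplicated taxpayers is not reflected here.

      # Minimal sketch of flagging second and subsequent uses of a dependent TIN
      # within the same filing year. The record layout is hypothetical.
      from collections import defaultdict
      from typing import Dict, List, Tuple

      def flag_duplicate_tin_uses(usages: List[Dict]) -> List[Dict]:
          """
          Each usage record notes the return (primary TIN) that used a dependent TIN
          and the filing year. The first use of a (year, dependent TIN) pair is not
          flagged; later uses are.
          """
          seen: Dict[Tuple[int, str], int] = defaultdict(int)
          flagged = []
          # Process in received order; in practice ordering would follow posting cycles.
          for rec in usages:
              key = (rec["filing_year"], rec["dependent_tin"])
              seen[key] += 1
              if seen[key] > 1:
                  flagged.append(rec)
          return flagged

      if __name__ == "__main__":
          usage_records = [
              {"filing_year": 2023, "dependent_tin": "XXX-XX-1111", "return_tin": "XXX-XX-2222"},
              {"filing_year": 2023, "dependent_tin": "XXX-XX-1111", "return_tin": "XXX-XX-3333"},  # flagged
              {"filing_year": 2023, "dependent_tin": "XXX-XX-4444", "return_tin": "XXX-XX-2222"},
          ]
          for rec in flag_duplicate_tin_uses(usage_records):
              print("flag:", rec["return_tin"], "duplicated", rec["dependent_tin"])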

Process to Change/Modify DupTIN Criteria
  1. SB/SE Campus Exam or W&I Examination can identify a need to change the treatment logic. Proposals originating in either BOD will be vetted by the assigned DupTIN Analysts and first line managers in both BODs before being submitted to the SB/SE CCS Program Manager and the RCPM Senior Manager for approval. IT programmers can be involved at this stage, as necessary, to assist in system capability issues.

DupTIN Approval Path
  1. All modifications are recommended with the agreement of senior managers and analysts for both W&I and SB/SE Campus Exam and vetted with the IT analysts prior to recommendations being submitted for executive approval. The IT programmer is notified by e-mail outlining the recommendation for implementation. The IT programmer emails confirmation of the implementation of the change. The DupTIN Analyst updates the DupTIN overview document to include the change.

Miscellaneous Identified Inventory

  1. Inventory for audit is occasionally identified through resources and referrals that are not part of the normal inventory selection processes. Inventory such as this may carry varied selection criteria and filtering requirements specific to that case type. By its nature, this is often unplanned work and coordination between Examination and the work plan staff is required before audits are started. Typically, potential cases are further vetted by the Exam Headquarters analysts. An agreement to process audits on this type of inventory rests with Examination Headquarters management. Examples of these miscellaneous inventories are listed in the subsections below.

Treasury Inspector General for Tax Administration Related Inventory
  1. TIGTA conducts investigations and, following established TIGTA audit protocol, can refer lists of taxpayers for potential audit to SB/SE CCS, RCPM, or RCEO. Typically, additional research is conducted to determine if Examination agrees the returns should be examined. If SB/SE CCS agrees with the recommendation, CCS will create internal filters that mirror the recommendation in accordance with the Planned Corrective Action. As determined by Headquarters Exam senior management in the receiving BOD, in coordination with the workload planning staff, TIGTA-referred cases are given priority for delivery. The referrals can involve any filed return issue and can involve multiple years, preparers, and issue amounts.

Audit Reconsideration
  1. Audit Reconsideration is the process the IRS uses to reevaluate the results of a prior audit where additional tax was assessed and remains unpaid, or a tax credit was reversed when the taxpayer disagrees with the original determination. The taxpayer must provide information that was not previously considered during the original examination to be eligible for audit reconsideration. See IRM 4.13, Audit Reconsideration, for more information.

Other Year Returns
  1. Subsequent Year Return Program - Each quarter, SB/SE CCS conducts an analysis of select audits closed as either a Default or Agreed to determine if the subsequent year has audit potential for the same project code. SB/SE CWI analysts gather their assigned project codes’ closure data from ACIS and send it to CWD for preliminary exclusion filtering. The CWI analysts then classify the remaining inventory and introduce viable cases into the work stream at the next scheduled delivery. The inventory carries tracking code 0900.

  2. This process is independent of the subsequent or prior year selections and whipsaw case selections carried out by campus examiners in both SB/SE and W&I, which are specified in each project code’s IRM section.

Filtering and Pre-Classification Process Performed by SB/SE Campus Workload Delivery (CWD)

  1. Most work in SB/SE Campus Examination is processed through the CWD team. The CWD team gathers pre-classification account and tax return information and applies standard exclusion filters (see Exhibit 4.1.26-1) before establishing AIMS. CWD is an intermediary that processes high volumes of work through automated, systemic, and manual applications. The main filtering application is the GII-EITCRA. The EITCRA is an IDRS-based application that gathers pre-determined tax return line items, entity information, and Information Return Processing (IRP) data.

  2. The CWI analyst sends batches of potential audit cases to CWD. CWD gathers IDRS data as needed for the project code being worked. Each project will have specific criteria, provided by CWI, to which the cases must be compared. CWD gathers the data, sorts the information into categorized spreadsheet tabs, and, if required for the project, runs cases through a tax template to compute potential account changes. CWD also screens and categorizes the potential cases as workable (able to go forward with audit opening), unworkable, or needing manual review by the CWI analyst, as illustrated in the sketch following this list.

  3. CWI makes the final determination of which cases are submitted to the work stream. After screening and pre-classification are completed by CWD, CWI analysts perform final classification. Depending on the project code requirements, the specific classification criteria will vary. The CWI analyst sends the selected cases to the CWD team to establish AIMS at the designated campuses. CWI analysts maintain listings of classification, filtering, and selection outcomes for their assigned programs. Any cases flagged as Manual Screening by CWD are verified and retained for possible future action by the CWI analyst. These cases may be examined, but only after the condition for which they were excluded is resolved.

  4. While data-gathering and filtering is performed by CWD on almost all Non-EITC project codes, certain Math Error cases and small-volume referrals do not have enough continuous volume to justify CCS involvement and are worked by the local campus examiners.

  5. Inventory selected by DEBR is not processed through CWD for data-gathering or exclusion filtering. Classification, ranking of cases, and application of exclusion filters are automated. DEBR inventory does not go to CWD to assign AIMS; this process is performed by IT. DEBR-identified inventory information regarding filtering, ranking, and AIMS opening results is accessible via Business Objects.
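
    The CWD screening step described in (2) above effectively sorts each potential case into one of three buckets: workable, unworkable, or needing manual review by the CWI analyst. The sketch below shows that general screening pattern; the exclusion conditions are hypothetical placeholders and are not the standard exclusion filters listed in Exhibit 4.1.26-1.

      # Minimal sketch of the workable / unworkable / manual-review screening pattern.
      # The screening conditions are hypothetical placeholders.
      from typing import Dict, List

      def screen_case(case: Dict) -> str:
          """Categorize a single potential case for delivery."""
          # Hypothetical exclusion conditions (actual filters are in Exhibit 4.1.26-1).
          if case["open_exam_indicator"] or case["deceased_indicator"]:
              return "unworkable"
          # Conditions a filter cannot resolve on its own go to an analyst.
          if case["conflicting_account_data"]:
              return "manual_review"
          return "workable"

      def screen_batch(cases: List[Dict]) -> Dict[str, List[Dict]]:
          buckets: Dict[str, List[Dict]] = {"workable": [], "unworkable": [], "manual_review": []}
          for case in cases:
              buckets[screen_case(case)].append(case)
          return buckets

      if __name__ == "__main__":
          batch = [
              {"tin": "XXX-XX-0001", "open_exam_indicator": False, "deceased_indicator": False, "conflicting_account_data": False},
              {"tin": "XXX-XX-0002", "open_exam_indicator": True,  "deceased_indicator": False, "conflicting_account_data": False},
              {"tin": "XXX-XX-0003", "open_exam_indicator": False, "deceased_indicator": False, "conflicting_account_data": True},
          ]
          results = screen_batch(batch)
          print({k: len(v) for k, v in results.items()})  # {'workable': 1, 'unworkable': 1, 'manual_review': 1}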

Delivery Process

  1. W&I Delivery process: Most of the workload delivery is systemic. Inventory which has been identified for opening in each campus exam operation will be created on the Audit Information Management System (AIMS) and be automatically introduced into the Report Generation Software (RGS).

  2. Inventory for W&I is typically delivered for processing in two ways: either through Automated Correspondence Examination (ACE) or UCP. SB/SE inventory is delivered first through UCP.

    1. Automated Correspondence Examination (ACE) - This process will systemically open a case on RGS, issue the Initial Contact Letter (ICL), and update the case on AIMS into the correct letter status.

    2. Unattended Case Processing (UCP) Tool - This process will hold inventory that has been opened on AIMS until the cases are scheduled to be started. At that point, the inventory is selected and created on RGS for manual opening.

  3. SB/SE Delivery Process: For work processed through CWD, the CWI analyst transfers the needed files (typically Excel spreadsheets) to CWD and requests CWD to establish AIMS in specified campuses/Employee Group Codes (EGC). If the cases are to be started in conjunction with AIMS establishment, or shortly thereafter, this can be accompanied by a delivery sheet. If cases being delivered are to be started at a later date, delivery sheets will be sent closer to the scheduled start date, per local procedures.

    1. Delivery sheets are the methods used to inform CWD, Examination Workload Planning, CCS Management, and campus contacts of the details of a project start. They contain information such as the start schedule, volumes, and tracking codes.

    2. For work not processed through CWD, such as DEBR, only delivery sheets are sent as described.

  4. SB/SE CCS documents the classification, filtering, and AIMS opening results for each delivery on an Activity Record. This document provides an overview of volumes at each stage of selection or non-selection, who took the action and when, as well as other pertinent delivery information. This document provides CCS Management with hyperlinks to, and/or attachments of, the specific case data used in selection. They are used in CCS managerial reviews of case selection and are maintained in the CCS shared drive.

Examination Opening Process

  1. Inventory is opened utilizing a schedule that ensures it will meet the start plan date to conform to the annual Work Plan. If the work type is programmed for systemic opening through the ACE process, it will be opened when needed to meet the start plan. This inventory cannot be held once it has been submitted for AIMS creation.

    1. Automated Correspondence Examination (ACE) is a multifunctional software application that fully automates the initiation, aging, and closing of certain Earned Income Tax Credit (EITC) and some Non-EITC cases. Using ACE, Correspondence Examination can process specified cases with minimal-to-no tax examiner involvement prior to a taxpayer reply. Please see IRM 4.19.20.1, Automated Correspondence Exam Overview (ACE), for further information.

  2. Some inventory that will be processed through ACE programming is populated first in the UCP Tool. This inventory is scheduled for systemic opening to meet start plan needs.

  3. Workload inventory not programmed for ACE processing is systemically created in the Discretionary Tool for UCP. This inventory can be opened for Examination in one of three ways:

    1. Batch Leveraging Process (BLP) - A joint effort between SB/SE Headquarters and the Ogden Campus Examination that utilizes tax examiners to expedite the input of issues into RGS. If there is a project code that cannot be directly introduced into the ACE system, the BLP team will prepare and mail out the initial contact letter. SB/SE Headquarters will designate the specific issues and adjustments that the BLP tax examiner needs to input into RGS. Once the cases have been manually created by the BLP team, the cases can be introduced into the ACE for automated processing.

    2. Filer Bridge - This process allows batches of case work to be introduced non-traditionally into the ACE system. The project code issues are created in a tickler file that can be introduced into ACE.

    3. Manual Starts - The system will only create the work center files and inventory record in RGS. These cases must be manually worked on RGS/Correspondence Examination Automation Support (CEAS) by a tax examiner. The campus tax examiner must input all relevant project code issues, then print and mail to the taxpayer the required letters, forms, and reports directly associated with the examination.

      Note:

      Non-Filer inventory is not created in RGS using either ACE or the Discretionary Tool programming. This inventory is created using the Non-Filer Bridge. This is a process by which the SB/SE CWD team creates a tickler file that can be introduced into ACE. This allows Non-Filer cases to be inserted into ACE without manual input.

Inventory Case Review

  1. As part of W&I Examination’s overall monitoring strategy, there are three reviews that are conducted at the W&I Headquarters level:

    1. Rule/Filter validation reviews - These reviews are done to verify that the rule used to select the inventory is working as requested. This review type should be done by the analyst with oversight for implementing rule creation/revisions.

    2. Casework reviews - These reviews are done to validate that the campus examiners are working the inventory accurately and applying the tax law appropriately. The Analyst with oversight of the program should complete this review type.

    3. Review of Program Operation and Performance - Results of audits are reviewed by RCPM as directed by the Exam Policy and Coordinator (EPC) Program manager. Reviews are performed to gauge program effectiveness and Correspondence Exam Technicians’ adherence to IRM procedures. Results are shared with RCEO management and, as appropriate, can be referred to RCPM EPC for issuance of alerts, feedback on training, or for IRM revisions.
      Additionally, system reviews are conducted to ensure case selection systems, such as DDB, are performing as prescribed in the system documentation. These reviews are typically done by the analyst with oversight of rule creation/revisions. Resolving or mitigating any defects found is a coordinated effort between the RCPM analyst and DDB IT programmers.
      Documentation of these reviews, including corrective action taken, is provided to the RCPM Director for approval.

  2. As part of SB/SE Examination’s overall monitoring strategy, two inventory review processes are in place in CCS: Review of Program Operation and Performance and Review of Selection/Non-Selection and Survey Determinations.

    1. SB/SE Review of Program Operation and Performance - Results of audits are reviewed in SB/SE CCS as directed by the Program Manager. Reviews are performed to gauge program effectiveness and Examiner adherence to IRM procedures. Results are shared with SB/SE CCS management and, as appropriate, can be referred to SB/SE Examination Policy for issuance of alerts, feedback for training, or IRM revisions. Additionally, system reviews are conducted to ensure case selection systems, such as DEBR, are performing as prescribed in the system documentation. These reviews are typically done by the analyst with oversight of rule creation/revisions. Resolving or mitigating any defects found is a coordinated effort between the CWI analyst and the respective systems analyst(s) or coordinator. Documentation of these reviews, including corrective action taken, is provided to the SB/SE CCS Program Manager for signature and retained.

    2. SB/SE Review of Selection/Non-Selection and Survey Determinations - As directed by the SB/SE CCS Program Manager, the CWI Supervisory Tax Analyst, or other managerial designee (hereafter referred to as the reviewer), conducts reviews to ensure adherence to IRM guidance in selecting or non-selecting cases for audit and in making determinations to survey cases.

    3. Reviews of selection/non-selection - This review is conducted at least once on every project code delivered for campus inventory during the annual start plan period. The schedule and sample size, including any supplemental or follow-up reviews, are determined by the SB/SE CCS Program Manager.
      CWI Analysts are required to prepare an activity record for each group of cases delivered to the campus inventories. The activity record contains:
      • The volume of potential cases that met pre-established filters.
      • The research and filtering actions taken on that pool of cases, along with which CCS/CWI function took the action.
      • Activity dates and volumes remaining in the pool at each stage of the selection process.
      • Hyperlinks to the data files that contain the considered cases and their relevant research results.
      The reviewer uses the activity record and data file to determine whether the actions adhered to the established selection criteria appropriate for the individual program and met the requirements to select cases with integrity and fairness as outlined in IRM 4.1.26.2, Cases Selected for Examination. Review points include, but are not limited to, proper application of exclusion filters, appropriate and accurate sorting/ranking of cases, and proper documentation of the select/non-select process. The review documents, including actions taken to remediate defects found, are submitted to the SB/SE CCS Program Manager for signature. Signed reviews are maintained by CCS. (An illustrative sketch of such an activity record appears after this list.)

    4. Reviews of survey determinations in CCS - CCS typically classifies more inventory than is called for in the annual start plan to ensure that there are sufficient cases available to meet the plan and provide flexibility should a plan change occur. This can result in excess inventory at the end of the plan period. This excess inventory is closed as non-examined, also known as surveyed. As directed by the SB/SE CCS Program Manager, the CWI Supervisory Tax Analyst, or other managerial designee (hereafter referred to as the reviewer), conducts reviews of surveyed cases. The review of surveys is conducted on an as-needed basis, i.e., when there is a need to bulk-survey cases. This is typically after the close of the annual start plan period.
      At the direction of the CWI Manager, a periodic review is performed to determine whether there is a need to purge the systems of unstarted, unneeded inventory. The review documents, at a minimum, the project code(s), volume, tax year(s), and campus assignment of the inventory being proposed as “excess”. The proposal to survey is then provided to the designated reviewer. The CWI Manager or CCS Program Manager approves the survey of excess inventory and retains documentation of the approval.

      Note:

      The process of surveying inventory at the CCS level is not to be confused with determining to survey individual cases in the campuses. Campus survey procedures, including requirements for managerial review, are found in their respective IRMs, primarily IRM 4.4.21, Non-Examined Closures and Deleting AIMS Records.

    5. SB/SE Review Requirements - As part of the Program Review process, the Campus Case Selection Program Manager (or designee) ensures the CWI and CWD Managers’ reviews adhere to the examination case selection policy.
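
      This IRM describes the required content of the CWI activity record but does not prescribe a storage format. Purely as a minimal sketch, assuming hypothetical class and field names that are not the prescribed CWI format, the record described above could be represented as a simple data structure such as the following:

        # Minimal illustrative sketch of an activity record -- class and field names
        # are hypothetical, not the prescribed CWI format.
        from dataclasses import dataclass, field
        from datetime import date

        @dataclass
        class FilteringAction:
            description: str        # research or filtering action taken on the pool
            performed_by: str       # CCS/CWI function that took the action
            action_date: date       # date of the action
            volume_remaining: int   # cases remaining in the pool after this stage

        @dataclass
        class ActivityRecord:
            project_code: str
            potential_case_volume: int                    # cases meeting pre-established filters
            actions: list = field(default_factory=list)   # FilteringAction entries, in order
            data_file_links: list = field(default_factory=list)  # hyperlinks to data files

        # Example usage (placeholder values)
        record = ActivityRecord(project_code="0263", potential_case_volume=12500)
        record.actions.append(FilteringAction("Applied unworkable freeze code filter",
                                              "CWI", date(2024, 1, 15), 9800))
        record.data_file_links.append(r"\\share\cwi\0263_filtered_cases.xlsx")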

Monitoring and Support - W&I

  1. Refundable Credits Program Management: Program Management is responsible for providing oversight and monitoring the performance results of the EITC and Non-EITC programs at the Refundable Credits Examination Operations (RCEO) campuses. See IRM 1.1.13.5.5.1, Program Management (PM), for PM’s mission and a full description of how they accomplish their goals.

Monitoring and Support - SB/SE

  1. SB/SE Examination Performance Planning and Analysis provides support and monitors program results of the SB/SE campuses. See IRM 1.1.16.5.4, Performance Planning and Analysis.

Small Business/Self-Employed, Campus Case Selection Inventory Filter Sheet

Exclusion Filter Data – Gii Spreadsheets & Command Codes

  • ACCTIMF Spreadsheet: MFREQC, ENMOD, INOLES, AMDISA, IMFOLT, IMFOLR

  • RTRNIMF Spreadsheet: RTVUE

  • NFIM1YR Spreadsheet: NFIMF_1YR (IRPTRL data; only when requested by the analyst)

  • TRDBVRTN Spreadsheet: TRDBV (only when requested by the analyst)

Unworkable Filed Inventory Criteria

  • Duplicate TIN

  • Open AIMS

  • Deceased taxpayer: "DECD" in nameline (Exception: filing status is 2 or 5)

  • Puerto Rico Zip Codes

  • If either of the two prior years has a DC 02 and the project code(s) examined in those years are the same as the project code(s) being filtered (do this step after filtering for unworkable Transaction Codes and Freeze Codes to avoid running unnecessary TINs through IMFOLT; see the sketch following this list)

  • Filed Form 1040NR

  • Less than one year remaining on the Assessment Statute Expiration Date (ASED)
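
  The prior-year DC 02 check referenced in the list above combines two conditions. The following is a minimal illustrative sketch, assuming hypothetical data structures and field names (disposal_code, project_codes); it is not the production filter logic.

    # Illustrative sketch of the prior-year DC 02 exclusion check -- data structures
    # and field names are hypothetical, not the production filter.
    def has_prior_dc02_same_project(prior_exams, current_project_codes):
        """Return True (unworkable) if either of the two prior years closed with
        Disposal Code 02 under any of the project codes now being filtered."""
        for exam in prior_exams:  # each exam: {"disposal_code": "02", "project_codes": [...]}
            if exam.get("disposal_code") == "02" and \
                    set(exam.get("project_codes", [])) & set(current_project_codes):
                return True
        return False

    # Example: prior-year DC 02 under project code 0263, which is also being filtered now
    prior_exams = [{"disposal_code": "02", "project_codes": ["0263"]},
                   {"disposal_code": "03", "project_codes": ["0120"]}]
    print(has_prior_dc02_same_project(prior_exams, ["0263"]))  # True -> exclude as unworkable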

Unworkable Transaction Codes

  • Missing TC 150

  • TC 300/301

  • TC 420/421/424/425

  • TC 540

Unworkable Freeze Code

  • -L, Open Audit Indicator

  • -C, Combat Zone Indicator

  • F-, Frivolous Filer Indicator

  • -T, Entity Freeze

  • -V, Bankruptcy Freeze

  • -W, Litigation Pending Freeze

  • -Z, Criminal Investigation Freeze

  • Z-, Refund Schemes

  • -Y, Offer in Compromise

Manual Screening

  • Under Age 15

  • TC 290/291 (Except TC 290 .00 with TC 766/767 and CRN 256, 257, 259, or 338; all three criteria must be present to leave the case as workable - see the sketch after this list)

  • TC 976/977

  • Freeze: -O

  • Freeze: -A

  • TC 922 (Except process codes 16, 21, 22, 26, 27, or 29)
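
  The TC 290/291 screening exception above requires three conditions to be met together. As a hedged illustration only, with hypothetical parameter names standing in for the transcript data, the exception could be expressed as follows:

    # Illustrative sketch of the TC 290 .00 screening exception -- parameter names
    # are hypothetical, not an IDRS/transcript API.
    WORKABLE_CRNS = {256, 257, 259, 338}

    def tc290_exception_applies(tc290_amount, has_tc766_or_767, crn):
        """Return True only when all three criteria are met: a TC 290 for .00,
        an accompanying TC 766/767, and a CRN of 256, 257, 259, or 338."""
        return tc290_amount == 0 and has_tc766_or_767 and crn in WORKABLE_CRNS

    # Example: TC 290 .00 with TC 766 and CRN 257 -> leave the case as workable
    print(tc290_exception_applies(0, True, 257))    # True
    # Example: TC 290 with an assessed amount -> route to manual screening
    print(tc290_exception_applies(150, True, 257))  # False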

Other Instructions

Provide the IMFOLR_TPI and IMFOLR_NEW_TPI_CLASS columns for General Filters at the beginning of the workable spreadsheet.

Highlight any workable cases with Total Positive Income (TPI) greater than $4,999,999.
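
These exclusion and highlighting steps are normally performed on the Gii spreadsheet data. Purely as an illustration, the following minimal sketch assumes the case data has been loaded into a pandas DataFrame with hypothetical column names (TRANSACTION_CODES, FREEZE_CODES, IMFOLR_TPI) and shows how the unworkable transaction code and freeze code checks and the TPI highlight flag could be applied; it is not the production filter programming.

  # Illustrative sketch only -- column names are hypothetical and do not reflect
  # the actual Gii spreadsheet layout.
  import pandas as pd

  UNWORKABLE_TCS = {"300", "301", "420", "421", "424", "425", "540"}
  UNWORKABLE_FREEZES = {"-L", "-C", "F-", "-T", "-V", "-W", "-Z", "Z-", "-Y"}

  cases = pd.DataFrame({
      "TIN": ["000-00-0001", "000-00-0002", "000-00-0003"],
      "TRANSACTION_CODES": [["150"], ["150", "420"], ["150"]],
      "FREEZE_CODES": [[], [], ["-V"]],
      "IMFOLR_TPI": [6200000, 120000, 85000],
  })

  def is_unworkable(row):
      """Flag a case as unworkable if it is missing TC 150 or carries any
      unworkable transaction code or freeze code."""
      tcs = set(row["TRANSACTION_CODES"])
      return ("150" not in tcs
              or bool(tcs & UNWORKABLE_TCS)
              or bool(set(row["FREEZE_CODES"]) & UNWORKABLE_FREEZES))

  workable = cases[~cases.apply(is_unworkable, axis=1)].copy()
  # Flag workable cases with TPI over $4,999,999 for highlighting in the spreadsheet.
  workable["HIGHLIGHT_TPI"] = workable["IMFOLR_TPI"] > 4_999_999
  print(workable[["TIN", "IMFOLR_TPI", "HIGHLIGHT_TPI"]])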