4.2.8 Guidelines for SB/SE National Quality Review

Manual Transmittal

July 12, 2022

Purpose

(1) This transmits a revision to IRM 4.2.8, Examining Process, General Examining Procedures, Guidelines for SB/SE National Quality Review.

Material Changes

(1) Editorial changes were made throughout this IRM to add clarity, readability, and to eliminate redundancies. Website addresses, legal references, and IRM references were reviewed and updated as necessary.

(2) Significant changes to this IRM are reflected in the table below:

IRM | Description of Change
4.2.8.1, Program Scope and Objectives | Updated IRM citations at 4.2.8.1(4) and 4.2.8.1(5); added a new IRM citation at 4.2.8.1(8)
4.2.8.1.1, Background | Added new content at 4.2.8.1.1(5); moved former 4.2.8.1.1(5) to 4.2.8.1.1(6) and updated the content and hyperlink; moved former 4.2.8.1.1(6) to 4.2.8.1.1(7) and updated the IRM citation
4.2.8.1.2, Authority | Deleted old content at 4.2.8.1.2(1) and replaced it with a new, updated citation at 4.2.8.1.2(2); deleted content at 4.2.8.1.2(3) and 4.2.8.1.2(4) and replaced it with new TBOR language at 4.2.8.1.2(3), as requested by Chief Counsel
4.2.8.1.3.2, Quality Analyst Responsibilities | Deleted content at 4.2.8.1.3.2(2), as these duties are now the responsibility of Specialty Exam
4.2.8.1.5, Program Controls | Updated the content and hyperlink at 4.2.8.1.5(2), and added clarifying language at 4.2.8.1.5(3)
4.2.8.1.6, Terms and Acronyms | Modified two terms, SB/SE Field Exam and SB/SE Specialty Exam, and their definitions for clarity
4.2.8.1.7, Related Resources | Added a new URL at 4.2.8.1.7(4)
4.2.8.2.1, Quality Attributes | Modified the language in the last bullet point at 4.2.8.2.1(1) and added content at 4.2.8.2.1(3) for clarity
4.2.8.3.1, Review of Electronic Case File | Updated content at 4.2.8.3.1(1) and added a note at 4.2.8.3.1(3) for clarity
4.2.8.5, Specialty Exam Case Sampling Criteria | Added content at 4.2.8.5(2) to exclude Form 1041 cases
4.2.8.6, National Quality Review Case Selection Procedures | Added the word "overview" to the title for clarity, deleted outdated content found in 4.2.8.6(3), and moved the content at 4.2.8.6(4) to new 4.2.8.6.3
4.2.8.6.1, Unagreed Appeals Case Selection Procedures | Moved content to 4.2.8.6.4 and added new procedures relating to electronic case files
4.2.8.6.2, Defaulted Case Selection Procedures | Moved content to 4.2.8.6.5 and added new procedures relating to defaulted electronic case files
4.2.8.6.3, Shipping Sample Select Cases | Moved content to 4.2.8.6.6 and updated the content for clarity
4.2.8.6.4, Sample Select Control Procedures | Moved content to 4.2.8.6.7
NEW 4.2.8.6.1, Field and Office Exam Case Selection Procedures | Added new procedures for physical and electronic cases selected for review
NEW 4.2.8.6.2, Employment, Estate and Gift and Excise Case Selection Procedures | Added new procedures for physical and electronic cases selected for review
4.2.8.9, Case Return Criteria | Updated the title to Field Exam Case Return Criteria to reflect that the guidance applies only to field and office exam case returns
NEW 4.2.8.9.1, Specialty Exam Case Return Criteria | Added content covering case return procedures for Employment, Estate and Gift, and Excise cases that meet the return criteria

Effect on Other Documents

This material supersedes IRM 4.2.8, dated October 6, 2020.

Audience

Small Business/Self-Employed (SB/SE) Field and Specialty Exam Employees.

Effective Date

(07-12-2022)

Garrett Gluth
Director, Exam Quality and Technical Support
Small Business/Self-Employed

Program Scope and Objectives

  1. General Overview. Field and Specialty Exam Quality (FSEQ) supports the Small Business/Self-Employed (SB/SE) quality improvement program by providing an assessment of the quality of Field and Specialty Examination case work.

  2. Purpose. This IRM section contains general information and procedural guidance relating to the SB/SE Field and Specialty Exam National Quality Review program.

  3. Audience. The audience is employees and management officials in FSEQ as well as SB/SE stakeholders.

  4. Policy Owner. The Director, Exam Quality and Technical Support (EQ&TS), is responsible for the policies related to the National Quality Review program. Refer to IRM 1.1.16.5.5.4, Exam Quality and Technical Support, for more information.

  5. Program Owner. The Program Manager, FSEQ, is responsible for overseeing the National Quality Review program. Refer to IRM 1.1.16.5.5.4.5, Field and Specialty Exam Quality, for more information.

  6. Program Goals. The goal of the National Quality Review program is to provide a practical and accurate method of assessing organizational performance in support of the balanced measures.

  7. Primary Stakeholder. The primary stakeholder is the Director, Examination, SB/SE. Additional stakeholders are the Directors of:

    • Headquarters Examination

    • Field Examination

    • Field and Campus Policy

    • Specialty Policy

    • Specialty Tax

  8. Contact Information. To recommend changes or make any other suggestions related to this IRM section, see IRM 1.11.6.5, Providing Feedback About an IRM Section - Outside of Clearance.

Background

  1. Embedded Quality (EQ) creates a link between individual performance and organizational goals. This linkage is achieved through a common set of attributes that both national quality reviewers in FSEQ and front-line managers use to evaluate the quality of case work.

  2. EQ reviews focus on whether the examiner took the right actions at the right time while protecting taxpayer rights.

  3. National quality reviewers in FSEQ use the National Quality Review System (NQRS), an automated web-based system, to record results from case reviews for the following programs:

    • Field and Office Examination

    • Bank Secrecy Act (BSA)

    • Employment Tax

    • Estate and Gift Tax

    • Excise Tax

  4. Reports generated from NQRS provide data which may be used to evaluate organizational processes, procedures and successes, and identify areas in need of improvement.

  5. The Quality Knowledge Base contains Examination Quality program information, including Embedded Quality Review System (EQRS) and National Quality Review System (NQRS) guidance.

  6. The Quality Knowledge Base is located at https://portal.ds.irsnet.gov/sites/vl115/pages/default.aspx.

  7. Managers use the Embedded Quality Review System (EQRS) database to evaluate employee performance. For more information regarding front-line manager use of EQRS, see IRM 1.4.40.3.6, Performance Feedback.

    Note:

    NQRS data is never used to evaluate employee performance.

Authority

  1. IRM 1.2.1.2.2, Policy Statement 1-2, Principles of Quality, provides the authoritative basis for the procedures in this IRM.

  2. 26 CFR 801.6(b) states that quality measures focus on whether IRS personnel:

    • Devoted an appropriate amount of time to a matter

    • Properly analyzed the facts of the situation

    • Complied with statutory, regulatory and IRS procedures

    • Took timely actions

    • Provided adequate notification and made required contacts with taxpayers

  3. The Taxpayer Bill of Rights (TBOR) lists rights that already existed in the tax code, putting them in simple language and grouping them into 10 fundamental rights. Employees are responsible for being familiar with and acting in accord with taxpayer rights. See IRC 7803(a)(3), Execution of Duties in Accord with Taxpayer Rights. For additional information about the TBOR, see https://www.irs.gov/taxpayer-bill-of-rights.

Roles and Responsibilities

  1. Listed below are the primary roles and responsibilities of the FSEQ program manager, quality analysts, front-line managers, and reviewers involved in the quality review process.

Program Manager Responsibilities
  1. The FSEQ program manager's primary responsibilities include:

    • Overseeing and allocating resources for FSEQ

    • Coordinating the development of the annual case review sample plan for FSEQ

    • Ensuring that case review inventory is sufficient for each Field and Specialty Exam Area or program based on the sample plan

    • Monitoring the delivery of the Field and Specialty Exam national sampling plan

    • Coordinating issues relating to interpreting and rating the quality attributes

    • Establishing protocol to measure, monitor, and improve reviewer accuracy and consistency

    • Sharing analysis of NQRS data to aid in organizational improvement and influence quality performance

    • Providing quality review data and/or analysis to internal/external stakeholders on an ad hoc or recurring basis

    • Coordinating with stakeholders in the development of attributes and requirements for quality reviews

    • Providing recommendations to enhance NQRS

Quality Analyst Responsibilities
  1. FSEQ analysts are responsible for:

    • Developing and distributing quality performance reports

    • Developing the annual case review sample plan

    • Reviewing attribute narratives on a regular basis to ensure guidelines are followed

    • Participating in group meetings to promote consistency, including the discussion of specific attributes and case scenarios

    • Developing and clarifying review criteria and procedures to promote consistency

    • Providing quality review data and/or analysis to internal/external stakeholders on an ad hoc or recurring basis

    • Coordinating with stakeholders in their quality improvement initiatives

    • Collaborating with stakeholders in the development of attributes and requirements for quality reviews

    • Working with stakeholders in monitoring and updating job aids, instructional guides and quality review procedures in accordance with IRM and program guidelines

Front-Line Manager Responsibilities
  1. FSEQ front-line manager responsibilities include:

    • Providing guidance for program objectives

    • Ensuring reviewers understand and adhere to program guidelines

    • Ensuring accurate and consistent application of the quality attributes

    • Reviewing attribute narratives on a regular basis to ensure guidelines are followed

    • Critiquing completed reviews on a regular basis and providing meaningful feedback to reinforce expectations for quality case reviews

    • Conducting group meetings to promote consistency, including the discussion of specific attributes and case scenarios

    • Ensuring accuracy of data input

    • Ensuring the sample plan is followed

    • Monitoring sample plan and recommending actions to address imbalances

    • Maintaining instructional guides for national quality reviewers

    • Reviewing and approving case returns that meet the criteria found in IRM 4.2.8.9, Field Exam Case Return Criteria

    • Reviewing and approving rejection of cases that do not meet the case sampling criteria found in IRM 4.2.8.4, Field Exam Case Sampling Criteria, and IRM 4.2.8.5, Specialty Exam Case Sampling Criteria

    • Sharing trends and issues that may have nationwide impact

    • Providing input during the attribute development or update process

Reviewer Responsibilities
  1. FSEQ reviewers' responsibilities include:

    • Evaluating examination case quality by conducting reviews of completed SB/SE Field and Specialty Exam cases

    • Accurately and consistently applying the attributes utilizing the appropriate Job Aid and tools such as the IRM and Internal Revenue Code

    • Completing timely case reviews using the Data Collection Instrument (DCI)

    • Completing timely and accurate input of review data into the NQRS database

    • Identifying the appropriate reason code(s) for each not met attribute rating

    • Writing clear and meaningful attribute narrative comments for each not met attribute rating

    • Elevating potential conflicts in the IRM and the Job Aid for resolution

    • Assisting in data analysis as warranted

Program Reports and Effectiveness

  1. Program reports are available on NQRS by selecting Reports from the main menu screen.

  2. FSEQ also generates quarterly performance reports for stakeholders. These reports provide data to aid in:

    • Establishing baselines to assess program performance

    • Identifying quality strengths and weaknesses

    • Determining specific training/educational needs

    • Identifying opportunities to improve work processes

    • Measuring the success of quality improvement efforts

  3. An overall quality score serves as the Balanced Measure for Business Results – Quality. This measure is reported to various levels of the organization and to external stakeholders such as Congress.

Program Controls

  1. Access to EQRS and NQRS data and reports is controlled based on the user’s assigned permission level, assigned function, and assigned organization. System coordinators are responsible for assigning users to the appropriate permission level based on the user’s role in the organization. Users are given only the privileges required to perform their jobs and do not have access to security or other functions/features that require elevated privileges.

  2. Access to EQRS and NQRS is through the Business Entitlement Access Request System (BEARS) at https://bears.iam.int.for.irs.gov/home/Index.

  3. EQRS/NQRS systems contain input validation checks to ensure input accuracy and completeness by:

    • Restricting data input to established system parameters to ensure data accuracy

    • Using drop down lists as much as possible to restrict users from typing invalid information

    • Displaying an error message if invalid data is input into the system

    • Requiring data field input before proceeding
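
    Example:

    A minimal sketch, in Python, of the validation pattern described in (3) above. The field names and allowed values are hypothetical illustrations, not actual EQRS/NQRS parameters.

      # Hypothetical field names and values; actual parameters are defined in EQRS/NQRS.
      ALLOWED_VALUES = {"attribute_rating": {"Yes", "No", "N/A"}}  # drop-down style list
      REQUIRED_FIELDS = {"review_id", "attribute_rating"}  # must be entered to proceed

      def validate_input(record: dict) -> list:
          """Return error messages; an empty list means the input is accepted."""
          errors = [f"Required field missing: {f}" for f in REQUIRED_FIELDS - record.keys()]
          for field, allowed in ALLOWED_VALUES.items():
              if field in record and record[field] not in allowed:
                  errors.append(f"Invalid value for {field}: {record[field]!r}")
          return errors

      print(validate_input({"review_id": "R-001", "attribute_rating": "Maybe"}))
      # ["Invalid value for attribute_rating: 'Maybe'"]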

  4. Operations Support, Technology Solutions, Collection Systems provides core information technology management and support services for both EQRS and NQRS, and is responsible for:

    • Ensuring compliance with the Federal Information Security Management Act (FISMA)

    • Managing Unified Work Requests (UWR) for system updates and changes

    • Leading the development of enhanced data and computer security process and controls

Terms and Acronyms

  1. The following table contains commonly used terms and acronyms found in this IRM:

    Term or Acronym - Definition
    BSA - Bank Secrecy Act
    CCP - Centralized Case Processing
    CJE - Critical Job Element
    CEAS - Correspondence Examination Automation Support
    DCI - Data Collection Instrument
    EQ - Embedded Quality
    EQ&TS - Exam Quality & Technical Support
    FSEQ - Field and Specialty Exam Quality
    FEQ - Field Exam Quality, responsible for cases selected for quality review from revenue agents, tax compliance officers, and tax auditors located in Field Examination
    EQRS - Embedded Quality Review System
    ERCS - Examination Returns Control System
    IMS - Issue Management System
    IRM - Internal Revenue Manual
    ITAMS - Information Technology Asset Management System
    NQRS - National Quality Review System
    SB/SE - Small Business/Self-Employed
    SPRG - Specialized Product Review Group
    SEQ - Specialty Exam Quality, responsible for cases selected for quality review from revenue agents, attorneys, revenue officer examiners, and fuel compliance agents located in Specialty Examination
    UWR - Unified Work Request

Related Resources

  1. Field and Specialty Exam job aids are reference tools used by Field and Specialty Exam management and FSEQ review staff to aid in rating the quality attributes in a uniform and consistent manner. Guidelines in the job aids align the EQ concepts to current Field and Specialty Exam procedures. IRM references support each quality attribute.

  2. Headquarters Examination, Examination Field and Campus Policy is responsible for ensuring the consistency of the Field and Office Exam job aids and training materials, along with the IRM and other guidelines.

  3. Headquarters Examination, Specialty Policy is responsible for ensuring the consistency of the Specialty Exam job aids and training materials, along with the IRM and other guidelines.

  4. Links to the job aids may be found on the Quality Knowledge Base which is located at https://portal.ds.irsnet.gov/sites/vl115/pages/default.aspx.

Overview of National Quality Review Process

  1. The quality review process provides data to measure, monitor and improve the quality of work.

  2. Organizational performance is measured by conducting independent case reviews from a statistically valid sample of examination case work.

  3. Specific measurement criteria, referred to as quality attributes, are used to evaluate the quality of case work.

Quality Attributes

  1. Quality attributes address whether:

    • Timely service was provided to the taxpayer

    • Facts of the case were properly analyzed

    • Law was correctly applied

    • Taxpayer rights were protected by following applicable IRS policies and procedures including timeliness, adequacy of notifications, and required contacts with taxpayers

    • Appropriate determination was reached regarding liability for tax

  2. Quality attributes are organized into measurement categories which allow quality data to be generated based on the following criteria:

    • Timeliness - resolving issues in the most efficient manner through proper time utilization and workload management techniques

    • Professionalism - promoting a positive image of the Service by using effective communication techniques

    • Regulatory Accuracy - adhering to statutory/regulatory process requirements

    • Procedural Accuracy - adhering to internal process requirements

  3. Quality attributes can also be organized by the following DCI attribute groups:

    • Planning

    • Income Determination (Field Exam)

    • Investigative/Audit Techniques

    • Timeliness

    • Customer Relations/Professionalism

    • Documentation/Reports

Evaluating and Coding the Attributes

  1. Reviewers evaluate case work utilizing attributes specific to their Specialized Product Review Group (SPRG).

  2. Reviewers rate all attributes that apply to the case being reviewed.

  3. Attribute ratings must be accurate and consistent. Reviewers must strive for consistency in rating similar case actions.

Attribute Scoring System

  1. The scoring system provides for the equal weighting of each attribute. Each attribute is rated as Yes, No, or in some instances Not Applicable.

  2. The quality score is computed as a percentage. The percentage is calculated as total Yes ratings divided by total Yes and No ratings. A total score of 100 percent is possible for each case.
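
    Example:

    A minimal sketch, in Python, of the score computation described in (2). The function name and rating labels are illustrative and are not part of NQRS.

      def quality_score(ratings):
          """Total Yes ratings divided by total Yes and No ratings, as a percentage.
          Not Applicable ratings are excluded from the computation."""
          yes = ratings.count("Yes")
          no = ratings.count("No")
          if yes + no == 0:
              return None  # no rateable attributes
          return 100 * yes / (yes + no)

      # 18 attributes rated Yes, 2 rated No, 1 Not Applicable
      print(quality_score(["Yes"] * 18 + ["No"] * 2 + ["N/A"]))  # 90.0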

Case Review Procedures

  1. The DCI is the principal documentation for the reviewer’s case evaluation and conclusions. A DCI is completed for each case reviewed in NQRS. Reviewers must ensure that all entries on the DCI are accurate and records are not duplicated.

  2. Reviewers will review one case at a time to completion before starting another case review.

  3. Steps in the review process include:

    • Review the physical case file and the electronic file, where applicable

    • Input case review data on the DCI and prepare narratives to explain each not met attribute rating

    • Review the DCI for accuracy and narrative quality

    • Edit the DCI as necessary

    • Complete the case review

Review of Electronic Case File

  1. Either a physical or electronic case may be assigned for review. If a physical case file is assigned for review, the physical case file is the primary source.

  2. Documents found in the electronic case file might not be in the physical case file because they were not printed or were inadvertently removed. If there are indications in the physical case file that electronic documents exist, reviewers should access the electronic file to determine if additional information is available.

  3. Electronic case files for the Excise, Employment and Estate and Gift programs are located on the Issue Management System (IMS) Team Site.

    Note:

    An approved BEARS request is required for access to the IMS Team Site.

  4. Electronic case files for Field Exam are located in Correspondence Examination Automation Support (CEAS).

DCI Header Input Procedures

  1. The first input section of the DCI consists of the header fields, which capture basic case information. The bold header fields are mandatory and must be entered to complete the DCI.

  2. Header information is categorized into four groupings:

    • Review Information - specific information about the review itself

    • Case Information - specific information about the case

    • Process Measures - case actions taken by the examiner that are used to measure the efficiency of the examination process

    • Special Use - special tracking for local or national purposes

  3. Process Measures data may be analyzed in conjunction with the quality attributes. Process Measures data fields capture:

    • Specific tasks performed during the examination

    • How these tasks were completed

    • Key dates

    • Delays in activities

    • Hours associated with the case
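
    Example:

    A minimal sketch, in Python, of how the four header groupings in (2) and the Process Measures fields in (3) might be represented. All field names are hypothetical; the actual DCI fields are defined in NQRS.

      from dataclasses import dataclass
      from datetime import date
      from typing import Optional

      @dataclass
      class DCIHeader:
          # Review Information - specific information about the review itself
          review_date: date
          reviewer_id: str
          # Case Information - specific information about the case
          case_id: str
          program: str  # e.g., "Field Exam" or "Excise"
          # Process Measures - examiner actions used to measure exam efficiency
          first_contact_date: Optional[date] = None
          exam_hours: Optional[float] = None
          # Special Use - special tracking for local or national purposes
          special_use_code: Optional[str] = None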

Reason Code Selection and Writing Guidelines for Attribute Narratives

  1. When a quality attribute is rated not met, at least one reason code, if available, must be selected that supports the not met rating.

  2. The most appropriate reason code should be selected for the error. Multiple reason codes may be selected for multiple errors, if warranted.

  3. A narrative is required, describing the facts, for each not met attribute rating.

  4. Reviewers should contact their manager when "other" is used regularly as a reason code, to determine whether additional reason codes should be added to NQRS.

  5. Reviewer narratives must be thorough, providing clear, concise, and specific descriptions of any errors, offering sufficient detail to allow for specific recommendations for improvement.

  6. Reviewers must avoid using canned statements in their narratives.

  7. Attribute narratives should:

    • Clearly state the facts that resulted in the attribute rating

    • Identify the nature of the error in the first sentence of the narrative

    • Indicate what was not done, not what should have been done

    • Evaluate the case, not the examiner

  8. Reviewers should not include taxpayer-specific or Personally Identifiable Information (PII) data in the narrative comments.
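
    Example:

    A minimal sketch, in Python, of the narrative and reason code rules in (1), (3), and (8). The record layout and PII markers are hypothetical, not an actual NQRS structure.

      def check_not_met_entry(entry: dict) -> list:
          """Flag a missing reason code, a missing narrative, or likely PII."""
          problems = []
          narrative = entry.get("narrative", "")
          if entry.get("rating") == "Not Met":
              if not entry.get("reason_codes"):
                  problems.append("Select at least one reason code supporting the rating.")
              if not narrative.strip():
                  problems.append("Write a narrative describing the facts of the error.")
          if any(marker in narrative for marker in ("SSN", "EIN")):  # crude PII screen
              problems.append("Remove taxpayer-specific or PII data from the narrative.")
          return problems

      print(check_not_met_entry({"rating": "Not Met", "reason_codes": [], "narrative": ""}))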

Field Exam Case Sampling Criteria

  1. The following Field Exam cases are included in the review sample:

    • SB/SE revenue agent and tax compliance officer income tax cases (corporations, partnerships, and individual returns)

    • Agreed, partially agreed, unagreed, no-change, and cases protested to Appeals

    • Secured delinquent returns not accepted as filed

    • Training cases

    • Form 1041, U.S. Income Tax Return for Estates and Trusts; Form 1042, Annual Withholding Tax Return for U.S. Source Income of Foreign Persons; and Form 1120-F, U.S. Income Tax Return of a Foreign Corporation, examined by revenue agents

    • Correspondence cases examined by revenue agents, tax auditors, and tax compliance officers

    • Pre-assessment innocent spouse cases

    • Claims

    • Audit reconsideration cases

    • Employment tax cases if they are closed as related cases to an income tax case (the entire related case package is included)

  2. The following Field Exam cases are excluded from the national quality review sample:

    • Secured delinquent returns accepted as filed

    • Penalty cases not included as part of an examination case

    • Surveyed returns

    • Offer in Compromise (OIC) cases

    • Post-assessment innocent spouse cases

    • Surveyed claim cases (Disposal Code 34)

    • No show/no response cases

    • Protested cases with 395 days or less remaining on the statute of limitations

      Note:

      If the case selected for review is a protested case to Appeals, there must be at least 395 days remaining on the statute of limitations. Appeals policy requires a non-docketed case to have at least 365 days remaining on the statute of limitations as of the date the case is received in Appeals. The additional 30 days is required to allow for the completion of the review and for Technical Services to receive and prepare the case for closing to Appeals.

    • Petitioned cases

    • Cases updated to suspense status

    • Cases updated to group status after 90-day letter issued

    • Cases closed via Form 906, Closing Agreement

    • Specific project codes as determined by Headquarters Examination

Specialty Exam Case Sampling Criteria

  1. The following Specialty Exam cases are included in the review sample:

    • Excise tax

    • Estate and Gift tax

    • Employment tax where there is no related income tax case

    • BSA Title 31 and Form 8300

  2. The following Specialty Exam cases are excluded from the national quality review sample:

    • Secured delinquent returns accepted as filed

    • Penalty cases not included as part of an examination case

    • Surveyed returns

    • Offers in Compromise (OIC) cases

    • Post-assessment innocent spouse cases

    • Surveyed claim cases (Disposal Code 34)

    • No show/no response cases

    • Protested cases with 395 days or less remaining on the statute of limitations

      Note:

      If the case selected for review is a protested case to Appeals, there must be at least 395 days remaining on the statute of limitations. Appeals policy requires a non-docketed case to have at least 365 days remaining on the statute of limitations as of the date the case is received in Appeals. The additional 30 days is required to allow for the completion of the review and for Technical Services to receive and prepare the case for closing to Appeals.

    • Petitioned cases

    • Cases updated to suspense status

    • Cases updated to group status after 90-day letter issued

    • Cases closed via Form 906, Closing Agreement

    • Specific project codes as determined by Headquarters Examination

    • Activity Code 421 (Gift) returns - Form 706GS(D), Generation-Skipping Transfer Tax Return for Distributions, and Form 706GS(T), Generation-Skipping Transfer Tax Return for Terminations

    • Cases worked by Estate and Gift support staff: paraprofessional (Position Code 316) and audit accounting aide (Position Code 301)

    • Estate and Gift returns assigned outside of the Estate and Gift area

    • Excise Form 2290, Heavy Highway Vehicle Use Tax Return

    • Excise returns assigned outside of the Excise area

    • Employment Tax Form 1040 tip cases

    • Form 1041, U.S. Income Tax Return for Estates and Trusts

Overview of National Quality Review Case Selection Procedures

  1. The Examination Returns Control System (ERCS) Sample Review program automates the process of randomly selecting a valid sample of cases meeting the sampling criteria for review.

  2. The sample size is statistically valid at the Field Exam Area level and the Specialty Exam Program level. The annual sample plan is based on projected fiscal year closures for each SB/SE program.

  3. Cases meeting the sample criteria are selected by the ERCS Sample Review program at the designated sample rate for the Field Exam Area and for three of the Specialty Exam Programs (Excise, Employment, Estate and Gift). Cases are subject to the sample at the point they move to Status Code 51 (In transit to Centralized Case Processing) or 21 (In transit to Technical Services). BSA cases are not controlled on ERCS. See IRM 4.2.8.6.3, BSA Case Selection Procedures, for more information.
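
    Example:

    A minimal simulation, in Python, of the random selection described in (3). This is not the ERCS Sample Review program; the 10 percent rate is a placeholder, and actual rates come from the annual sample plan.

      import random

      SAMPLE_RATE = 0.10  # placeholder designated sample rate
      SAMPLE_STATUS_CODES = {51, 21}  # in transit to CCP or Technical Services

      def is_sampled(status_code: int, rng: random.Random) -> bool:
          """A case becomes subject to the sample when it moves to Status Code 51 or 21."""
          return status_code in SAMPLE_STATUS_CODES and rng.random() < SAMPLE_RATE

      rng = random.Random(0)  # seeded so the illustration is reproducible
      selected = sum(is_sampled(51, rng) for _ in range(1000))
      print(f"{selected} of 1000 closures selected for review")  # roughly 100 expected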

Field and Office Exam Case Selection Procedures

  1. For physical case files selected for review, after the case has been processed by CCP and updated to Status Code 90 (Closed), CCP will ship the physical file to Field Exam Quality (FEQ) support staff. Upon receipt from CCP, the FEQ support staff will enter the case into the unassigned inventory for assignment to a reviewer.

  2. For electronic case files selected for review, after the case has been processed by CCP and updated to Status Code 90, the FEQ support staff will enter the case into the unassigned inventory for assignment to a reviewer.

Employment, Estate and Gift and Excise Case Selection Procedures

  1. For physical case files selected for review, after the case has been processed by CCP and updated to Status Code 90, CCP will ship the physical file to Specialty Exam Quality (SEQ) support staff. Upon receipt from CCP, the SEQ support staff will enter the case into the unassigned inventory for assignment to a reviewer.

  2. For electronic case files selected for review, after the case has been processed by CCP and updated to Status Code 90, the SEQ support staff will enter the case into the unassigned inventory for assignment to a reviewer.

BSA Case Selection Procedures

  1. Form 8300 cases are selected from the weekly extract of closed cases maintained by Enterprise Computing Center - Detroit (ECC-DET) and shipped to SEQ.

  2. Title 31 cases are selected from the closed case Title 31 database using the NQ interface and shipped to SEQ.

Unagreed Appeals Case Selection Procedures

  1. The ERCS Sample Review program may select unagreed cases for review.

  2. Technical Services is responsible for sending physical unagreed Appeals cases and physical unagreed Appeals cases with at least one agreed/no-change year that are selected for sample review to the appropriate review site. These cases are high priority and procedures are established to ensure their timely review. Refer to IRM 4.8.2.3.4, Technical Services, Case Processing for more information.

  3. For unagreed electronic Appeals cases and unagreed electronic Appeals cases with at least one agreed/no-change year, the FEQ support staff will add the electronic case to review inventory.

  4. When “open” cases are transmitted to the review site by Technical Services, they should be updated on ERCS to Status Code 23, Sample Review, and Review Type 33.

  5. Reviewers will complete their review of the open unagreed case within 10 business days and return the physical case files to Technical Services via ground mail service. When the review of an electronic case file has been completed, the group manager will e-mail the Technical Services manager to advise that the review is complete.

  6. Appeals policy requires a non-docketed case to have at least 365 days remaining on the statute of limitations as of the date the case is received in Appeals. If a non-docketed case is selected for sample review, there must be an additional 30 days on the statute of limitations to allow for the completion of the review and for Technical Services to receive and prepare the case for closing to Appeals. A minimal date check is sketched at the end of this subsection.

  7. Cases that do not meet this criterion will be deselected and returned to Technical Services.

  8. Reviewers prepare an Appeals Advisory Memo to Technical Services when tax application/computation errors are found or if any taxpayer confidentiality issues are discovered. Technical Services decides whether to forward the case to Appeals or return it to the group.
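
    Example:

    A minimal sketch, in Python, of the date check described in (6): at least 365 days remaining as of receipt in Appeals, plus 30 days for the review. The function and its inputs are illustrative and do not reflect any actual ERCS or NQRS field.

      from datetime import date

      REQUIRED_DAYS = 365 + 30  # Appeals minimum plus time for review and case closing

      def meets_statute_criterion(statute_expiration: date, as_of: date) -> bool:
          """True if at least 395 days remain on the statute of limitations."""
          return (statute_expiration - as_of).days >= REQUIRED_DAYS

      print(meets_statute_criterion(date(2024, 12, 31), date(2023, 6, 1)))  # True: 579 days remain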

Defaulted Case Selection Procedures

  1. The ERCS Sample Review program may also select unagreed cases closing for issuance of statutory notice of deficiency as part of the random sample of cases for review.

  2. Technical Services will affix the sample selection sheet to these case files and update all returns with an account transfer out freeze code "M".

  3. If the case defaults, Technical Services will send the case to CCP. The freeze code "M" along with the Sample Selection Sheet will alert CCP that the case must be sent to the appropriate review site.

  4. CCP will update the case to Status Code 90 (Closed), remove the freeze code "M", and forward the physical case(s) to FEQ support staff. For electronic case files, when CCP updates the case to Status Code 90 (Closed) and removes the freeze code "M", the FEQ support staff will add the case to unassigned inventory.

Shipping Physical Case Files Selected for Quality Review

  1. Physical cases selected for review should be transmitted to their respective review site.

  2. Field Exam physical cases are shipped to FEQ support staff.

  3. Specialty Exam physical cases are shipped to SEQ support staff.

  4. After screening, physical case files are shipped directly to reviewers.

  5. Closed case files should remain intact after they leave the Employment, Estate and Gift and Excise groups and Technical Services. Dismantling, purging, or discarding documents from a case file could negatively affect the case if legal actions are pursued.

  6. A separate Form 3210, Document Transmittal, shall be attached to the closed case files. Each selected case shall include the full physical case file.

Sample Select Case Control Procedures

  1. Each review site will maintain an inventory control system. This will facilitate an orderly flow of case files and supporting documents between closing units, the review site, and the reviewer.

  2. All closed (Status Code 90) physical case files along with Form 3210 are transported via ground shipment for final disposition.

Case Review Consistency

  1. The reviewer’s case evaluation must be accurate and consistent to provide reliable and meaningful results.

  2. The FSEQ front-line manager should periodically perform consistency checks to ensure consistent and accurate application of the quality attributes and accurate data input.

  3. The FSEQ front-line manager may conduct consistency reviews in several ways, including:

    • Have each reviewer independently review the same case and discuss any inconsistencies in attribute rating

    • Critique completed case reviews and provide feedback to reinforce expectations of the review outcomes

    • Utilize NQRS reports, including reviewer narratives, to evaluate consistency, ensure guidelines are followed, and ensure the narratives are clearly and professionally written

    • Hold group meetings to discuss specific attributes and case scenarios

  4. Results of the consistency reviews are maintained and updated as warranted.

Use and Limitations of National Quality Review Data

  1. The fundamental purpose of the National Quality Review program is to provide an overall organizational assessment of case quality.

  2. Quality review results are statistically valid and reliable measurements of the overall quality of casework completed by SB/SE Field Exam only at the Area level. Specialty Exam national quality results are statistically valid at the program level. Results stratified to any lower organizational segment are not statistically reliable measurements of the quality of casework at those levels.

  3. Lower organizational segment stratifications are indicators. They should be relied upon only to the extent that they are confirmed by other reliable management measures of quality.

  4. The design and format of quality review reports within NQRS as well as access to the reports and data will be determined by the Field and Specialty Exam program manager.

  5. No attempt should be made to associate specific review results to a particular case.

  6. Review data is used to assess program performance and will not be used to evaluate individual employee performance. Any feedback or other data generated from NQRS will not be used as a substitute for EQRS case reviews, on the job visits, workload or any other reviews.

Field Exam Case Return Criteria

  1. FSEQ reviewers will follow guidance found in the Technical Services IRM 4.8.2.9, Returning Cases to the Field, which outlines return criteria for cases with potential for significant impact to taxpayer compliance or to tax revenues.

Specialty Exam Case Return Criteria

  1. SEQ reviewers will also follow guidance found in the Technical Services IRM 4.8.2.9, Returning Cases to the Field, which outlines return criteria for cases with potential for significant impact to taxpayer compliance or to tax revenues.

  2. Additional guidance is provided for SEQ for cases that do not close to Technical Services.

  3. The SEQ reviewer will take the following actions for Employment, Estate and Gift, and Excise cases meeting the case return criteria found in IRM 4.8.2.9.1, Case Return Criteria:

    • Prepare a case return memo. The memo will address the facts, law, and recommended actions needed for the case.

    • Consult with a Specialty Exam Policy Subject Matter Expert (SME), if warranted, to resolve any technical issues relating to the case.

    • Forward the case return memo to the SEQ front-line manager for concurrence.

  4. The SEQ front-line manager will send the case return memo to the quality analyst for concurrence.

  5. The quality analyst will send the case return memo to the Specialty field front-line manager, with a copy to the territory manager, the program policy analyst responsible for quality, and the field technical advisor.

  6. The Specialty field front-line manager will provide the SEQ front-line manager with shipping instructions.

  7. The SEQ reviewer will ship the case per instructions from the Specialty field front-line manager, following current shipping guidelines.

    Note:

    Document all case return actions in the case file.

  8. The returned case will remain in Status Code 90, Closed. The decision to act on the SEQ reviewer's recommendations is the responsibility of the Specialty field front-line manager and territory manager. The Specialty field front-line manager is responsible for shipping the case to files once any needed actions are completed. Document all follow-up actions in the case file.

National Standard Time Frames for Case Action

Activity - Type of exam action or activity measured

Days - Maximum number of calendar days permitted for the exam action or activity

Measured From - Start of the exam action or activity

Measured To - End of the exam action or activity

The national recommended standard timeframes (unless noted, measured in calendar days) are shown in the table below:

Activity | Program | Days | Measured From | Measured To
Start Examination | Field Exam, Excise, Employment, BSA | 45 | First action | First appointment
Contact | Estate and Gift | 45 | Examiner’s receipt of case | Date examiner sends an initial contact letter to the taxpayer, with a copy to the representative, or surveys the assigned case
Significant Activity | Field and Specialty Exam | 45 | Last significant action | Next significant activity
Response to call | Field and Specialty Exam | 1 business day | Taxpayer or representative telephone call | Return telephone call to the taxpayer or representative
Response to correspondence | Field and Specialty Exam | 14 | Receipt of correspondence or documentation from the taxpayer or representative | Follow-up response provided to the taxpayer or representative
POA Processing | Specialty Exam | As soon as possible, or within 24 hours of the receipt date | Receipt of Form 2848 | Submission to the CAF Unit for processing
Agreed/No Change Case Closing | Field Exam | 10 | Date the report is received or the date the no-change status is communicated to the taxpayer | Date the case is closed from the group
Agreed/No Change Case Closing | Estate and Gift, Excise | 30 | Date the report is received or the date the no-change status is communicated to the taxpayer/financial institution | Date the case is closed from the group
Agreed/No Change Case Closing | Employment | 20 | Date the report is received or the date the no-change status is communicated to the taxpayer | Date the case is updated to Status Code 51 and closed from the group
Agreed/No Change Case Closing | BSA | 20 | Date the closing letter is finalized | Date the case is closed from the group
Agreed cases with unpaid proposed assessments of $100,000 and greater | Field and Specialty Exam | 4 | Date the report is received | Date the case is closed from the group
Unagreed Case Closing | Field and Specialty Exam | 20 | Date the 30-Day Letter defaults or the date the request for an Appeals conference is received | Date the case is closed from the group
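
Example:

A minimal sketch, in Python, of the table above as a lookup keyed by activity and program. Only a few rows are shown, and the mapping is illustrative, not an authoritative implementation of the standards.

    from datetime import date

    # A few (activity, program) rows from the table above, in calendar days.
    STANDARD_DAYS = {
        ("Start Examination", "Field Exam"): 45,
        ("Significant Activity", "Field and Specialty Exam"): 45,
        ("Response to correspondence", "Field and Specialty Exam"): 14,
        ("Agreed/No Change Case Closing", "Estate and Gift"): 30,
    }

    def within_standard(activity: str, program: str, start: date, end: date) -> bool:
        """True if the elapsed calendar days do not exceed the national standard."""
        return (end - start).days <= STANDARD_DAYS[(activity, program)]

    print(within_standard("Response to correspondence", "Field and Specialty Exam",
                          date(2022, 7, 1), date(2022, 7, 12)))  # True: 11 days elapsed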