4.2.8 Guidelines for SB/SE National Quality Review

Manual Transmittal

March 05, 2018

Purpose

(1) This transmits new IRM 4.2.8, Examining Process, General Examining Procedures, Guidelines for SB/SE National Quality Review.

Material Changes

(1) This new IRM section incorporates all relevant procedural guidance relating to National Quality Review found in the following IRM sections:

  • IRM 4.8.3, Technical Services, Examination Quality Measurement Staff (EQMS)

  • IRM 4.8.7.3, Examination Quality Measurement Staff (EQMS) Responsibilities

  • IRM 4.25.1.8, (Estate and Gift) Specialty Programs National Quality Review Program

  • IRM 4.26.18, Bank Secrecy Act, Embedded Quality

  • IRM 4.24.7, Excise Embedded Quality Review System (EQRS) and Excise National Quality Review System (NQRS)

Listed below are the IRM cites in the above sections that were incorporated into this new IRM:

New IRM Reference | Source | Comment
4.2.8.1 Program Scope and Objectives | 4.8.3.1(2), 4.8.3.1.1(1), 4.24.7.2.1, 4.26.18.10(1) and new content | New content added to include information involving internal controls to comply with rules and requirements under IRM 1.11.2.1.
4.2.8.1.1 Quality Review Program Authority | 4.24.7.1.2, 4.24.7.1.6(3), 4.24.7.2.1 and new content | Included RRA 98 regulations.
4.2.8.1.1.1 Taxpayer Bill of Rights (TBOR) | New content | Overview of TBOR.
4.2.8.1.2 Field and Specialty Exam Quality - Roles and Responsibilities | New content | Overview of stakeholders and overall system responsibilities.
4.2.8.1.3 National Quality Review Program | 4.8.3.1.1(1), 4.24.7.1.4(3), 4.26.18.1(4) and new content | Overview of program.
4.2.8.1.4 Terms and Acronyms | New content | Added definitions.
4.2.8.1.5 Field and Specialty Exam Job Aids | 4.8.3.9.1(1), 4.24.7.1.6(2), 4.24.7.10.1, 4.25.1.8.4, 4.26.18.3(1), 4.26.18.9(1), (2), 4.26.18.14(4) and new content | Expanded to include roles and responsibilities relating to Job Aids and updated hyperlinks.
4.2.8.2 Field and Specialty Exam Quality - Program Manager Responsibilities | New content | Detailed responsibilities.
4.2.8.2.1 Field and Specialty Exam - Quality Analyst Responsibilities | New content | Detailed responsibilities.
4.2.8.3 National Quality Review Manager Responsibilities | New content | Detailed responsibilities.
4.2.8.4 National Reviewer Responsibilities | 4.8.3.3(2), 4.8.3.9(1), 4.8.3.11(2), 4.24.7.4, 4.26.18.2(3) and (4), 4.26.18.8(1), 4.26.18.8(3), (4), 4.26.18.10(2) | Updated and clarified source material.
4.2.8.5 Overview of The National Quality Review System (NQRS) | 4.8.3.12(1), 4.8.3.1(3), 4.8.3.1(5), 4.8.3.1.1(3), 4.8.3.12(4), 4.8.7.3, 4.24.7.13, 4.25.1.8, 4.26.18.1(3), 4.26.18.10(3), (4) | Expanded content and added hyperlink.
4.2.8.6 Quality Attributes | 4.8.3.5(1), (4), 4.8.3.5.1(1), (2), 4.8.3.5.2(4), 4.24.7.1.5, 4.24.7.6(2), 4.25.1.8.3, 4.26.18.1(1), 4.26.18.3(1), (3), (4), (5), 4.26.18.8(2) | Updated and clarified source material.
4.2.8.7 National Quality Review Process and Completion of Data Collection Instrument (DCI) | 4.8.3.9.2(1), (2), (3), 4.8.3.11(1), (3), (4), 4.24.7.10.2, 4.24.7.12, 4.25.1.8.5, 4.26.18.11(1) | Updated and clarified source material.
4.2.8.7.1 Review of Specialty Exam Electronic Case File | New content | Electronic case file review procedures.
4.2.8.7.2 Review of the CEAS Case File | New content | Electronic case file review procedures.
4.2.8.7.3 National Quality Review Scoring System | 4.8.3.5.2(1), (2), 4.24.7.6.1, 4.26.18.4(1) | Updated and clarified source material.
4.2.8.7.4 DCI Header Input Procedures | 4.8.3.6(1), 4.8.3.9.2, 4.26.18.11(2), (3) | Updated and expanded on source material.
4.2.8.7.4.1 DCI Process Measures Fields | 4.8.3.6(2), 4.24.7.7 | Updated and clarified source material.
4.2.8.7.5 Evaluating and Coding the Attributes | 4.26.18.11(7) and new content | Updated and clarified source material.
4.2.8.7.6 Attribute Narratives and Reason Codes | 4.8.3.9.2(5), 4.26.18.11(4), (5), (6) and new content | Updated and clarified source material.
4.2.8.7.6.1 Guidelines for Attribute Narratives | New content | Added narrative writing guidance.
4.2.8.8 National Quality Review Case Selection Procedures | 4.8.3.7(1), (2), 4.8.3.7.1(1), (2), 4.24.7.9 and Exhibit 4.8.3-2 | Updated and expanded on source material.
4.2.8.8.1 Protested and Unagreed Appeals Case Selection Procedures | 4.8.3.7, 4.8.3.7.1(3), (4), (6), Exhibit 4.8.3-2 and Exhibit 4.8.3-3 | Updated and expanded on source material.
4.2.8.8.2 Defaulted Case Selection Procedures | Exhibit 4.8.3-3 | Added due to reorganization and to add clarity.
4.2.8.8.3 Shipping Sample Select Cases | New content | Added due to reorganization and to add clarity.
4.2.8.8.4 Sample Select Case Control Procedures | 4.26.18.6 and 4.26.18.7 | Updated and expanded on source material due to reorganization.
4.2.8.8.5 BSA Case Selection Procedures | 4.26.18.2(2) | Updated due to reorganization.
4.2.8.9 Field Exam Case Sampling Criteria | 4.8.3.8(1), (2) | Updated and expanded on source material.
4.2.8.9.1 Specialty Exam Case Sampling Criteria | 4.24.7.9 and new content | Added due to reorganization and to add clarity.
4.2.8.10 Guidelines for Consistent Case Reviews | 4.8.3.10, 4.26.18.12(1), (2), (3) | Updated and expanded on source material.
4.2.8.10.1 Conducting Consistency Case Reviews | 4.8.3.10(1), 4.8.3.10.1(1), 4.24.7.11, 4.24.7.11.1, 4.26.18.13(1) | Updated and expanded on source material.
4.2.8.11 Use and Limitations of National Quality Review Data | 4.8.3.1(6), 4.8.3.1.1(2), 4.8.3.12, 4.8.3.13(1), 4.8.3.14(1), 4.24.7.14, 4.24.7.15, 4.25.1.8.7, 4.26.18.14(1), 4.26.18.15(1), (2) | Updated and expanded on source material.
4.2.8.12 Case Return Criteria | 4.8.3.14(2) | Updated and expanded on source material.
Exhibit 4.2.8-1 Quality Attributes Rated by Field and Office National Quality Reviewer | Exhibit 4.8.3-1 | List of attributes rated for cases selected for review from Field and Office Exam.
Exhibit 4.2.8-2 Quality Attributes Rated by Excise National Quality Reviewer | Exhibit 4.24.7-1 and new content | List of attributes rated for cases selected for review from Excise Tax.
Exhibit 4.2.8-3 Quality Attributes Rated by Employment Tax National Quality Reviewer | New content | List of attributes rated for cases selected for review from Employment Tax.
Exhibit 4.2.8-4 Quality Attributes Rated by Estate and Gift National Quality Reviewers | New content | List of attributes rated for cases selected for review from Estate and Gift Tax.
Exhibit 4.2.8-5 Quality Attributes Rated by Bank Secrecy Act National Quality Reviewers | New content | List of attributes rated for cases selected for review from Bank Secrecy Act.
Exhibit 4.2.8-6 Time Frames for Case Action | Exhibit 4.8.3-1 Attribute 510 | Updated and clarified content to include specific Field and Specialty Exam timeliness criteria.

Effect on Other Documents

Upon publication of this IRM section, IRM 4.8.3, IRM 4.24.7, and IRM 4.26.18 will be obsoleted. IRM 4.8.7.3, Examination Quality Measurement Staff (EQMS) Responsibilities, will be updated.

Audience

Small Business/Self-Employed (SB/SE) Field and Specialty Exam Employees.

Effective Date

(03-05-2018)

Lori Caskey
Director, Exam Quality and Technical Support
Small Business/Self-Employed

Program Scope and Objectives

  1. Purpose. This IRM provides specific information and procedural guidance relating to national quality review for Field and Specialty Exam Quality located in Small Business/Self-Employed (SB/SE).

    Note:

    Information and guidelines for Examination Campus National Quality Review may be found in IRM 21.10.1, Quality Assurance - Embedded Quality (EQ) Program.

  2. Audience. The procedures found in this IRM apply to SB/SE employees who are responsible for conducting national quality reviews utilizing the National Quality Review System (NQRS) and analyzing data from these reviews.

  3. Policy Owner. The policy owner is the Director, Exam Quality and Technical Support (EQ&TS), under Headquarters Exam.

  4. Program Owner. The Program Manager, Field and Specialty Exam Quality is responsible for overseeing the Field and Specialty Exam national quality review program in SB/SE.

  5. Primary Stakeholders. Field and Specialty Exam Quality Staff, Field and Specialty Exam Quality Analysts, Field and Specialty Exam Program management in SB/SE.

  6. Program Goals. Field and Specialty Exam Quality supports the Embedded Quality (EQ) improvement program in SB/SE. Embedded Quality is designed to create a link between individual performance and organizational goals. This linkage is achieved through a common set of attributes used by both National Quality Reviewers (defined as reviewers in this IRM), who use NQRS, and front line managers, who use the Embedded Quality Review System (EQRS), to rate the quality of employee case work.

  7. For more information regarding front line manager use of EQRS, see IRM 1.4.40.3.7, Performance Feedback.

Quality Review Program Authority

  1. The requirement for an organizational measure of quality for the IRS was established under CFR 801.6(b), Quality measures, as part of the Restructuring and Reform Act of 1998 (RRA 98).

  2. CFR 801.6(b) states that quality measures focus on whether IRS personnel:

    • Devoted an appropriate amount of time to a matter

    • Properly analyzed the facts of the situation

    • Complied with statutory, regulatory and IRS procedures

    • Took timely actions

    • Provided adequate notification and made required contacts with taxpayers

  3. IRS developed specific measurement criteria, referred to as quality attributes, which are used to evaluate the quality of case work.

  4. The overall quality measure is based on rated attributes and serves as the Balanced Measure score for Business Results – Quality. This overall quality score is reported to various levels of the organization and to outside stakeholders such as Congress.

Taxpayer Bill of Rights (TBOR)
  1. The Taxpayer Bill of Rights (TBOR) lists the fundamental rights taxpayers have when working with the IRS, including a right to quality service. Consideration of these rights in every interaction with taxpayers helps to reinforce the fairness of the tax system. All IRS employees must:

    • Be informed about taxpayer rights

    • Be conscientious in the performance of their duties to honor and respect those rights

    • Communicate effectively those rights that aid in reducing taxpayer burden

    • Administer the law with integrity and fairness

    • Exercise professional judgment in conducting enforcement activities

    Refer to Pub 1, Your Rights as a Taxpayer, for more information on TBOR.

  2. See IRM 4.10.1.2, Taxpayer Rights, for specific information on each of the rights contained in TBOR.

Field and Specialty Exam Quality - Roles and Responsibilities

  1. The major stakeholders involved in the efficient operation of the Field and Specialty Exam Quality Program are:

    • Field and Specialty Exam Quality Review staff

    • Field and Specialty Exam Analysts

    • Examination - Field and Campus Policy

    • Specialty Exam Policy

    • Specialty Exam Program

    • Field Exam Program

  2. Operations Support, with assistance from Technology Solutions, Collection Systems, provides core information technology management and support services for both EQRS and NQRS. They are responsible for:

    • Ensuring compliance with the Federal Information Security Management Act (FISMA).

    • Managing Unified Work Requests (UWR) for system updates and changes.

    • Leading the development of enhanced data and computer security process and controls.

  3. Operations Support, in collaboration with EQRS and NQRS Site System Coordinators (SSC), has oversight of the Online Form 5081, Information User Registration/Change Request, which is used to permit access to the EQRS and NQRS applications. See http://mysbse.web.irs.gov/sbseorg/eq/syscoordguidance/default.aspx for more information regarding SSC responsibilities.

National Quality Review Program

  1. Field and Specialty Exam Quality review measures focus on whether the examiner:

    • Provided proper and timely service to the taxpayer

    • Analyzed the facts properly

    • Applied the law correctly

    • Protected taxpayer rights by following applicable IRS policies and procedures including timeliness, adequacy of notifications, and required contacts with taxpayers

    • Reached the appropriate determination regarding liability for tax and ability to pay

  2. Field and Specialty Exam Quality generates quarterly performance reports for stakeholders. The reports are used to:

    • Establish baselines to assess program performance

    • Identify the strengths and weaknesses in the quality of work performed

    • Identify specific training/educational needs

    • Identify opportunities to improve work processes

    • Analyze causes for failures

    • Measure the success of quality improvement efforts

Terms and Acronyms

  1. The following table contains commonly used terms and acronyms:

    Term/Acronym | Definition
    BSA | Bank Secrecy Act
    CCP | Centralized Case Processing
    CEAS | Correspondence Examination Automation Support
    CJE | Critical Job Element
    DCI | Data Collection Instrument
    EQRS | Embedded Quality Review System
    EQ&TS | Exam Quality & Technical Support
    ERCS | Examination Returns Control System
    IMS | Issue Management System
    ITAMS | Information Technology Asset Management System
    NQRS | National Quality Review System
    SB/SE | Small Business/Self Employed business unit
    SB/SE Field Exam | Cases selected for quality review from Revenue Agents, Tax Compliance Officers and Tax Auditors located in Field Examination
    SB/SE Specialty Exam | Cases selected for quality review from Revenue Agents, Tax Compliance Officers, Attorneys, Revenue Officer Examiners and Fuel Compliance Agents located in Specialty Examination
    SPRG | Specialized Product Review Group
    UWR | Unified Work Request

Field and Specialty Exam Job Aids

  1. Field and Specialty Exam Job Aids are designed to be reference tools for Field and Specialty Exam management and Field and Specialty Exam Quality review staff to aid in rating the quality attributes in a uniform and consistent manner. Guidelines in the Job Aids align the EQ concepts to current Field and Specialty Exam procedures. IRM references support each quality attribute.

  2. Headquarters Examination, Examination Field and Campus Policy is responsible for ensuring the consistency of the Field and Office Exam Job Aids and training materials with the IRM and other guidelines.

  3. Headquarters Examination, Specialty Policy is responsible for ensuring the consistency of the Specialty Exam Job Aids and training materials with the IRM and other guidelines.

  4. The Field Exam Job Aids are located on the Embedded Quality web page http://mysbse.web.irs.gov/examination/examinationeq/default.aspx.

  5. The Specialty Exam Job Aids are located on the Specialty Embedded Quality web page http://mysbse.web.irs.gov/examination/examinationeq/specialtyeq/default.aspx.

Field and Specialty Exam Quality - Program Manager Responsibilities

  1. The Field and Specialty Exam Quality Program Manager is responsible for all aspects of the national quality review program. Primary responsibilities include:

    • Coordinating issues relating to interpreting and rating the quality measures with stakeholders

    • Overseeing and allocating resources for Field and Specialty Exam Quality

    • Analyzing NQRS data to drive organizational improvement and to identify factors influencing quality scores

    • Coordinating annually with SB/SE Research in developing the case review sample plan for Field and Specialty Exam

    • Ensuring that case review inventory is sufficient for each Field and Specialty Exam Area or program based on the sample plan

    • Coordinating and monitoring the quality reviews of completed Field and Specialty Exam cases to ensure the national sampling plan is met and is statistically valid.

    • Providing recommendations to enhance NQRS

    • Providing quality review data and/or analysis to internal/external stakeholders on an ad hoc or recurring basis

    • Coordinating with stakeholders in the development of attributes and requirements for quality reviews

    • Establishing baselines to measure, monitor, and improve reviewer accuracy and consistency

Field and Specialty Exam Quality - Analyst Responsibilities

  1. The Field and Specialty Exam Quality Analyst is responsible for:

    • Developing and issuing quality performance reports

    • Coordinating with stakeholders in their quality improvement initiatives

    • Providing quality review data and/or analysis to internal/external stakeholders on an ad hoc or recurring basis

    • Coordinating with stakeholders in the development of attributes and requirements for quality reviews and responding to issues raised

    • Coordinating with stakeholders in monitoring and updating Job Aids, instructional guides and quality review procedures in accordance with IRM and program guidelines

    • Developing and clarifying review criteria and procedures to ensure consistent application

    • Maintaining the Field and Specialty Exam Quality SharePoint sites including monitoring access

  2. The Quality Analyst assigned to Specialty Exam is also responsible for:

    • Maintaining Specialty Exam program EQRS and NQRS database reference tables via On Line Matrix web application

    • Coordinating Specialty Exam program end user support and system support to maintain the EQRS/NQRS programs

    • Supporting the Embedded Quality (EQ) Summits for Specialty Exam and Specialty Exam EQRS reporting to the National Treasury Employees Union (NTEU) to meet the requirements of Section 12 Article 12 of the National Agreement

National Quality Review Manager Responsibilities

  1. National Quality Review Manager responsibilities include:

    • Providing guidance for program objectives

    • Ensuring that reviewers understand and adhere to program guidelines

    • Ensuring accurate and consistent application of the quality attributes to provide reliable and meaningful results

    • Recommending updates to the quality attributes and process measures

    • Providing input during the attribute development or update process

    • Maintaining instructional guides for national quality reviewers

    • Collaborating with stakeholders to identify quality improvement opportunities, share best practices and provide support for improvement actions

    • Disseminating information on trends and issues that may have nationwide impact

    • Adhering to the annual sample plan, monitoring volumes of cases provided for review and identifying shortages, contacting the responsible function to determine the reason for any shortages and assisting in resolution

    • Ensuring that a system is followed for selecting a representative number of cases from each Area or Specialty Exam program where appropriate

    • Providing a regular systemic approach to measurement and monitoring of reviewer consistency

    • Ensuring the accuracy of all information input during reviews and correcting information as necessary

    • Critiquing completed reviews on a regular basis and providing meaningful feedback to reinforce expectations for quality case reviews

    • Reviewing attribute narratives on a regular basis to ensure guidelines are followed and the end product is professional

    • Conducting group meetings to discuss specific attributes and case scenarios

    • Reviewing and approving case returns to the field, see IRM 4.2.8.12 for more information.

National Reviewer Responsibilities

  1. Reviewers are responsible for evaluating examination case quality and collecting process measures data by conducting reviews of completed SB/SE Field and Specialty Exam cases.

  2. Reviewers record data and provide narrative feedback on the quality of case work and assist in the assessment of long term trends of system performance.

  3. It is important that reviewers have an in-depth understanding of the national quality attributes and process measures, as well as a thorough understanding of the tax law, IRM requirements and procedures. Enterprise Learning Management System (ELMS) course number 67127, SBE-SP-EQ National Quality Review System (NQRS), covers the basics of using NQRS.

  4. Reviewers are responsible for:

    • Accurately and consistently applying the attributes utilizing the appropriate Job Aid, and current IRM

    • Timely completion of case reviews using the Data Collection Instrument (DCI) and input into the NQRS database

    • Inputting accurate header and process measures information

    • Determining the appropriate reason code(s) for a not met attribute rating

    • Writing clear and meaningful attribute narrative comments for a not met attribute rating

    • Elevating potential conflicts in the IRM and the Job Aid for resolution

    • Assisting in data analysis as warranted

Overview of The National Quality Review System (NQRS)

  1. Organizational performance, measured at the National, Exam Area, and Specialty Exam Program levels, is evaluated using quality attributes that identify actions that move cases toward closure through appropriate and timely case activity.

  2. The National Quality Review System (NQRS) is an automated web based system used to conduct independent case reviews from a statistically valid sample of examination casework. Analysis of reports generated from NQRS provides information used to evaluate organizational processes, procedures, successes and areas of improvement.

  3. The NQRS database is accessed through the Embedded Quality home page at http://mysbse.web.irs.gov/examination/examinationeq/default.aspx.

  4. Users must complete an Online Form 5081 to obtain a password to access NQRS.

Quality Attributes

  1. Quality attributes are guidelines to assist examiners in fulfilling their professional responsibilities. Quality attributes contain objective criteria against which case quality is evaluated. They are statements of SB/SE's expectations for quality examinations.

  2. There are specific attributes utilized by reviewers assigned to:

    • Field Exam

    • Excise Tax (EX) Program

    • Employment Tax (ET) Program

    • Estate and Gift Tax (EG) Program

    • Bank Secrecy Act (BSA) Program

  3. The quality attributes are organized into measurement categories which allow quality scores to be generated based on the following criteria:

    • Timeliness - resolving issues in the most efficient manner through proper time utilization and workload management techniques

    • Professionalism - promoting a positive image of the Service by using effective communication techniques

    • Regulatory Accuracy - adhering to statutory/regulatory process requirements

    • Procedural Accuracy - adhering to internal process requirements

  4. Attributes can also be grouped into the following measurement categories:

    • Planning

    • Income Determination (Field Exam)

    • Investigative/Audit Techniques

    • Timeliness

    • Customer Relations/Professionalism

    • Documentation/Reports

National Quality Review Process and Completion of Data Collection Instrument (DCI)

  1. Reviews are performed on most types of cases worked by Field and Specialty Exam.

  2. The DCI provides the principal documentation for the reviewer’s case evaluation and conclusions. A DCI is completed for each case review in NQRS. Reviewers must ensure that all entries on the DCI are accurate and records are not duplicated.

  3. Reviewers will review one case at a time to completion before starting another case review.

  4. Steps in the review process include:

    1. Review of the case

    2. Input the data and write narratives where applicable

    3. Review the DCI for accuracy and narrative quality

    4. Edit the DCI as necessary

    5. Complete case review

Review of Specialty Exam Electronic Case File

  1. Specialty Exam reviewers in the Excise, Employment and Estate and Gift programs will review the electronic case file in addition to the physical case file.

  2. Electronic case files are located on the Issue Management System (IMS) Team Website.

  3. Users must complete an Online Form 5081 to obtain access to the IMS Team Website.

Review of the Correspondence Examination Automation Support (CEAS) Case File

  1. Currently, for Field Exam reviewers, the physical case file is the primary source for case reviews. Using the View Case function, reviewers can also view certain case information stored on the CEAS server that may be missing from the physical file, such as the activity record, lead sheets, correspondence, and audit reports.

  2. Documents may be found in CEAS that aren’t in the physical file because they were not printed or were inadvertently removed. There may be indications in the physical case file that documents exist in CEAS. Reviewers should access CEAS to determine if the information is available.

  3. CEAS is not needed in every case review.

    Note:

    Electronic files found in CEAS reflect the last back-up performed when the case was closed from the group.

National Quality Review Scoring System

  1. The scoring system weights all attributes equally. Each attribute is rated as Yes, No, or in some instances Not Applicable.

  2. The quality score is the number of Yes ratings divided by the total number of Yes and No ratings, expressed as a percentage. A maximum score of 100 is possible for each case.
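As a minimal sketch of this calculation (the function name and interface are illustrative assumptions, not part of NQRS), the scoring can be expressed as:

```python
def quality_score(yes_count: int, no_count: int) -> float:
    """Illustrative NQRS-style quality score for a single case.

    All rated attributes are weighted equally; attributes rated
    Not Applicable are excluded from the computation.
    """
    rated = yes_count + no_count
    if rated == 0:
        raise ValueError("at least one attribute must be rated Yes or No")
    return 100 * yes_count / rated

# Example: 18 attributes rated Yes and 2 rated No.
print(quality_score(18, 2))  # 90.0
```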

DCI Header Input Procedures

  1. The first input section of the DCI is the entry of header fields that capture basic case information. The bold header fields are mandatory and must be entered to complete the DCI.

  2. Header information is categorized into four groupings:

    • Review information - specific information about the review itself

    • Case information - basic case information

    • Process measures - case actions taken by the examiner

    • Special Use - special tracking for local or national purposes

DCI Process Measures Fields
  1. Process measures information fields on the DCI are used to measure the efficiency of the examination process.

  2. Information regarding specific tasks performed during the examination, how these tasks were completed, key dates, delays in activities and hours associated with the case may be analyzed in conjunction with the quality attributes.

Evaluating and Coding the Attributes

  1. Reviewers evaluate case work utilizing attributes specific to their Specialized Product Review Group (SPRG).

  2. Attributes listed on the DCI are either required or optional. All required attributes (in bold) must be rated Yes (Met) or No (Not Met) in order for the DCI to be recorded as complete in NQRS. Optional attribute fields should also be completed when applicable, based on information in the case, to ensure an accurate review.

Reason Codes and Attribute Narratives

  1. When a quality attribute is rated not met, at least one reason code must be selected that supports the not met rating. Most of the quality attributes rated in NQRS have reason codes associated with them. The reason code selected should be the one that best applies to the not met rating. If no reason code describes the not met rating, select Other, if available.

  2. Reviewers should contact their manager when Other is used regularly as a reason code, to determine if additional reason codes should be added to NQRS.

  3. In addition to selecting a reason code for each attribute rated not met, the reviewer must write a narrative in the attribute narrative box on the DCI describing the not met attribute rating.
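The reason code and narrative requirements above can be sketched as a simple validation check. The data shapes and function below are hypothetical illustrations only; the actual NQRS application enforces these rules itself:

```python
def not_met_entries_valid(ratings, reason_codes, narratives):
    """Check that every attribute rated "Not Met" carries at least one
    reason code and a non-empty narrative.

    ratings:       dict of attribute name -> "Met" / "Not Met" / "N/A"
    reason_codes:  dict of attribute name -> list of selected reason codes
    narratives:    dict of attribute name -> narrative text
    (Hypothetical structures assumed for this sketch.)
    """
    for attr, rating in ratings.items():
        if rating == "Not Met":
            if not reason_codes.get(attr):
                return False  # at least one reason code is required
            if not narratives.get(attr, "").strip():
                return False  # a descriptive narrative is required
    return True

# A Not Met rating with a reason code and narrative passes the check;
# the same rating with no narrative does not.
print(not_met_entries_valid(
    {"Attribute 510": "Not Met"},
    {"Attribute 510": ["Other"]},
    {"Attribute 510": "Initial taxpayer contact was not made timely."},
))  # True
```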

Guidelines for Writing Attribute Narratives
  1. Reviewers need to be thorough in documenting case reviews, providing clear, concise, and specific descriptions of any errors. Attribute narratives that are too general do not offer sufficient detail to allow for specific recommendations for improvement.

  2. Reviewers must avoid using canned statements and unmeasurable adjectives in their narratives.

  3. Attribute narratives should:

    • Clearly state the facts of actions taken that resulted in the attribute rating

    • Identify the nature of the error in the first sentence of the narrative

    • Indicate what was not done, not what should have been done

    • Evaluate the case, not the examiner

National Quality Review Case Selection Procedures

  1. The Examination Returns Control System (ERCS) Sample Review program automates the process of randomly selecting a valid sample of case closures for review.

  2. The sample size is statistically valid at the Field Exam Area level and the Specialty Exam Program level. The sample plan is developed annually by SB/SE Research and is based on projected fiscal year closures for each SB/SE program.

  3. Cases meeting the sample criteria are selected by the ERCS Sample Review program at the proper rate for the Field Exam Area and for three of the Specialty Exam Programs (Excise, Employment, Estate and Gift). Cases are subject to the sample at the point they move from Status Code 51 or 21 on ERCS. When a case is selected for the sample, the ERCS user is notified to print the Sample Selection Sheet to place on the file.

    Note:

    BSA sample selection procedures are listed below.

  4. Centralized Case Processing (CCP) is responsible for updating sample selected cases to Status Code 90 and sending them to the appropriate site for review using ground mail service.
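The selection step can be illustrated with a rate-based random draw. The actual ERCS Sample Review program's mechanics and rates are internal, so the per-case draw, function name, and parameters below are assumptions for illustration only:

```python
import random

def select_for_review(closing_cases, sample_rate, seed=None):
    """Flag a random subset of closing cases for national quality review.

    Each closing case is independently selected with probability
    `sample_rate` (an assumed Bernoulli draw; the real program's
    selection method is not described in this IRM).
    """
    rng = random.Random(seed)
    return [case for case in closing_cases if rng.random() < sample_rate]

cases = [f"case-{n}" for n in range(100)]
selected = select_for_review(cases, sample_rate=0.1, seed=42)
# Every selected case comes from the closing inventory.
assert set(selected) <= set(cases)
```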

Unagreed Appeals Case Selection Procedures

  1. The ERCS Sample Review program may select unagreed cases as part of the random sample of cases for review.

  2. Technical Services is responsible for sending unagreed Appeals cases and unagreed Appeals cases with at least one agreed/no-change year that are selected for sample review to the appropriate review site. These cases are high priority and procedures are established to ensure their timely review. Refer to IRM 4.8.2.3.4, Technical Services, Case Processing, for more information.

  3. When “open” cases are transmitted to the review site by Technical Services, they should be updated to Status Code 23, Sample Review and Review Type 33 on ERCS.

  4. Field and Specialty Exam reviewers will complete their review of the open unagreed case within 10 business days and return the case to Technical Services via ground mail service.

  5. Appeals policy requires a non-docketed case to have at least 365 days remaining on the statute as of the date the case is received in Appeals. If a non-docketed case is selected for sample review there needs to be an additional 30 days on the statute to allow for the completion of the review and for Technical Services to receive and prepare the case for closing to Appeals.

  6. Cases that do not meet these criteria will be deselected and returned to Technical Services.
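The statute requirement above reduces to a date computation: 365 days for Appeals plus 30 additional days for the review and closing preparation. The function name and use of a single measuring date are simplifying assumptions for this sketch:

```python
from datetime import date

# Appeals policy: 365 days must remain on the statute when the case
# reaches Appeals, plus 30 more days to complete the sample review and
# for Technical Services to prepare the case for closing to Appeals.
REQUIRED_DAYS = 365 + 30

def meets_statute_criteria(statute_date: date, measuring_date: date) -> bool:
    """Return True if enough time remains on the statute for the
    non-docketed sample-select case to be reviewed and still reach
    Appeals timely. (Illustrative only; not an official computation.)
    """
    return (statute_date - measuring_date).days >= REQUIRED_DAYS

# 469 days remain on the statute, so the case meets the criteria.
print(meets_statute_criteria(date(2025, 4, 15), date(2024, 1, 2)))  # True
```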

  7. Reviewers prepare an Appeals Advisory Memo to Technical Services when tax application/computation errors are found or any taxpayer confidentiality issues are discovered. Technical Services decides whether to forward the case to Appeals or return it to the group.

Defaulted Case Selection Procedures

  1. The ERCS Sample Review program may also select unagreed cases closing for issuance of statutory notice of deficiency as part of the random sample of cases for review.

  2. Technical Services will affix the sample selection sheet to these case files and update all returns with an account transfer out freeze code (“M” freeze code).

  3. If the case defaults, Technical Services will send the case to CCP. The “M” freeze code along with the Sample Selection Sheet will alert CCP that the case must be sent to the appropriate review site.

  4. CCP will update the case to Status Code 90, remove the “M” freeze code, and forward the case to the appropriate review site.

Shipping Sample Select Cases

  1. Cases selected for review should be transmitted to their respective review site.

  2. Field Exam cases are shipped to NQRS-Oakland.

  3. Specialty Exam cases are held in the Covington campus for screening. After screening, cases are shipped directly to reviewers.

  4. Closed case files should remain intact after they leave CCP and Technical Services. Dismantling, purging, or discarding documents from a case file could negatively affect the case if legal actions are pursued.

  5. A separate Form 3210, Document Transmittal, shall be attached to the closed case files. Each selected case shall include the full physical case file, a copy of the case history, back-up documents, and the original administrative file.

Sample Select Case Control Procedures

  1. Each review site will maintain an inventory control system. This will facilitate an orderly flow of case files and supporting documents between closing units, the review site, and the reviewer.

  2. Field and Specialty Exam reviewers shall initial Form 5344 when the case review is completed.

  3. All Field and Specialty Exam closed (status 90) physical case files along with Form 3210 are transported via ground shipment to the Covington Campus for final disposition.

BSA Case Selection Procedures

  1. BSA cases are not controlled on ERCS. Cases are selected using a pull rate selection process.

  2. Title 31 and Form 8300 cases are shipped from the Enterprise Computing Center – Detroit (ECC-DET) to the Covington Campus for screening.

  3. Form 8300 cases are pulled from the weekly extract of closed cases maintained by ECC.

  4. Title 31 cases selected for review are pulled from the Title 31 database. The Title 31 NQ interface is updated before shipment of the case to the Covington Campus for screening.

Field Exam Case Sampling Criteria

  1. The following Field Exam cases are included in the review sample:

    • SB/SE revenue agent and tax compliance officer income tax cases (corporations, partnerships, and individual returns)

    • Agreed, partially agreed, unagreed, no-change, and protested cases to Appeals

    • Secured delinquent returns not accepted as filed

    • Training cases

    • Form 1041, U.S. Income Tax Return for Estates & Trusts, Form 1042, Annual Withholding Tax Return for U.S. Source Income of Foreign Persons, and Form 1120-F, U.S. Income Tax Return of a Foreign Corporation, tax returns examined by revenue agents

    • Correspondence cases examined by revenue agents, tax auditors, and tax compliance officers

    • Pre-assessment innocent spouse cases

    • Claims

    • Audit reconsideration cases

    • Employment tax cases closed as related cases to an income tax case (the entire related case package is included)

  2. The following Field Exam cases are excluded from the national quality review sample:

    • Secured delinquent returns accepted as filed

    • Penalty cases not included as part of an examination case

    • Surveyed returns

    • Offers in Compromise cases

    • Post-assessment innocent spouse cases

    • Surveyed claim cases (Disposal Code 34)

    • No show/no response cases

    • Tax examiner cases (aging reason 99)

    • Protested cases with 395 days or less remaining on the statute

      Note:

      If the case selected for review is a protested case to Appeals, there must be at least 395 days remaining on the statute. Appeals policy requires a non-docketed case to have at least 365 days remaining on the statute as of the date the case is received in Appeals. The additional 30 days is required to allow for the completion of the review and for Technical Services to receive and prepare the case for closing to Appeals.

    • Petitioned cases

    • Cases updated to suspense status

    • Cases updated to group status after 90 Day Letter issued

    • Cases closed via Form 906, Closing Agreement

    • Specific project codes as determined by Headquarters

Specialty Exam Case Sampling Criteria

  1. The following Specialty Exam cases are included in the review sample:

    • Excise tax cases

    • Estate and Gift tax cases

    • Employment tax cases where there is no related income tax case

    • BSA Title 31 and Form 8300 cases

  2. The following Specialty Exam cases are excluded from the national quality review sample:

    • Secured delinquent returns accepted as filed

    • Penalty cases not included as part of an examination case

    • Surveyed returns

    • Offers in Compromise cases

    • Post-assessment innocent spouse cases

    • Surveyed claim cases (Disposal Code 34)

    • No show/no response cases

    • Tax examiner cases (aging reason 99)

    • Protested cases with 395 days or less remaining on the statute

      Note:

      If the case selected for review is a protested case to Appeals, there must be at least 395 days remaining on the statute. Appeals policy requires a non-docketed case to have at least 365 days remaining on the statute as of the date the case is received in Appeals. The additional 30 days is required to allow for the completion of the review and for Technical Services to receive and prepare the case for closing to Appeals.

    • Petitioned cases

    • Cases updated to suspense status

    • Cases updated to group status after 90 Day Letter issued

    • Cases closed via Form 906, Closing Agreement

    • Specific project codes as determined by Headquarters

    • Activity Code 421 returns - Gift Form 706GS(D), Generation-Skipping Transfer Tax Return for Distributions, and Form 706GS(T), Generation-Skipping Transfer Tax Return for Terminations

    • Cases worked by Estate and Gift support staff: Paraprofessional (position code 316) and Audit Accounting Aide (position code 301)

    • Estate and Gift returns assigned outside of the Estate and Gift area

    • Excise Form 2290, Heavy Highway Vehicle Use Tax Return cases

    • Excise returns assigned outside of the Excise area

    • Employment Tax Form 1040 Tip cases

Guidelines for Consistent Case Reviews

  1. The reviewer’s case evaluation must be accurate and consistent to provide reliable and meaningful results. Reviewers must strive to be consistent in rating similar case actions. Incomplete or inaccurate reviews may distort the quality review ratings.

  2. Reviewers should rate all attributes that apply to the case being reviewed. Actions that are not applicable to the case being reviewed must be rated as not applicable.

  3. Reviewers’ narratives should identify errors with enough specific detail to provide opportunities for improvement.

  4. Reviewers should select the most critical reason code for the error and expand on the reason in the narrative. Multiple reason codes may be selected for multiple errors if warranted.

Conducting Consistency Case Reviews

  1. The National Quality Review Manager should periodically perform consistency checks to ensure reviewers are uniformly applying the quality attributes and accurately inputting data.

  2. Results of the consistency reviews should be maintained and periodically evaluated. If inconsistencies in rating specific quality attributes persist, they should be reported to the Quality Analyst for discussion of potential issues raised.

  3. If the cause is determined to be a lack of clarity in the required review procedures or in how the attributes are rated, the Quality Analyst will work with impacted stakeholders to develop and/or clarify review criteria and procedures. No changes to review procedures should be made without concurrence of impacted stakeholders.

  4. The National Quality Review Manager may conduct consistency reviews in several ways:

    • Have each reviewer independently review the same case and discuss any inconsistencies in attribute rating

    • Critique completed case reviews on a regular basis and provide feedback to reinforce expectations of the review outcomes

    • Utilize NQRS reports, including reviewer narratives, to evaluate consistency, ensure guidelines are followed, and ensure the narratives are clearly and professionally written

    • Hold group meetings to discuss specific attributes and case scenarios

Use and Limitations of National Quality Review Data

  1. The fundamental purpose of the National Quality Review Program is to gather data and identify trends for management’s use regarding SB/SE Field and Specialty Exam case quality. The emphasis is on providing management with information and analysis to determine "root causes" of quality concerns.

  2. Quality review results are statistically valid and reliable measurements of the overall quality of casework completed by SB/SE Field Exam only at the Area level. Specialty Exam national quality results are statistically valid at the Program level. Results stratified to any lower organizational segment are not statistically reliable measurements of the quality of casework at those levels.

  3. Management should view any lower organizational segment stratifications as indicators and rely on them only to the extent that they are confirmed by other reliable management measures of quality. Area, Territory, and Headquarters management should use quality review findings to help identify problems that require organizational changes in such areas as procedures, organization, and training.

  4. The design and format of quality review reports within NQRS as well as access to the reports and data will be determined by the Field and Specialty Exam Program Manager.

  5. No attempt should be made to identify an examiner or otherwise associate specific review results to a particular case.

  6. Review data is used to assess program performance and will not be used to evaluate individual employee performance. Any feedback or other work products generated from NQRS will not be used as a substitute for EQRS case reviews, on the job visits, workload or any other reviews.

Case Return Criteria

  1. Field and Specialty Exam Quality reviewers will follow guidance found in the Technical Services IRM 4.8.2.9, Returning Cases to the Field, which outlines return criteria for cases with potential for significant impact to taxpayer compliance or to tax revenues.

Quality Attributes Rated by Field and Office Exam National Quality Reviewers

The table below lists the attributes rated by Field and Office Exam Quality Review, each attribute’s measurement category, and the NQRS attribute description.

Note:

Refer to the Field and Office Exam Job Aid for more details.


PLANNING

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 101 Pre-Plan Activity | Procedural | If the pre-plan activity was appropriate. |
| 111 LUQ After Pre-Plan (Other than Income) | Procedural | If large, unusual, and questionable items (other than income) on the primary return were properly considered. |
| 112 Required Filing Checks | Procedural | If Required Filing Checks were conducted through appropriate IDRS research or other means, such as inspections, inquiry, etc. |
| 114 IDR | Procedural | If appropriate information document requests were prepared. |


INCOME DETERMINATION

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 300 Exam Income Determination | Regulatory | If appropriate techniques to determine income were used. |

INVESTIGATIVE/AUDIT TECHNIQUES

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 405 Interpreted/Applied Tax Law Correctly | Regulatory | If the tax law was interpreted and applied correctly. |
| 407 Fraud Determination | Regulatory | If indications of fraud were properly pursued and developed. |
| 408 Civil Penalty Determination | Regulatory | If the correct determinations/computations for civil penalties were made. |
| 440 Audit/Compliance Interview | Procedural | If adequate interviews were conducted. |

TIMELINESS

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 509 Time Charged | Timeliness | If the time spent on the examination was commensurate with the complexity of the issues. |
| 510 Time Span | Timeliness | If the time span of the case was appropriate for the actions taken. |

CUSTOMER RELATIONS/PROFESSIONALISM

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 605 Clear/Professional Written Communication | Professionalism | If all correspondence and/or documentation was businesslike and professional in tone, appearance, and content. |
| 609 Confidentiality | Regulatory | If the confidentiality of the taxpayer and/or taxpayer information was protected. |
| 617 TP/POA Rights and Notification | Regulatory | If the taxpayer/representative was advised of all rights and was kept informed throughout the examination process. |
| 620 Solicit Payment | Procedural | If payment was solicited and/or an installment agreement was considered. |

DOCUMENTATION / REPORTS

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 707 Work Papers Support Conclusions | Procedural | If the work papers (including scope, depth, and techniques used) to support the conclusion of the case were properly prepared. |
| 719 Report Writing and Tax Computation | Regulatory | If the correct determination and computation of the proposed or actual assessment or abatement of tax was made using the applicable report writing procedures. |

Quality Attributes Rated by Excise Tax National Quality Reviewers

The table below lists the attributes rated by Excise Tax Quality Review, each attribute’s measurement category, and the NQRS attribute description.

Note:

Refer to the Excise Job Aid for more details.


PLANNING

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 100 Protection of Statute of Limitations | Procedural | If procedures to protect the statute of limitations were followed. |
| 101 Pre-Plan Activity | Procedural | If the pre-plan activity was appropriate. |
| 103 LUQ (Other than income) | Procedural | If large, unusual, and questionable items (other than income) on the primary return were properly considered. |
| 112 Required Filing Checks | Procedural | If Required Filing Checks were conducted through appropriate IDRS research or other means, such as inspections, inquiry, etc. |
| 114 IDR | Procedural | If appropriate information document requests were prepared. |

INVESTIGATIVE/AUDIT TECHNIQUES

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 405 Interpreted/Applied Tax Law Correctly | Regulatory | If the tax law was interpreted and applied correctly. |
| 407 Fraud Determination | Regulatory | If indications of fraud were properly pursued and developed. |
| 408 Civil Penalty Determination | Regulatory | If the correct determinations/computations for civil penalties were made. |
| 409 Appropriate Procedural Action | Procedural | If appropriate procedural action(s) not addressed in any other attribute were taken. It is important to rate the attribute both when the correct procedural action(s) were taken and when they were not. |
| 419 Inspection of Certificates | Procedural | If proper inspections of all exemption certificates were made to ensure the registrant is meeting the compliance requirements for maintaining their registration. |
| 420 Registration Tests | Accuracy | If there was a proper determination and documentation that the applicant meets the activity test, the acceptable risk test, and the adequate security test, and whether there was documentation to support the need for bonding. |
| 440 Audit/Compliance Interview | Procedural | If adequate interviews were conducted. |

TIMELINESS

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 509 Time Charged | Timeliness | If the time spent on the examination was commensurate with the complexity of the issues. |
| 510 Time Span | Timeliness | If the time span of the case was appropriate for the actions taken. |

CUSTOMER RELATIONS/PROFESSIONALISM

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 605 Clear/Professional Written Communication | Professionalism | If all correspondence and/or documentation was businesslike and professional in tone, appearance, and content. |
| 609 Confidentiality | Regulatory | If the confidentiality of the taxpayer and/or taxpayer information was protected. |
| 613 Managerial Involvement | Procedural | If the level of managerial involvement was appropriate. |
| 617 TP/POA Rights and Notification | Regulatory | If the taxpayer/representative was advised of all rights and kept informed throughout the examination process. |
| 620 Solicit Payment | Procedural | If payment was solicited and/or an installment agreement was considered. |

DOCUMENTATION / REPORTS

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 702 Employee Case/History Documentation | Procedural | If the required case history/case documentation was completed per IRM guidelines, including accurate, clear, and concise preparation of internal documents. |
| 705 Case Processing Documents | Procedural | If case processing documents were accurately completed when appropriate. |
| 707 Work Papers Support Conclusions | Procedural | If work papers (including scope, depth, and techniques used) to support the conclusion of the case were appropriately prepared. |
| 719 Report Writing and Tax Computation | Regulatory | If the correct determination and computation of the proposed or actual assessment or abatement of tax was made using the applicable report writing procedures. |

Quality Attributes Rated by Employment Tax National Quality Reviewers

The table below lists the attributes rated by Employment Tax Quality Review, each attribute’s measurement category, and the NQRS attribute description.

Note:

Refer to the Employment Tax Job Aid for more details.


PLANNING

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 103 LUQ (Other than income) | Procedural | If proper consideration of large, unusual, and questionable items (other than income) on the primary return was made. |
| 112 Required Filing Checks | Procedural | If Required Filing Checks were conducted through appropriate IDRS research or other means, such as inspections, inquiry, etc. |
| 114 IDR | Procedural | If appropriate information document requests were prepared. |
| 115 Pre-Contact Analysis | Procedural | If a pre-contact analysis of the taxpayer was conducted and documented. |
| 117 Risk Analysis/Audit Plan (Large Case Only) | Regulatory | If the risk analysis and/or audit plan for the exam was prepared. |

INVESTIGATIVE/AUDIT TECHNIQUES

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 405 Interpreted/Applied Tax Law Correctly | Regulatory | If the tax law was interpreted and applied correctly. |
| 407 Fraud Determination | Regulatory | If indications of fraud were properly pursued and developed. |
| 408 Civil Penalty Determination | Regulatory | If the correct determinations/computations for civil penalties were made. |
| 440 Audit/Compliance Interview | Procedural | If adequate interviews were conducted. |

TIMELINESS

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 509 Time Charged | Timeliness | If the time spent on the examination was commensurate with the complexity of the issues. |
| 510 Time Span | Timeliness | If the time span of the case was appropriate for the actions taken. |

CUSTOMER RELATIONS/PROFESSIONALISM

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 605 Clear/Professional Written Communication | Professionalism | If all correspondence/documentation was businesslike and professional in tone, appearance, and content. |
| 609 Confidentiality | Regulatory | If the confidentiality of the taxpayer and/or taxpayer information was protected. |
| 617 TP/POA Rights and Notification | Regulatory | If the taxpayer/representative was advised of all rights and kept informed throughout the examination process. |
| 620 Solicit Payment | Procedural | If payment was solicited and/or an installment agreement was considered. |

DOCUMENTATION / REPORTS

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 702 Employee Case/History Documentation | Procedural | If the required case history/case documentation was completed per IRM guidelines, including accurate, clear, and concise preparation of internal documents. |
| 707 Work Papers Support Conclusions | Procedural | If the work papers (including scope, depth, and techniques used) to support the conclusion of the case were appropriately prepared. |
| 719 Report Writing and Tax Computation | Regulatory | If the correct determination and computation of the proposed or actual assessment or abatement of tax was made using the applicable report writing procedures. |

Quality Attributes Rated by Estate and Gift Tax National Quality Reviewers

The table below lists the attributes rated by Estate and Gift Tax Quality Review, each attribute’s measurement category, and the NQRS attribute description.

Note:

Refer to the Estate and Gift Tax Job Aid for more details.


PLANNING

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 100 Protection of Statute of Limitations | Procedural | If procedures to protect the statute of limitations were followed. |
| 101 Pre-Plan Activity | Procedural | If the pre-plan activity was appropriate. |
| 102 IDR | Procedural | If the Information Document Requests were appropriately prepared. |
| 103 LUQ (Other than income) | Procedural | If large, unusual, and questionable items (other than income) on the primary return were properly considered. |
| 104 Prior and Subsequent Year and Related Returns | Procedural | If the prior/subsequent and related returns were included in the examination when warranted. |

INVESTIGATIVE/AUDIT TECHNIQUES

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 405 Interpreted/Applied Tax Law Correctly | Regulatory | If the tax law was interpreted and applied correctly. |
| 407 Fraud Determination | Regulatory | If indications of fraud were properly pursued and developed. |
| 408 Civil Penalty Determination | Regulatory | If the correct determinations/computations for civil penalties were made. |
| 409 Appropriate Procedural Action | Procedural | If appropriate procedural action(s) not addressed in any other attribute were taken. It is important to rate the attribute both when the correct procedural action(s) were taken and when they were not. |

TIMELINESS

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 500 Time Spent on Exam/Compliance Review | Timeliness | If the time spent on the examination was commensurate with the complexity of the issues. |
| 501 Efficient Resolution and IRM Timeframes Met | Timeliness | If IRM timeframes were met and the case actions taken were done in the most efficient manner that did not result in any unnecessary delay to resolve the case. |

CUSTOMER RELATIONS/PROFESSIONALISM

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 605 Clear/Professional Written Communication | Professionalism | If all correspondence and/or documentation was businesslike and professional in tone, appearance, and content. |
| 609 Confidentiality | Regulatory | If the confidentiality of the taxpayer and/or taxpayer information was protected. |
| 612 Solicit Payment | Procedural | If payment was solicited and/or an installment agreement was considered. |
| 617 TP/POA Rights and Notification | Regulatory | If the taxpayer/representative was advised of all rights and kept informed throughout the examination process. |

DOCUMENTATION / REPORTS

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 702 Employee Case/History Documentation | Procedural | If the required case history/case documentation was completed per IRM guidelines, including accurate, clear, and concise preparation of internal documents. |
| 705 Case Processing Documents | Procedural | If case processing documents were accurately completed when appropriate. |
| 707 Work Papers Support Conclusions | Procedural | If the work papers (including scope, depth, and techniques used) to support the conclusion of the case were appropriately prepared. |
| 719 Report Writing and Tax Computation | Regulatory | If the correct determination and computation of the proposed or actual assessment or abatement of tax was made using the applicable report writing procedures. |

Quality Attributes Rated by Bank Secrecy Act National Quality Reviewers

The table below lists the attributes rated by Bank Secrecy Act Quality Review, each attribute’s measurement category, and the NQRS attribute description.

Note:

Refer to the Bank Secrecy Act Job Aid for more details.


PLANNING

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 100 Protection of Statute of Limitations | Procedural | If procedures to protect the statute of limitations were followed. |
| 101 Pre-Plan Activity | Procedural | If the pre-plan activity was appropriate. |
| 102 IDR | Procedural | If the Information Document Requests were appropriately prepared. |
| 103 LUQ (Other than income) | Procedural | If large, unusual, and questionable items (other than income) on the primary return were properly considered. |
| 104 Prior and Subsequent Year and Related Returns | Procedural | If the prior/subsequent and related returns were included in the examination when warranted. |
| 108 Verify Full Compliance | Procedural | If full compliance was verified through appropriate research or other means, such as inspection, inquiry, etc., as appropriate. |

INVESTIGATIVE/AUDIT TECHNIQUES

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 400 Audit/Compliance Interview | Procedural | If adequate interviews were conducted. |
| 401 Field Visitation | Procedural | If work was conducted at the appropriate location. |
| 402 Tax Law Knowledge | Procedural | If a general working knowledge of the tax law was exhibited. |
| 404 Obtain/Determine Tax Law Facts | Procedural | If the appropriate facts on tax law issues were obtained/determined. |
| 405 Interpreted/Applied Tax Law Correctly | Regulatory | If the tax law was interpreted and applied correctly. |
| 407 Fraud Determination | Regulatory | If indications of fraud were properly pursued and developed. |
| 408 Civil Penalty Determination | Regulatory | If the correct determinations/computations for civil penalties were made. |
| 409 Appropriate Procedural Action | Procedural | If appropriate procedural action(s) not addressed in any other attribute were taken. It is important to rate the attribute both when the correct procedural action(s) were taken and when they were not. |

TIMELINESS

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 500 Time Spent on Exam/Compliance Review | Timeliness | If the time spent on the examination was commensurate with the complexity of the issues. |
| 501 Efficient Resolution and IRM Timeframes Met | Timeliness | If IRM timeframes were met and the case actions taken were done in the most efficient manner that did not result in any unnecessary delay to resolve the case. |

CUSTOMER RELATIONS/PROFESSIONALISM

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 605 Clear/Professional Written Communication | Professionalism | If all correspondence and/or documentation was businesslike and professional in tone, appearance, and content. |
| 609 Confidentiality | Regulatory | If the confidentiality of the taxpayer and/or taxpayer information was protected. |
| 612 Solicit Payment | Procedural | If payment was solicited and/or an installment agreement was considered. |
| 613 Managerial Involvement | Procedural | If the level of managerial involvement was appropriate. |
| 617 TP/POA Rights and Notification | Regulatory | If the taxpayer/representative was advised of all rights and kept informed throughout the examination process. |

DOCUMENTATION / REPORTS

| Attribute | Measurement Category | Attribute Rates |
| --- | --- | --- |
| 702 Employee Case/History Documentation | Procedural | If the required case history/case documentation was completed per IRM guidelines, including accurate, clear, and concise preparation of internal documents. |
| 707 Work Papers Support Conclusions | Procedural | If the work papers (including scope, depth, and techniques used) to support the conclusion of the case were appropriately prepared. |
| 708 Report Writing | Procedural | If the applicable report writing procedures were followed. |
| 709 Case File Folder | Procedural | If the case file was properly prepared and assembled. |
| 800 Customer Impact | Customer | If the appropriate action was taken to arrive at a correct and complete tax/case resolution with no material adverse impact on the customer. |

NQRS Time Frames for Case Action

Activity - Type of exam action or activity measured.

Days - Maximum number of calendar days permitted for the exam action or activity.

Note:

Certain business units have their own timeliness measures.

Measured From - Start of the exam action or activity.

Measured To - End of the exam action or activity.

IRM Reference - IRM reference for the specific exam action or activity.

The timeframes in the table below (measured in calendar days unless noted) indicate when action should be taken to meet timeliness Attribute 501, Efficient Resolution and IRM Time Frames Met; Attribute 510, Time Span; and Attribute 617, TP/POA Rights and Notification.

| Activity | Program | Days | Measured From | Measured To |
| --- | --- | --- | --- | --- |
| Start Examination | Field Exam, Excise, Employment, BSA | 45 | First action | First appointment |
| Contact | Estate and Gift (IRM 4.25.1.5.2) | 45 | Examiner’s receipt of case | Date examiner sends an initial contact letter to the taxpayer with a copy to the representative, or surveys the assigned case |
| Significant Activity | Field and Specialty Exam | 45 | Last significant action | Next significant activity |
| Response to call | Field and Specialty Exam | 1 business day | Taxpayer or representative telephone call | Return telephone call to the taxpayer or representative |
| Response to correspondence | Field and Specialty Exam | 14 | Receipt of correspondence or documentation from taxpayer or representative | Follow-up response provided to taxpayer or representative |
| POA Processing | Specialty Exam | 5 | Receipt of Form 2848 | Submission to CAF Unit for processing (IRM 4.11.55.1.8.2(4)) |
| Agreed/No Change Case Closing | Field Exam, BSA | 10 | Date the report is received or the date the no-change status is communicated to the taxpayer | Date the case is closed from the group |
| Agreed/No Change Examiner Case Closing | Excise (IRM 4.24.21.3.1) | 10 | Date the report is received or the date the no-change status is communicated to the taxpayer | Date the case is mailed to the manager or closed off IMS to the manager |
| Agreed/No Change Manager Case Closing | Excise (IRM 4.24.21.3.1) | 10 | Date the case is received by the manager or closed off IMS | Date the case is updated to Status 51 and closed from the group |
| Agreed/No Change Case Closing | Estate and Gift (IRM 4.25.1.5.2) | 30 | Date the report is received or the date the no-change status is communicated to the taxpayer | Date the case is closed from the group |
| Agreed/No Change Case Closing | Employment (IRM 4.23.10.4(6)) | 20 | Date the agreed report is received or the no-change report is issued | Date the case is updated to Status 51 and shipped to CCP |
| Unagreed Case Closing | Field and Specialty Exam | 20 | Date the 30-Day Letter defaults or the date the request for appeals conference is received | Date the case is closed from the group |
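The timeliness check behind these timeframes is a straightforward elapsed-days comparison between the "Measured From" and "Measured To" dates. A minimal sketch, where the dictionary is a hypothetical encoding of a few rows from the table above, not an NQRS data structure:

```python
from datetime import date

# Illustrative subset of the timeframe table:
# (activity, program) -> maximum calendar days permitted.
TIMEFRAMES = {
    ("Start Examination", "Field Exam"): 45,
    ("Significant Activity", "Field and Specialty Exam"): 45,
    ("Response to correspondence", "Field and Specialty Exam"): 14,
    ("Unagreed Case Closing", "Field and Specialty Exam"): 20,
}

def within_timeframe(activity: str, program: str,
                     measured_from: date, measured_to: date) -> bool:
    """Return True if the elapsed calendar days between the two measurement
    points do not exceed the maximum permitted for the activity."""
    limit = TIMEFRAMES[(activity, program)]
    return (measured_to - measured_from).days <= limit
```

For example, a correspondence response provided 13 calendar days after receipt meets the 14-day timeframe, while an unagreed closing 29 days after the 30-Day Letter defaults exceeds the 20-day timeframe.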