4.8.3  Examination Quality Measurement Staff (EQMS)

Manual Transmittal

March 21, 2013

Purpose

(1) This transmits revised IRM 4.8.3, Technical Services, Examination Quality Measurement Staff (EQMS).

Material Changes

(1) Minor editorial changes have been made throughout this IRM. Websites, legal references, and IRM references were reviewed and updated as necessary. Other significant changes to this IRM include the following:

  1. IRM 4.8.3.2 (2) - Added last sentence for clarity.

  2. IRM 4.8.3.2 (3) - Added other EQMS duties.

  3. IRM 4.8.3.5 (4) - Changed the number of attributes from eighteen to seventeen.

  4. Exhibit 4.8.3-1, Quality Attributes, Planning - Changed Attribute 102 to Attribute 114.

  5. Exhibit 4.8.3-1, Quality Attributes, Investigative Techniques - Changed Attribute 400 to Attribute 440.

  6. Exhibit 4.8.3-1, Quality Attributes, Customer Relations/Professionalism - Changed Attribute 612 to Attribute 620.

  7. Exhibit 4.8.3-1, Quality Attributes - Deleted Attribute 702 Employee Case/History Documentation. Aspects measured by this attribute were merged into Attribute 707 Work Papers Support Conclusions. Clarified definitions and renumbered certain attributes.

  8. Exhibit 4.8.3-3, Examination Quality Measurement Staff (EQMS) Procedures for Sample Review Case Selection: Technical Services Responsibilities - Changed the overnight mail requirement to ground service and increased the number of days for the review from 5 days to 10 days.

Effect on Other Documents

This material supersedes IRM 4.8.3 dated June 10, 2011.

Audience

Small Business/Self-Employed (SB/SE) Exam Employees.

Effective Date

(03-21-2013)

Rodney M. Kobayashi
Director, Technical Services SE:S:E:TS
Small Business/Self-Employed

4.8.3.1  (03-21-2013)
Introduction

  1. The purpose of this chapter is to provide an overview of the Examination Quality Measurement Staff (EQMS).

  2. EQMS is used by Small Business/Self-Employed (SB/SE) to collect information, measure examination quality, and assess the long-term trends of system performance in keeping with the balanced measures.

  3. EQMS reviews a statistically valid sample of closed SB/SE cases against the quality attributes.

  4. A quality audit is defined as one that was conducted to determine the substantially correct tax at the least cost and in a manner that promotes public confidence in the Service.

  5. Quality data will be used by management to assess program performance and is an indicator of SB/SE's ability to apply the tax law with integrity and fairness to all.

  6. Quality data will be used to identify system changes, training and educational needs, and to improve work processes.

4.8.3.1.1  (03-21-2013)
Purpose

  1. EQMS is used to support SB/SE's quality improvement program. EQMS is a quality-driven program and will be included as a component of the business results. Its prominence as part of the "business results" measurement will require management to focus on systemic processes and results and look for improvement opportunities as a daily business practice.

  2. Quality data is used to establish baselines and provide an understanding of how the current system is operating, to analyze causes for failures and assess the feasibility of possible solutions, and to measure the success of quality improvement efforts.

  3. Case reviews are conducted to collect information from individual audits. The data collected from the reviews are input into the National Quality Review System (NQRS), a web-based system that provides the basis for analysis at the territory, area, and national levels.

4.8.3.2  (03-21-2013)
Technical Services/EQMS Responsibilities

  1. At the direction of the EQMS Program Manager, EQMS responsibilities include:

    1. Establishing and providing guidance for program objectives.

    2. Ensuring consistent application of the quality attributes.

    3. Recommending updates to the quality attributes and process measures.

    4. Maintaining instructional guides.

  2. EQMS activities also include providing management with accurate information regarding case quality and the examination process in SB/SE. In addition, EQMS and the Examination Areas work in collaboration to identify quality improvement opportunities, provide support for improvement actions, and share best practices.

  3. Other EQMS duties include:

    1. Coordinating and disseminating information on trends and issues identified at the area level that may have nationwide impact.

    2. Preparing and distributing the national newsletters (i.e., Technical Digest, Keys To Success, Fraud Digest).

4.8.3.3  (03-21-2013)
EQMS Review Site Responsibilities

  1. EQMS conducts reviews of completed SB/SE examination cases and provides data to the area offices.

  2. To accomplish these objectives, the responsibilities of EQMS include:

    1. Timely completing EQMS case reviews.

    2. Accurately and consistently applying the quality attributes.

    3. Ensuring accuracy of NQRS database input.

    4. Providing data analysis to the area offices.

    5. Conducting presentations on quality results.

4.8.3.4  (03-21-2013)
Examination Area Office Responsibilities

  1. Area office responsibilities include:

    1. Identifying opportunities for improvement.

    2. Conducting improvement initiatives to improve the quality results.

  2. To accomplish area office responsibilities, EQMS should ensure:

    1. Proper sampling of completed SB/SE examination cases.

    2. Proper preparation of reports and analysis.

    3. Coordination with the area offices to ensure program objectives are met.

    4. Participation in area quality improvement initiatives.

4.8.3.5  (03-21-2013)
Quality Attributes

  1. The quality attributes are concise statements of SB/SE's expectations for quality examinations and are guidelines to assist examiners in fulfilling their professional responsibilities. See Exhibit 4.8.3-1, Quality Attributes. The quality attributes provide objective criteria against which case quality is evaluated.

  2. The quality attributes complement the Service’s modernization efforts.

  3. Each quality attribute is defined by elements representing components that are present in a quality examination.

  4. Seventeen quality attributes are used to define quality. They are categorized into the following attribute groups:

    1. Planning

    2. Income Determination

    3. Investigative/Audit Techniques

    4. Timeliness

    5. Customer Relations/Professionalism

    6. Documentation/Reports

4.8.3.5.1  (03-21-2013)
Measurement Categories

  1. The attributes are also grouped into four organizational measurement categories:

    • Timeliness

    • Professionalism

    • Regulatory Accuracy

    • Procedural Accuracy

  2. The quality of case work can be described in several different ways. The measurement categories allow quality scores to be generated according to priorities defined by input from customer, employee, and business stakeholders.

4.8.3.5.2  (03-21-2013)
Scoring System

  1. The scoring system weights all attributes equally. Each attribute is rated "yes," "no," or in some instances "not applicable."

  2. The quality score is computed as the percentage of total "yes" ratings divided by the total of "yes" and "no" ratings. A maximum score of 100 is possible for each case.

  3. One overall quality score for both field exam and office exam is reported as the Business Results (Quality) measure for SB/SE.

  4. For analysis purposes, quality scores can be derived by individual attributes; attribute groups; and measurement categories at the territory, area, and national levels. See Exhibit 4.8.3-1 for the individual attributes that are included in the various attribute groups and measurement categories.
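The computation described in (2) above can be sketched as follows. This is an illustrative example only, not an official EQMS tool; the attribute names are hypothetical.

```python
def quality_score(ratings):
    """Illustrative EQMS-style quality score: the percentage of "yes"
    ratings out of all "yes" and "no" ratings. "Not applicable"
    ratings are excluded from both numerator and denominator."""
    applicable = [r for r in ratings.values() if r in ("yes", "no")]
    if not applicable:
        return None  # no applicable attributes were rated
    yes = sum(1 for r in applicable if r == "yes")
    return round(100 * yes / len(applicable), 1)

# Hypothetical case: 15 of 16 applicable attributes passed
ratings = {f"attr_{i}": "yes" for i in range(16)}
ratings["attr_0"] = "no"
ratings["attr_na"] = "not applicable"
print(quality_score(ratings))  # 93.8
```

Because "not applicable" ratings drop out of the denominator, a case with many inapplicable attributes can still reach the maximum score of 100.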

4.8.3.6  (03-21-2013)
Process Measures

  1. Process measures are used to describe how the examination was performed and the efficiency of the examination process. These measures are not evaluative on a pass/fail system like the quality attributes. They provide a "snapshot in time." The purpose of process measures is to provide information on specific tasks performed during examinations which should be analyzed in conjunction with the quality attributes.

  2. Data regarding specific tasks performed, how the tasks were completed, key dates, delays in activities, hours associated with the case, etc. are collected.

4.8.3.7  (03-21-2013)
Case Selection

  1. The Examination Returns Control System (ERCS) Sample Review program automates the process of selecting a valid sample of closures for EQMS review. The sample size is statistically valid at the area level.

    Note:

    The sample plan is developed annually and is based on prior year closures.

  2. Each area has a sample rate based on closures that is used to select field and office audit cases.
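Rate-based selection of this kind can be sketched as an independent random draw per closure. This is an illustrative sketch only; the actual ERCS Sample Review selection algorithm is not described in this section, and the function name and rate are hypothetical.

```python
import random

def select_for_sample(closures, rate, seed=None):
    """Illustrative rate-based sample selection: each closure is
    independently selected with probability equal to the area's
    sample rate. Hypothetical; actual ERCS logic may differ."""
    rng = random.Random(seed)
    return [case for case in closures if rng.random() < rate]

# Hypothetical area with 1,000 closures and a 5% sample rate
closures = [f"case-{i:04d}" for i in range(1000)]
sample = select_for_sample(closures, rate=0.05, seed=42)
print(len(sample))  # roughly 50 of 1,000 closures
```

A per-closure draw at a fixed rate is one way a sample can remain statistically valid at the area level, since each closure has the same known selection probability.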

4.8.3.7.1  (03-21-2013)
Case Selection Procedures and Requirements

  1. Cases meeting the sample criteria are selected by the ERCS Sample Review program at the proper rate for the area. Cases are subject to the sample at the point they move from Status Code 51 or 21 on ERCS. When a case is selected for the sample, the ERCS user is notified to print the Sample Selection Sheet to place on the file.

  2. Centralized Case Processing (CCP) is responsible for updating sample-selected cases to Status Code 90 and forwarding them to the appropriate EQMS site for review using routine procedures.

  3. Technical Services is responsible for sending protested cases that are selected for sample review to the appropriate EQMS site. Protested cases are "open" and time sensitive. Expedite procedures have been established to ensure timely review of these cases. See Exhibit 4.8.3-3.

  4. Technical Services is responsible for sending unagreed cases with at least one agreed/no-change year that are selected for sample review to the appropriate EQMS site. These cases are also "open" and time sensitive. Expedite procedures have been established to ensure timely review of these cases. See Exhibit 4.8.3-3.

  5. If the statute for any "open" year is 210 days or less, Technical Services will contact the EQMS site to exclude the case.

  6. When "open" cases are transmitted to the EQMS site by Technical Services, they should be updated to Status Code 23, Sample Review, and Review Type 33 on ERCS.

  7. See Exhibit 4.8.3-2, Examination Quality Measurement Staff (EQMS) Procedures for Sample Review Case Selection: Centralized Case Processing Responsibilities, and Exhibit 4.8.3-3, Examination Quality Measurement Staff (EQMS) Procedures for Sample Review Case Selection: Technical Services Responsibilities, for additional case selection procedures.
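The statute exclusion rule in (5) above is a simple date computation, sketched below. The dates and the 210-day threshold come from the text; the helper name is hypothetical and this is not an official tool.

```python
from datetime import date

def exclude_for_statute(statute_date, today, threshold_days=210):
    """Illustrative check of the exclusion rule: if 210 days or
    less remain before the statute expiration date for any "open"
    year, the case is excluded from EQMS review."""
    return (statute_date - today).days <= threshold_days

# Hypothetical statute dates checked as of 03-21-2013
print(exclude_for_statute(date(2013, 9, 15), today=date(2013, 3, 21)))  # True (178 days remain)
print(exclude_for_statute(date(2014, 4, 15), today=date(2013, 3, 21)))  # False (390 days remain)
```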

4.8.3.8  (03-21-2013)
Cases Subject To EQMS Sampling

  1. The following cases are included in the EQMS sample:

    1. SB/SE field examination and office audit income tax cases (corporations, partnerships, and individual returns)

    2. Agreed, partially agreed, unagreed, no-change, and protested cases to Appeals

    3. Secured delinquent returns not accepted as filed

    4. Training cases

    5. Form 1041, U.S. Income Tax Return for Estates & Trusts, Form 1042, Annual Withholding Tax Return for U.S. Source Income of Foreign Persons, and Form 1120-F, U.S. Income Tax Return of a Foreign Corporation, income tax returns examined by revenue agents

    6. Correspondence cases examined by revenue agents, tax auditors, and tax compliance officers

    7. Pre-assessment innocent spouse cases

    8. Claims (excludes surveyed claims-Disposal Code 34)

    9. Audit reconsideration cases

    10. Employment tax cases are included if they are closed as related cases to an income tax case (the entire related case package is included)

  2. Cases excluded from EQMS sample include:

    1. Secured delinquent returns accepted as filed

    2. Pure penalty cases not included as part of an examination case

    3. Surveyed returns

    4. Offers in compromise cases

    5. Post-assessment innocent spouse cases

    6. Excise cases

    7. Estate and gift tax cases

    8. Employment tax cases are excluded only when there is no related income tax case

    9. Surveyed claim cases (Disposal Code 34)

    10. No show/no response cases

    11. Tax examiner cases

    12. Open cases with 210 days or less remaining on the statute

    13. Petitioned cases

    14. Cases updated to suspense status

    15. Cases updated to group status after 90 Day Letter issued

    16. Cases closed via Form 906, Closing Agreement

    17. Specific project codes as determined by Headquarters

4.8.3.9  (03-21-2013)
Case Review Procedures

  1. EQMS reviewers are responsible for evaluating examination case quality and collecting process measures data. It is important that reviewers have an in-depth understanding of the quality attributes, process measures and definitions.

4.8.3.9.1  (03-21-2013)
Field and Office Examination Job Aid

  1. Document 12354, Field Compliance Embedded Quality, Field and Office Examination Job Aid, includes complete instructions, definitions, and examples of how cases should be evaluated. Refer to the Job Aid on the Embedded Quality web site: http://mysbse.web.irs.gov/mgrsact/eq/examination/default.aspx

4.8.3.9.2  (03-21-2013)
Completion Of Data Collection Instrument (DCI)

  1. The DCI is used as a guide for the review process to capture statistical data to be incorporated into the NQRS database.

  2. The DCI provides the principal documentation for the reviewer’s case evaluation and conclusions.

  3. One DCI will be completed for each case selected for EQMS review.

  4. The DCI includes the following sections:

    1. Administrative information: data regarding the EQMS review process, case characteristics, and closing information

    2. Evaluation of the quality attributes: each attribute is rated "yes," "no," or "not applicable"

    3. Collection of process measures: identification of specific dates and completion of specific procedures relating to the examination phases

  5. The reviewers will prepare narratives for each attribute rated "no." Narratives should be clear, concise, and specific as to why the attribute was not passed.

  6. Upon completion of the EQMS review and the input of the data into the NQRS system, the original paper DCI is to be maintained in the EQMS review site until validation processing has occurred.

4.8.3.10  (03-21-2013)
Consistency Checks

  1. The reviewer’s case evaluation and data collection must be accurate and consistent to provide reliable and meaningful results. Management should periodically perform consistency checks to ensure EQMS reviewers are uniformly applying the quality attributes and accurately inputting data.

4.8.3.10.1  (03-21-2013)
Types of Checks

  1. Consistency checks are performed in several ways:

    1. Reviewers independently evaluate the same case. A group discussion is held to compare the consistency of the ratings for each attribute and the process measures data.

    2. EQMS managers should:

      • Critique completed EQMS reviews on a regular basis and provide meaningful feedback to reinforce expectations for EQMS case reviews.

      • Review narratives on a regular basis to ensure guidelines are followed and the end product is professional.

      • Use NQRS reports on a regular basis to evaluate and monitor consistency within the EQMS site and on a national basis.

      • Conduct group meetings to discuss specific attributes and case scenarios.

4.8.3.11  (03-21-2013)
EQMS Reviewer Responsibilities

  1. The DCI is the originating document for input into the NQRS database. Care must be exercised to ensure that entries are accurate and records are not duplicated.

  2. Reviewers are accountable for following guidelines and instructions for accurately applying the attributes, inputting accurate header information, selecting correct reason codes, and writing quality narratives.

  3. Reviewers will review one case to completion before starting another case review.

  4. Reviewing a case to completion includes:

    1. Review the case

    2. Input the data and narratives

    3. Print the DCI and thoroughly review for data accuracy and narrative quality

    4. Edit the DCI as necessary

    5. Move on to the next case review

4.8.3.12  (03-21-2013)
National Quality Review System (NQRS) Reports

  1. The NQRS web-based system includes standard reports designed to assist in the analysis of the data collected through case reviews.

  2. Ad hoc reports are also available in NQRS to allow a variety of queries to be made in order to analyze the data.

  3. Separate reports are available for field examination and office audit. They are distinct programs. Combining the data is not meaningful.

  4. Reports are available at the national, area, and territory levels. The data are statistically valid to the area level on an annual basis.

4.8.3.13  (03-21-2013)
Management Use of National Quality Review System (NQRS) Data

  1. NQRS data will be used to assess program performance and will not be used to evaluate individual employee performance. Any feedback or other work products generated from NQRS will not be used as a substitute for case reviews, case file reviews, on-the-job visits, and workload reviews.

4.8.3.14  (03-21-2013)
Feedback to Examiners

  1. The fundamental purpose of EQMS is to gather data and identify trends for management’s use regarding SB/SE examination case quality. The emphasis is on providing management with information and analysis to determine "root causes" of quality concerns.

  2. The case return criteria outline the conditions under which cases will be returned to a group for additional work. See the case return criteria at IRM 4.8.2.8, Returning Cases to the Field.

Exhibit 4.8.3-1 
Quality Attributes


PLANNING

ATTRIBUTE DEFINITION
101 Pre-Plan Activity (Procedural Accuracy) This attribute measures if the pre-plan activity is appropriate.
111 LUQ After Pre-Plan (Other than Income) (Procedural Accuracy) This attribute measures if large, unusual, and questionable (LUQ) items (other than income) were properly considered on the key return that became evident throughout the course of the examination after the pre-plan phase. This attribute includes expanding and contracting the scope when warranted after the pre-plan.
112 Required Filing Checks (Procedural Accuracy) This attribute measures if the Required Filing Checks were conducted through appropriate IDRS research or other means, such as inspections, inquiry, etc.
114 IDR (Procedural Accuracy) This attribute measures if appropriate Information Document Requests (IDR) were prepared. This includes the initial IDR and subsequent IDR(s) (if applicable).


INCOME DETERMINATION

ATTRIBUTE DEFINITION
300 Exam Income Determination (Regulatory Accuracy) This attribute measures if appropriate techniques were used to determine income and the tax law applicable to income issues was properly considered.

INVESTIGATIVE/AUDIT TECHNIQUES

ATTRIBUTE DEFINITION
405 Interpreted/Applied Tax Law Correctly (Regulatory Accuracy) This attribute measures if the tax law was interpreted and applied correctly which includes all aspects of issue development and taxpayer communication.
407 Fraud Determination (Regulatory Accuracy) This attribute measures if indications of fraud were recognized, pursued and developed, and if the penalty was correctly computed.
408 Civil Penalty Determination (Regulatory Accuracy) This attribute measures if civil penalties were properly considered, correctly computed, and if assertion or non-assertion was adequately documented.
440 Audit/Compliance Interview (Procedural Accuracy) This attribute measures if in-depth planned interviews were conducted throughout the examination. Sufficient questions should be asked to give a clear understanding of the taxpayer as well as the business operations and financial practices of the taxpayer. Evaluation of this attribute should include consideration of all interviews held with taxpayers, representatives, and third parties.

TIMELINESS

ATTRIBUTE DEFINITION
509 Time Charged (Timeliness) This attribute measures if the time spent on the examination is commensurate with the complexity of the issues.
510 Time Span (Timeliness) This attribute measures if the time span of the case is appropriate for the actions taken. Case actions should be completed in the most efficient manner and not result in unnecessary delays during the examination process.
National Standard Timeframes
Below are the recommended national standard timeframes within which action should be taken:
  • 45 days to start the examination: first action to first appointment.

  • 45 days between significant activities.

  • 1 day (next business day) to return telephone calls to the taxpayer/representative.

  • 14 days to follow-up to correspondence from the taxpayer/representative.

  • 10 days for case closures for agreed or no-change examinations - from the date the report is received or the date the no-change status is communicated to the taxpayer to the date the case is closed from the group.

    Note:

    The no-change status should be communicated to the taxpayer as soon as it is determined.

  • 20 days for case closures for unagreed examinations - from the date the 30-Day Letter defaults or the date the request for an Appeals conference is received to the date the case is closed from the group.

    Note:

    Days refers to calendar days. Reasonable delays will be taken into consideration. Expedite processing requirements per the IRM will take precedence over the above time frames.

CUSTOMER RELATIONS/PROFESSIONALISM

ATTRIBUTE DEFINITION
605 Clear/Professional Written Communication (Professionalism) This attribute measures if all correspondence/documentation is businesslike and professional in tone, appearance and content.
609 Confidentiality (Regulatory Accuracy) This attribute measures if the confidentiality of taxpayer returns and/or return information was protected.
617 TP/POA Rights and Notification (Regulatory Accuracy) This attribute measures if the taxpayer/representative was advised of all rights and kept informed throughout the examination process.
620 Solicit Payment (Procedural Accuracy) This attribute measures if payment was solicited and/or installment agreement considered.

DOCUMENTATION / REPORTS

ATTRIBUTE DEFINITION
707 Work Papers Support Conclusions (Procedural Accuracy) This attribute measures if the activity record was used to document examination activities and time charges throughout the audit. It also measures if work papers (including scope, depth and techniques used) were appropriately prepared to support the conclusion of the case.
719 Report Writing and Tax Computation (Regulatory Accuracy) This attribute measures if the proposed or actual assessment or abatement of tax was correctly determined/computed using the applicable report writing procedures.

Exhibit 4.8.3-2 
Examination Quality Measurement Staff (EQMS) Procedures for Sample Review Case Selection: Centralized Case Processing Responsibilities

Centralized Case Processing will process cases for the EQMS sample. The ERCS Sample Review selection program will identify cases selected for EQMS review.
ERCS Sample Review Procedures

  1. Cases are subject to sample selection when Status Code 51 is updated to another 5X status.

  2. The ERCS user is notified when a case has been selected for EQMS review and a Sample Selection Sheet must be printed.

  3. The Sample Selection Sheet will be placed on top of the file.

  4. The Sample Selection Sheet contains the address of the EQMS site that will receive the case.

  5. If Centralized Case Processing has a question as to whether a case was selected by the ERCS system in error, the appropriate EQMS site manager should be contacted to resolve the situation.

  6. When a case includes multiple years for the same taxpayer and/or related returns closed together as a package, the entire package is considered selected for EQMS review regardless of whether all returns/entities are shown on the Sample Selection Sheet.

    1. A case is defined as (1) an examination of one return for a taxpayer, or (2) any group of multiple year returns and/or related returns closed as a related package.

    2. Cases selected for EQMS review should include at least one key income tax return: Form 1040, U.S. Individual Income Tax Return; Form 1041, U.S. Income Tax Return for Estates and Trusts; Form 1042, Annual Withholding Tax Return for U.S. Source Income of Foreign Persons; Form 1065, U.S. Return of Partnership Income; Form 1120-F, U.S. Income Tax Return of a Foreign Corporation; Form 1120-S, U.S. Income Tax Return for an S Corporation; or Form 1120, U.S. Corporation Income Tax Return.

  7. Cases will be updated to Status Code 90 then sent to the appropriate EQMS site.

  8. Form 2275, Records Request, Charge and Recharge, will be prepared for each tax period and stapled to the return. It should indicate that the return is being sent to the EQMS site (including address and telephone number). In the remarks area, note any related cases by taxpayer name, TIN (Taxpayer Identification Number) and tax periods.

  9. If a case selected for EQMS review must be returned to a group by CCP for correction, the case will be sent to the EQMS site after corrections have been made and the case has been updated to Status Code 90.

  10. Unagreed cases closing for issuance of statutory notice of deficiency are subject to sample selection in Technical Services. When an unagreed case is selected for EQMS review, Technical Services will place the Sample Selection Sheet on top of the case file and update all returns in the case with an "M" freeze code. When the case defaults it will be sent to Centralized Case Processing. Centralized Case Processing will send the case to the appropriate EQMS site when it is updated to Status Code 90. The "M" freeze code along with the Sample Selection Sheet will alert Centralized Case Processing that the case must be sent to EQMS. Due to the lengthy processing time for these cases the freeze code will act as a reminder that the case should be forwarded to the EQMS site. Centralized Case Processing will remove the "M" freeze code prior to sending the case to EQMS.

  11. When a case is returned to an exam group after the 90 Day Letter has been issued, ERCS will automatically "unselect" the case and it will not be sent for EQMS review.

Note:

If all tax periods submitted together as a case do not close or any of the tax periods are placed in Centralized Case Processing suspense pending resolution of a closing problem, then the entire case should be held in Centralized Case Processing until the case can be forwarded to the EQMS site as a complete package.

Transmitting Cases to the EQMS Site

Note:

All cases selected in Centralized Case Processing for EQMS review should be sent to the EQMS site after they are updated to Status Code 90.

  1. Update all tax periods to Status Code 90. Cases can be batched and sent weekly to the EQMS site.

  2. Attach an AMDISA print to verify that all tax periods were updated to Status Code 90.

  3. Prepare Form 3210, Document Transmittal. Retain Part 4.

  4. The EQMS site will acknowledge receipt by returning Part 3 of the Form 3210.

  5. Discrepancies between items listed on Form 3210 and the shipment contents should be resolved by telephone whenever possible; otherwise, the EQMS site will return the shipment to the sender.

Exhibit 4.8.3-3 
Examination Quality Measurement Staff (EQMS) Procedures for Sample Review Case Selection: Technical Services Responsibilities

Cases are subject to sample selection in Technical Services when Status Code 21 is updated to another 2X status.

Cases updated to a suspense status are excluded from sample selection. ERCS will "unselect" a case when updated to a suspense status.

Protested cases that are selected for the EQMS sample should be sent immediately to the EQMS site via ground service. EQMS will review protested cases within 10 business days and return to Technical Services via ground service. If the statute is 210 days or less, Technical Services will contact the EQMS site to exclude the case.

Unagreed cases with at least one agreed/no change year that are selected for the EQMS sample should be sent immediately to the EQMS site via ground service. EQMS will review these cases within 10 business days and return to Technical Services via ground service. If the statute is 210 days or less, Technical Services will contact the EQMS site to exclude the case.

Unagreed cases closing for issuance of statutory notice of deficiency are subject to sample selection in Technical Services. When an unagreed case is selected for EQMS review, Technical Services will place the Sample Selection Sheet on top of the case file and update all returns in the case with an "M" freeze code. When the case defaults it will be sent to Centralized Case Processing. Centralized Case Processing will send the case to the appropriate EQMS site when it is updated to Status Code 90. The "M" freeze code along with the Sample Selection Sheet will alert Centralized Case Processing that the case must be sent to EQMS. Due to the lengthy processing time for these cases the freeze code will act as a reminder that the case should be forwarded to the EQMS site. Centralized Case Processing will remove the "M" freeze code prior to sending the case to EQMS.

  • When an unagreed case is petitioned, ERCS will "unselect" the case when it is updated to Appeals. The "M" freeze code will be removed prior to updating the case to Appeals.

  • When an agreement is received for an unagreed case, the case will be closed to Centralized Case Processing and sent to EQMS when updated to Status Code 90.

When a case is returned to an exam group after the 90 Day Letter has been issued, ERCS will automatically "unselect" the case and it will not be sent for EQMS review.

