21.10.1  Embedded Quality (EQ) Program for Accounts Management, Campus Compliance Services, Field Assistance, Tax Exempt/Government Entities, Return Integrity and Correspondence Services (RICS) – Integrity & Verification Operations, and Electronic Products and Services Support

Manual Transmittal

September 20, 2013

Purpose

(1) This transmits revised IRM 21.10.1, Quality Assurance - Embedded Quality (EQ) Program for Accounts Management, Compliance Services, Field Assistance, Tax Exempt/Government Entities, Return Integrity and Correspondence Services (RICS) – Integrity & Verification Operations and Electronic Products and Services Support.

Material Changes

(1) There are editorial changes throughout this IRM section.

(2) IPU 12U1823 issued 11-13-2012 IRM 21.10.1.9.11(7) - Updated to reflect new procedures for SB/SE submission of Collection Operation Customer Satisfaction Survey Sample File.

(3) IPU 12U1823 issued 11-13-2012 IRM 21.10.1.7.7(2) - Updated Feedback Summary section of the table to include CPAS and remove the exception for the ACS product line regarding IRM references.

(4) IPU 12U1823 issued 11-13-2012 IRM Exhibit 21.10.1-5 - Added previously omitted header fields: Reviewer Type/Category, ID Theft, Pay Period, Fraud.

(5) IPU 12U1992 issued 12-28-2012 IRM 21.10.1.8.6 - Centralized PAS Pilot Rebuttal Procedures - Accounts Management (New Section).

(6) IPU 12U1992 issued 12-28-2012 Exhibit 21.10.1-7 - Added ACS Phones to group codes AC and TR.

(7) IPU 12U1992 issued 12-28-2012 IRM 21.10.1.7.6(2) - Added a reference to Timeliness Attributes in the existing note. Added a second note linking to the Accounts Management EQRS Flowchart - Coding the Use of Hold.

(8) IPU 13U0280 issued 02-04-2013 IRM 21.10.1.8.1 through IRM 21.10.1.8.6, updated sections to reference the new Form 14448, Quality Review Rebuttal.

(9) IPU 13U0280 issued 02-04-2013 IRM 21.10.1.9.12, added the OFP for Identity Theft cases in AM.

(10) IPU 13U0280 issued 02-04-2013 IRM Exhibit 21.10.1-14, removed NQRS Rebuttal Sheet.

(11) IPU 13U0427 issued 02-27-2013 Exhibit 21.10.1-11 - Removed note related to Accounts Management volume input.

(12) IPU 13U0427 issued 02-27-2013 IRM 21.10.1.9.9(3) - Updated procedures when the customer agrees to take the survey.

(13) IPU 13U0682 issued 04-03-2013 IRM 21.10.1.9.10 - Updated Innocent Spouse (ISP) TF Telephone Customer Satisfaction Survey procedures.

(14) IPU 13U0682 issued 04-03-2013 IRM 21.10.1.9.10.1 - Updated Innocent Spouse (ISP) TF Telephone Customer Satisfaction Survey procedures.

(15) IPU 13U0682 issued 04-03-2013 IRM 21.10.1.9.10.2 - Updated Innocent Spouse (ISP) TF Telephone Customer Satisfaction Survey procedures.

(16) IPU 13U0682 issued 04-03-2013 IRM 21.10.1.7.10 - Corrected the input/edit cutoff date for NQRS reviews from seven to ten days from the end of the reporting period.

(17) IPU 13U0909 issued 05-10-2013 IRM 21.10.1.9.9 - Revised the section on the e-help Phones Customer Satisfaction Survey to clarify the procedures.

(18) IPU 13U0909 issued 05-10-2013 IRM 21.10.1.3(5) - Added a note indicating that SERP Alerts were not part of the 7-day grace period for not charging quality errors when procedures change for Accounts Management.

(19) IPU 13U0909 issued 05-10-2013 Exhibit 21.10.1-13 - Added AM Clerical - Campus Support to the NQRS Reporting Periods chart.

(20) IPU 13U0994 issued 05-28-2013 IRM 21.10.1.9.8.1 - Updated the script to read to callers when offering the SB/SE CCE and AUR Telephone Customer Satisfaction Survey.

(21) IPU 13U0994 issued 05-28-2013 IRM 21.10.1.9.7.1 - Updated the script to read to callers when offering the W&I Call Center Environment and AUR Toll Free Telephone Customer Satisfaction Survey.

(22) IPU 13U1022 issued 05-31-2013 IRM 21.10.1.9.10.1 - Updated the Tax Examiner responsibilities related to the Innocent Spouse (ISP) TF Telephone Customer Satisfaction Survey.

(23) IPU 13U1063 issued 06-07-2013 IRM 21.10.1.9.4.2 - Updated the script to read to callers when offering the Telephone Customer Satisfaction Survey.

(24) IPU 13U1094 issued 06-13-2013 IRM 21.10.1.9.11 - Updated survey sample pull procedures for the Collection Operation Customer Satisfaction Survey.

(25) IPU 13U1094 issued 06-13-2013 IRM 21.10.1.9.5.1 - Updated the script to read to callers when offering the PPS Customer Satisfaction Survey.

(26) IPU 13U1094 issued 06-13-2013 IRM 21.10.1.9.6.1 - Updated the script to read to callers when offering the TE/GE Telephone Customer Satisfaction Survey.

(27) IPU 13U1295 issued 07-29-2013 IRM 21.10.1.9.4.2 - Updated survey contact name and Spanish language script.

(28) IRM 21.10.1.1 (1) - Moved paragraph from subsequent section.

(29) IRM 21.10.1.1 (2) - Revised bullet list.

(30) IRM 21.10.1.2 (1) - Moved some content to previous section.

(31) IRM 21.10.1.2 (2) - Revised bullet list.

(32) IRM 21.10.1.2 (3) - Revised to remove AMTAP and add IVO.

(33) IRM 21.10.1.2 (4) - Revised content related to new product lines.

(34) IRM 21.10.1.2 (5) - Revised content related to product review.

(35) IRM 21.10.1.2 (6) - Revised content related to managerial review.

(36) IRM 21.10.1.2 (7) and (8) - Revised content related to Contact Recording.

(37) IRM 21.10.1.2.1 - Changed reference to headquarters to Joint Operations Center.

(38) IRM 21.10.1.2.2 - Added Campus Compliance Services (CCS) Identity Theft Paper to bullet list.

(39) IRM 21.10.1.2.3 - Removed BMF AUR Phones from CPAS review list.

(40) IRM 21.10.1.2.4 - Added paragraph (2) related to local reviews.

(41) IRM 21.10.1.2.5 - Revised paragraph (1) and added note.

(42) IRM 21.10.1.2.6 - Revised bullet list in paragraph (1), added new paragraph (2).

(43) IRM 21.10.1.2.7 - Revised bullet list in paragraph (1).

(44) IRM 21.10.1.2.7.1 - Revised content.

(45) IRM 21.10.1.2.7.2 - Added new subsection detailing the Quality Performance Measurement (QPM) function.

(46) IRM 21.10.1.2.7.3 - Added new subsection detailing the EQ National Support Staff.

(47) IRM 21.10.1.2.7.4 - Revised content.

(48) IRM 21.10.1.2.7.5 - Revised content.

(49) IRM 21.10.1.2.7.6 - Removed last paragraph.

(50) IRM 21.10.1.2.7.7 - Revised content, added new paragraph (6).

(51) IRM 21.10.1.2.9 - Removed bullet in paragraph (3).

(52) IRM 21.10.1.3 - Revised content.

(53) IRM 21.10.1.3.1 - Added new section; split previous section.

(54) IRM 21.10.1.3.2 - Revised content, added new paragraph (9).

(55) IRM 21.10.1.3.2.1 - Revised content, added new paragraph (4).

(56) IRM 21.10.1.3.2.2 - Revised content.

(57) IRM 21.10.1.4 - Revised content.

(58) IRM 21.10.1.4.3.1 - Revised content, added new paragraphs (2) and (6).

(59) IRM 21.10.1.4.3.3 - Revised content.

(60) IRM 21.10.1.4.6 - Revised content.

(61) IRM 21.10.1.4.6.1 - Revised content.

(62) IRM 21.10.1.4.6.2 - Revised content.

(63) IRM 21.10.1.4.6.3 - Revised content.

(64) IRM 21.10.1.4.9 - Revised content.

(65) IRM 21.10.1.4.10 - Revised content.

(66) IRM 21.10.1.4.16 - Revised content.

(67) IRM 21.10.1.4.16.1 - Revised content, added new paragraphs (6) and (8).

(68) IRM 21.10.1.4.16.3 - Revised content, new paragraphs (1) and (3).

(69) IRM 21.10.1.4.16.4 - Revised content, new paragraphs (3), (4) and (5).

(70) IRM 21.10.1.4.17 - Added new subsection related to IRB Phones.

(71) IRM 21.10.1.5 - Revised section, removed bullet list in paragraph (4).

(72) IRM 21.10.1.5.3.3 - Revised paragraph (1).

(73) IRM 21.10.1.5.5 - Revised content.

(74) IRM 21.10.1.5.5.1 - Revised content; added new paragraph (2).

(75) IRM 21.10.1.5.5.2 - Revised content.

(76) IRM 21.10.1.5.5.3 - Revised content.

(77) IRM 21.10.1.5.6 - Revised reference IRM titles.

(78) IRM 21.10.1.5.6.8 - Revised content.

(79) IRM 21.10.1.5.7.2 - Revised content.

(80) IRM 21.10.1.5.7.3 - Revised content.

(81) IRM 21.10.1.5.9 - Revised content.

(82) IRM 21.10.1.5.9.1 - Revised content.

(83) IRM 21.10.1.5.9.2 - Revised content.

(84) IRM 21.10.1.5.9.3 - Revised CIO Paper sample plan procedures.

(85) IRM 21.10.1.5.11.1 - Revised content, added new paragraphs (6) and (8).

(86) IRM 21.10.1.5.11.2 - Revised content.

(87) IRM 21.10.1.5.11.3 - Revised content.

(88) IRM 21.10.1.5.13 - Revised content.

(89) IRM 21.10.1.5.14.2 - Revised content.

(90) IRM 21.10.1.5.19 - Added section for new IRB Non-phones SPRG.

(91) IRM 21.10.1.5.19.1 - Added section defining new IRB Non-phones SPRG.

(92) IRM 21.10.1.7 - Revised content.

(93) IRM 21.10.1.7.1 - Revised content.

(94) IRM 21.10.1.7.4 - Added new paragraph (4).

(95) IRM 21.10.1.7.6 - Revised content, added new paragraph (3).

(96) IRM 21.10.1.7.6.1 - Revised content, added Note in paragraph (1).

(97) IRM 21.10.1.7.6.2 - Revised content.

(98) IRM 21.10.1.7.7 - Revised content.

(99) IRM 21.10.1.7.12 - Revised content.

(100) IRM 21.10.1.7.12.1 - Revised content.

(101) IRM 21.10.1.7.12.2 - Revised content.

(102) IRM 21.10.1.7.13.1 - Revised content.

(103) IRM 21.10.1.7.14.1 - Revised content.

(104) IRM 21.10.1.8 - Revised content.

(105) IRM 21.10.1.8.1 - Revised content, added Note in paragraph (8).

(106) IRM 21.10.1.8.2 - Revised content, added Note in paragraph (6).

(107) IRM 21.10.1.8.3 - Revised content.

(108) IRM 21.10.1.9.4.2 - Updated survey contact information, added Spanish language table.

(109) IRM 21.10.1.9.7.2 - Updated survey contact information.

(110) IRM 21.10.1.9.10.2 - Revised entire section.

(111) Exhibit 21.10.1-5 - Added new header fields: customer name, prisoner and professional decision making.

(112) Exhibit 21.10.1-6 - Corrected names and definitions of Attributes 686 and 687.

(113) Exhibit 21.10.1-7 - Revised content, added Note to group code BL.

(114) Exhibit 21.10.1-9 - Revised to include Return Integrity and Correspondence Services (RICS) – Integrity & Verification Operations.

(115) Exhibit 21.10.1-13 - Updated report weighting for Accounts Paper Adjustments and added AM Identity Theft Paper.

Effect on Other Documents

IRM 21.10.1, dated September 7, 2012 (effective October 1, 2012), is superseded. The following IRM Procedural Updates (IPUs), issued from November 13, 2012 to July 29, 2013, have been incorporated into this IRM: IPU 12U1823, IPU 12U1992, IPU 13U0280, IPU 13U0427, IPU 13U0682, IPU 13U0909, IPU 13U0994, IPU 13U1022, IPU 13U1063, IPU 13U1094 and IPU 13U1295.

Audience

Accounts Management (AM), Campus Compliance Services in Small Business/Self-Employed (SB/SE) and Wage and Investment (W&I), Electronic Products and Services Support (EPSS), Field Assistance (FA), Return Integrity and Correspondence Services (RICS) – Integrity & Verification Operations, and Tax Exempt and Government Organizations (TE/GE)

Effective Date

(10-01-2013)

Carrie Y. Holland
Director, Joint Operations Center
Wage and Investment Division

21.10.1.1  (10-01-2013)
Embedded Quality (EQ) Review Program Overview

  1. Embedded Quality is the system that is used by Accounts Management, Campus Compliance Services, Electronic Products and Services Support, Field Assistance, Return Integrity and Correspondence Services (RICS) – Integrity & Verification Operations, and Tax Exempt/Government Entities for their Embedded Quality Review Program.

  2. This section provides procedures for Campus Embedded Quality program level and site reviews, as well as front-line manager evaluative employee reviews of:

    • Telephone operations

    • Closed Paper case reviews

    • In-process case reviews

    • Responses to taxpayer correspondence

    • Outgoing correspondence and notices

    • Adjustment actions

    • Centralized processes

    • Email responses to IRS web site questions

    • Taxpayer Assistance Center walk-up contacts

  3. This section also provides procedures for:

    • Accessing, adding, editing, and correcting National Quality Review System (NQRS) and Embedded Quality Review System (EQRS) records

    • Generating reports available through NQRS and EQRS

    • Completing the IRS portion of the Customer Satisfaction Survey

21.10.1.2  (10-01-2013)
The Quality Review Process

  1. The Quality Review process provides a method to monitor, measure, and improve the quality of work. Quality Review data is used to provide quality statistics for the Service's Business Results portion of the Balanced Measures, and/or to identify trends, problem areas, training needs, and opportunities for process improvement.

  2. The Embedded Quality (EQ) effort is a way of doing business that builds commitment and capability among all individuals to continually improve customer service, employee satisfaction, and business results. The EQ effort is based on three components:

    • Improving the way quality is measured, calculated, and reported

    • Creating accountability by connecting employee reviews to quality measurement in a way that enables managers and employees to act on the results

    • Improving the design and deployment of the quality resources dedicated to review, analysis, and improvement

    The Embedded Quality System calculates the measurement as the percent of applicable coded attributes that are correct, based on the number of opportunities within each of five "buckets". The buckets are defined as follows (a brief calculation sketch follows the list):

    • Customer Accuracy: giving the correct answer with the correct resolution. "Correct" is measured based on the taxpayer receiving a correct response or resolution to the case or issue, and if appropriate, taking the necessary case actions or case disposition to provide this response or resolution. For the purpose of coding, additional IRS issues or procedures that do not directly impact the taxpayer's issue or case are not considered.

    • Regulatory Accuracy: adhering to statutory/regulatory process requirements when making determinations on taxpayer accounts/cases.

    • Procedural Accuracy: adhering to non-statutory/non-regulatory internal process requirements when making determinations on taxpayer accounts/cases.

    • Professionalism: promoting a positive image of the Service by using effective communication techniques.

    • Timeliness: resolving an issue in the most efficient manner through the use of proper workload management and time utilization techniques.
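
    A minimal sketch of this calculation in Python, offered for illustration only: the attribute-to-bucket pairing and the Y/N/N/A codes below are assumptions, not the official Data Collection Instrument layout.

      # Each bucket's score is the percent of rated ("Y" or "N") attributes
      # coded "Y"; "N/A" attributes are not opportunities and are excluded.
      BUCKETS = ("Customer Accuracy", "Regulatory Accuracy",
                 "Procedural Accuracy", "Professionalism", "Timeliness")

      def bucket_scores(coded_attributes):
          """coded_attributes: (bucket, code) pairs; code is 'Y', 'N', or 'N/A'."""
          scores = {}
          for bucket in BUCKETS:
              rated = [code for b, code in coded_attributes
                       if b == bucket and code in ("Y", "N")]
              if rated:
                  scores[bucket] = 100.0 * rated.count("Y") / len(rated)
          return scores

      # One reviewed contact: correct resolution, one procedural defect.
      print(bucket_scores([("Customer Accuracy", "Y"),
                           ("Procedural Accuracy", "N"),
                           ("Procedural Accuracy", "Y"),
                           ("Timeliness", "N/A")]))
      # {'Customer Accuracy': 100.0, 'Procedural Accuracy': 50.0}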

  3. A "product line" is a major grouping of similar work that is reportable and is measured in the Business Performance Review System (BPRS). Product lines listed below are further defined later in this IRM. The national and local quality reviews for these product lines will be entered into NQRS, and the managerial quality reviews for these product lines will be entered into EQRS. A "Specialized Product Review Group (SPRG)" is a subset of a product line that generally has a separate sample.

    Product Line: Specialized Product Review Group(s) (SPRG)

    Accounts Paper: Accounts Paper Adjustments; Accounts Paper IEAR
    Accounts Phones: Accounts Phones EIN; Accounts Phones General; Accounts Phones International; Accounts Phones NTA; Accounts Phones PPS; Accounts Phones Spanish
    ACS Phones: ACS Phones
    ACS Written: ACS Case Processing; ACS Support
    AM Routing: AM Routing Default Screener
    AM Specialized Services: AMSS CAF/POA; AMSS EIN; AMSS International Specialized; AMSS Support Services; AMSS Technical Services
    AUR Paper: AUR Paper; AUR Paper CAWR; AUR Paper FUTA; AUR Paper PMF; BMF AUR Paper
    AUR Phones: AUR Phones; BMF AUR Phones
    ASFR Paper: ASFR Paper; ASFR Paper Reconsideration; ASFR Paper Refund Hold
    ASFR Phones: ASFR Phones (Including Refund Hold)
    Centralized Case Processing Collection Paper: CLP Lien Paper; CCP GCP; CCP MMIA; CLP Lien Paper CRD
    Centralized Case Processing Phones: CCP Lien Phones; CCP MMIA Phones
    Centralized Case Processing Exam Paper: CCP Exam Paper
    CIO Paper: CIO Paper
    CIO Phones: CIO Phones
    COIC Paper: COIC Paper
    COIC Phones: COIC Offer Exam; COIC Process Exam
    Collection Paper: Collection Paper; WHC Paper
    Collection Phones: Collection Phones Withholding Compliance; Collection Phones CSCO; Collection Phones Combat Zone
    CS Identity Theft Paper: CS Identity Theft Paper
    CS Specialized Paper Services: Centralized Excise Tax Paper; Centralized Estate and Gift Tax Paper; Centralized Transfer Tax Technician
    CS Specialized Phone Services: Centralized Excise Tax Phones; Centralized Estate and Gift Tax Phones
    e-help Phones: e-help Phones
    Exam Paper: Exam Paper Area Office Support; Exam Paper Classification; Exam Paper Discretionary Programs; Exam Paper EIC Programs; Exam Paper Flow Through Entities
    Exam Phones: Exam Phones Cold Calls; Exam Phones Extension Routed; Exam Phones Outgoing Calls
    Forms Order: Forms Order NDC
    Innocent Spouse Paper: Innocent Spouse Paper
    Innocent Spouse Phones: IS Phones Cold Calls; IS Phones Outgoing
    Integrity & Verification Operations: IVO Screening and Verification; IVO Case Resolution
    Tax Law Phones: Tax Law Phones General; Tax Law Phones International; Tax Law Phones Rmail Callback; Tax Law Phones Spanish
    Tax Law Written: Tax Law ETLA
    TE/GE Phones: TE/GE Telephone
    TE/GE Correspondence: TE/GE Correspondence
    Field Assistance: FA Tax Law; FA Accounts; FA Procedural; FA Return Preparation
    AM Clerical: Campus Support; Image Control Team

  4. New product lines for Campus Compliance Services must be approved by the Quality Performance Measurement Operation, which has responsibility for updates and maintenance of the system, and by the EQ National Support Staff, which is responsible for programming the database. Other functions need approval from the areas responsible for quality review and the EQ National Support Staff. The function must be prepared to provide the following information:

    • Is there an existing measure?

    • How many Full Time Equivalents (FTEs) will be needed to review the product line, and how will the function provide them?

    • Who will perform the review (Centralized Quality Review System, Program Analysis System, Centralized Program Analysis System, or site quality analysts)?

    • What is the volume of work?

    • Who is recommending this request?

    • Will Contact Recording be used?

  5. Reviews for the Quality Review process are completed by one of the following: Centralized Quality Review System (CQRS), Program Analysis System (PAS), Centralized Program Analysis System (CPAS), or sites (for some national quality measure reviews, or local reviews performed for quality improvement). The National Review is a review that measures the quality of the entire product. Review data is compiled using a Data Collection Instrument (DCI). Data from national reviews is entered into the National Quality Review System (NQRS), then rolled up to provide the business results for the Balanced Measures. These results are the quality ratings for Customer Accuracy, Professionalism, and Timeliness. Data from local reviews performed for quality improvement is entered into NQRS as a Local Review and is not rolled up into the national accuracy rates.

  6. The Managerial Review process creates accountability by connecting employee reviews to the balanced measures. Managers will use the system to track employee performance and training needs. Data from the managerial reviews is entered into the Embedded Quality Review System (EQRS), which maps to an employee's Critical Job Elements and Aspects. It is then rolled up to identify overall employee, team, department and operation scores for Accuracy, Professionalism, and Timeliness. It is not combined in any way with national or local accuracy rates.

  7. The Contact Recording (CR) system records telephone contacts between the Service and customers. The CR system records the complete conversation of every call and randomly selects approximately ten percent of the calls per site to simultaneously capture both the voice and on-screen computer activity. If a call is one of the random ten percent, managers and quality reviewers are able to hear and see the entire customer experience. Contact Recording technology allows managers to provide feedback and identify training needs for employees. Managers can access the recordings and allow employees to listen to their own interactions with customers. CR is available for managerial and national quality review for most telephone and Field Assistance product lines. Selected telephone calls remain in the system for 60 days.

    Note:

    Periodically, there are systemic problems with Contact Recording. When this happens, reviews must be monitored live using the Aspect Toll-Free monitoring system.

  8. All calls handled by the operation are included in the sample universe; an algorithm programmed into the CR system randomly selects the calls for review.
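
    As an illustration only (this IRM does not specify the CR selection algorithm), random selection at a fixed ten percent rate can be sketched as follows:

      import random

      # Hypothetical sketch: flag roughly ten percent of calls so that both
      # voice and on-screen activity are captured for review.
      def flag_for_screen_capture(rate=0.10):
          return random.random() < rate

      flagged = sum(flag_for_screen_capture() for _ in range(1000))
      print(f"{flagged} of 1000 calls flagged (about 100 expected)")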

21.10.1.2.1  (10-01-2013)
Centralized Quality Review System (CQRS)

  1. Centralized Quality Review System (CQRS) is operated by the Joint Operations Center (JOC) to provide independent quality review services for a number of product lines.

  2. CQRS measures the quality of:

    • Tax Law, Accounts, National Distribution Center, Default Screeners, e-help, TE/GE and Automated Collection System (ACS) calls answered by assistors in all sites

    • E-Mail (Electronic Tax Law Assistance – ETLA) responses to questions received through the IRS Website

    • Field Assistance Tax Law, Accounts, Return Preparation and Procedural contacts

21.10.1.2.2  (10-01-2013)
PAS and Site Reviews for the National Measure

  1. PAS and Site Quality Review measures the quality of:

    • Accounts Paper

    • ACS Support

    • Accounts Management Clerical - Campus Support

    • Accounts Management Specialized Services

    • Automated Substitute For Return (ASFR) Paper and Phones

    • Automated Underreporter (AUR) Paper

    • Campus Compliance Services (CCS) Identity Theft Paper

    • Centralized Case Processing Collection (CCP) Paper and Phones

    • Centralized Case Processing (CCP) Exam

    • Centralized Insolvency Operation (CIO) Paper and Phones

    • Centralized Offer In Compromise (COIC) Paper and Phones

    • Collection Paper and Phones

    • Compliance Services (CS) Specialized Paper and Phones Services

    • Exam Paper

    • Innocent Spouse Paper

    • Return Integrity and Correspondence Services - Integrity & Verification Operations

21.10.1.2.3  (10-01-2013)
CPAS Reviews for the National Measure

  1. CPAS measures the quality of:

    • AUR Phones

    • Exam Phones

    • Innocent Spouse Phones

21.10.1.2.4  (10-01-2013)
Local Reviews for Quality Improvement

  1. Local reviews may be performed to focus attention on areas that require improvement. The local quality reviews are performed by staffs reporting to the Quality Assurance Manager, PAS/CPAS Manager, and/or other units that have quality assurance duties. Local quality reviews may also be used for employee development and on-the-job instruction. Accounts Management and Compliance Services functions may also request that local quality reviews be performed on processes that are not subject to the national quality review.

  2. Generally, local reviews should be used for one year when a new product is being implemented. This process is considered a “baseline” period. Often the new procedures being implemented are unstable and need to be adjusted. During this time changes are made to the functional IRMs and training is provided to the employees. The baseline period allows time for the operation to perfect their processes and procedures while receiving feedback on performance.

21.10.1.2.5  (10-01-2013)
Managerial Reviews

  1. Managerial reviews, which are prepared on EQRS, measure employee performance.

    Note:

    Managerial reviews are performed independently from national and local quality reviews. National and local quality review results are never used for evaluation of individual employees.

21.10.1.2.6  (10-01-2013)
Objectives of Quality Review

  1. Quality Review data is used by management to provide a basis for measuring and improving program effectiveness by identifying:

    1. defects resulting from site or systemic action(s) or inaction(s),

    2. drivers of Customer Accuracy,

    3. reason(s) for defect occurrence,

    4. defect trends, and

    5. recommendations for corrective action.

  2. Quality review also provides:

    • a way to ensure the corrective action was effective

    • a vehicle for input to balanced measures, and

    • assistance to management in efforts to improve quality of service

  3. Managerial review data is used by management for some or all of the purposes listed above as well as:

    1. Tracking employee performance and providing input into employee appraisals,

    2. Identifying training needs for individual employees and for groups of employees, and

    3. Planning workload distribution.

    Note:

    Managerial reviews evaluate individual employee performance and are performed independently from national and local quality reviews. National and local quality review results are never used for evaluation of individual employees. See IRM 1.4.16 (Accounts Management Guide for Managers), IRM 1.4.20 (Filing and Payment Compliance Managers Handbook), IRM 1.4.17 (Compliance Managers Handbook), IRM 1.4.11 (Field Assistance Guide for Managers), and IRM 1.4.18 (Electronic Products and Support Services Managers Guide) for more information on manager responsibilities in conducting managerial reviews.

21.10.1.2.7  (10-01-2013)
Quality Review Roles and Responsibilities

  1. The Quality Review process relies on the teamwork of all of the following:

    • Headquarters Business Operating Divisions (BOD) Quality Analysts, Headquarters Quality Performance Measurement (QPM) Product Line Analysts (PLAs), Process Improvement and Customer Accuracy (PICA) and EQ National Support Staff

    • CQRS - part of the Joint Operations Center (JOC)

    • PAS - operated by Planning and Analysis in each campus

    • CPAS - operated by Quality Performance Measurement (QPM), reporting to the QPM program manager

    • Planning and Analysis (P&A) Staffs

    • Accounts Management, Campus Compliance Services, Electronic Products and Services Support, Return Integrity and Correspondence Services, Field Assistance and TE/GE Operations

    • Quality Assurance Managers (QAMs) and Field Improvement Managers and Specialists or other quality staffs

    • PAS Managers

21.10.1.2.7.1  (10-01-2013)
Headquarters Business Operating Divisions (BOD)

  1. The Business Operating Divisions issue program goals based on the Balanced Measures.

  2. The BOD reviews Quality Assurance programs as part of periodic reviews of Accounts Management, Compliance Services, Electronic Products and Services Support, Return Integrity and Correspondence Services - Integrity & Verification Operations and Field Assistance programs.

21.10.1.2.7.2  (10-01-2013)
Quality Performance Measurement

  1. The mission of the Campus Compliance Services Quality Performance Measurement (QPM) Operation is to provide unbiased, cross-BOD administration of the Embedded Quality Review System. QPM spans both SB/SE and W&I Campus Compliance operations and has an enterprise objective to measure performance. By design it is independent from other functions to ensure impartiality and maintain data integrity. QPM regularly interacts with the headquarters business owners and all campus/remote office operations to work quality issues, including:

    • Providing oversight and coordination of the cross-functional Embedded Quality (EQ) Program.

    • Monitoring EQ program adherence including proper coding of reviews and all aspects of the sampling process.

    • Sponsoring on-going cross-campus discussions to identify trends and issuing guidance and direction to correct global problems.

    • Maintaining the EQ system which includes establishing work arounds for problems identified through coding.

    • Updating EQ tools including job aids, system fields, attribute definitions, and monitoring and resolving Contact Recording issues.

    • Establishing regulations and procedures and providing support for attribute coding for all Campus Compliance reviews.

    • Making all final-authority determinations on elevated rebuttals or adjustments.

    • Serving as the Campus Compliance Services liaison to Accounts Management including coordinating system adjustments to ensure uniform and cohesive data collection.

    • Sponsoring and supporting new Campus Compliance product lines for the system.

21.10.1.2.7.3  (10-01-2013)
EQ National Support Staff

  1. The EQ National Support Staff is part of the Joint Operations Center (JOC) and provides support to all EQRS and NQRS users in all functions at all levels. The responsibilities include but are not limited to:

    • Serve as liaison between JOC and various Business Units in the Enterprise on program and quality issues

    • Coordinate with Statistics Of Income (SOI) on weighted report calculations and related programming

    • Support business quality training initiatives

    • Work with business units to develop and define new Specialized Product Review Groups (SPRGs)

    • Work with the business units to develop new reports as needed

    • Work with various business Product Line Analysts to maintain Data Collection Instruments (DCIs) for current SPRGs

    • Support the EQ Summits between business units and NTEU required in the contract

    • Process Reports Request Central (RRC) submissions for EQRS/NQRS data

    • Issue system alerts to customer base

    • Ensure official Quality measures (Customer Accuracy, Timeliness, Professionalism) as reported by the application are timely and accurate

    • Ensure managerial performance reviews as reported by the application are accurate and reliable

    • Create/submit annual EQRS-C (Campus) enhancement UWR

    • Create/submit annual EQRS-C Maintenance UWR and other UWRs as needed

    • Work with IT (Applications Development, Embedded Quality Section) on all EQRS-C system-related issues

    • Coordinate with IT in completing annual UWR submissions

    • Maintain EQRS-C database reference tables via OL-Matrix web application

    • Serve as Application Point of Contact for all FISMA-related activities and requirements

    • Maintain Embedded Quality Website

21.10.1.2.7.4  (10-01-2013)
Accounts Management/Compliance Services Operations

  1. The P&A Chiefs for Accounts Management and Compliance Services are responsible for the Site Level Business Plans.

  2. The P&A Chiefs are responsible for site reviews of product lines that provide data for the Balanced Measures.

  3. The operations managers are responsible for evaluative reviews on their employees separate and apart from the National Review process on the EQRS side of the database.

  4. If feasible, the QAM and Process Improvement Specialist/Field Improvement Specialist should report directly to the P&A Chief. This eliminates potential conflicts of interest which may occur when more than one department manager is accountable for a product or service. If a formal position is not designated within P&A, the responsibilities should be assigned to one or more individuals in the operation to ensure that the obligation is met.

21.10.1.2.7.5  (10-01-2013)
Quality Assurance Manager (QAM)

  1. This section only applies to remote and campus locations with functional areas where there is an existing QAM position. In offices/functional areas where no position exists, or in Campus Compliance Services, management will need to ensure these duties are appropriately addressed.

  2. The QAM is responsible for the overall planning, administration, and evaluation of the quality-related sections of the Site Level Business Plans. The QAM will identify problems and work with management to solve them.

  3. The Site Level Business Plans outline procedures for the review of all areas of responsibility. This review process, when combined with CQRS/PAS/CPAS data and other functional data, will help with evaluating the overall quality of operations and making recommendations for improvement.

  4. The QAM will serve as the Quality Assurance (QA) manager for the Operation, ensuring that designated quality resources are used to focus on quality improvement efforts.

    Note:

    Results of reviews performed by CQRS/PAS/CPAS staff are not to be used in employee evaluations.

  5. By using trend analysis, the QAM will determine the causes that adversely affect quality. The QAM will assist the management team in initiating processes for employees to improve their quality of service. It is important that lines of communication remain open among the QAM, the QR team, and management in order to identify problem areas, take appropriate corrective actions, and re-evaluate quality to ensure corrective actions result in improved quality.

  6. The QAM or other designated person within the operation will log and date completed review records that require rework, including reviews by CQRS/PAS/CPAS where defects have been identified and reviews where FLASH was annotated in the Feedback Summary Remarks field. Managers must return corrected work to the QAM within five working days of receipt. See IRM 21.10.1.7.7(4) for additional instructions on FLASH remarks. The QAM will enter the completion date in the log. The QAM will monitor corrected work to ensure the timeliness and quality of responses to taxpayers.

  7. The QAM will act as the liaison between the CQRS/PAS/CPAS staff and management and is responsible for communicating quality information to all managers in the operation.

  8. The QAM will identify training needs and recommend to the Training Coordinator and/or management, the type of training needed (e.g., on-the-job training or classroom instruction), and assist in the development of additional training exercises and workshops to meet those needs.

  9. The QAM and the CQRS/PAS/CPAS staff will be responsible for the protection of NQRS DCIs and any supporting documentation from legacy systems. All documents and information (including taxpayer information) seen, heard, or handled must remain secure and confidential.

  10. The QAM will serve as the Embedded Quality liaison with managers.

  11. The QAM will work with managers to recognize and reward quality at all levels (e.g., employee-of-the-month, Quality Honor Roll, etc.), using the National Agreement for guidance.

  12. The QAM will be responsible for working with the Training Coordinator and/or management to ensure that EQ training is made available to all who need it. The QAM is responsible for training the quality staff.

  13. The Field Improvement Specialists (FIS) will work closely with Headquarters and the PICA team to manage the portfolio of national projects and best practice evaluation. The expectations for the FIS positions include, but are not limited to, the following.

    1. A regular solicitation for projects will be made to ensure there is no overlap between various sites or functions.

    2. Field Directors recommending projects would be in charge of those selected.

    3. Each Field Director will conduct at least one project with national impact each year.

    4. Field Directors will have the latitude to use FIS resources for local projects.

    5. All projects will be linked to the organization’s strategic goals and direction.

21.10.1.2.7.6  (10-01-2013)
Quality Review/Process Improvement Managers

  1. Throughout this IRM section, the term "Quality Managers" will be used to include the duties of CQRS/PAS/CPAS Managers or other functional Quality Review Managers who perform site reviews for the national quality measure and/or local reviews for quality improvement.

  2. Quality Managers are responsible for ensuring the completion of the national and local reviews.

  3. Quality Managers ensure that all applicable work is sampled and reviewed within sample plan guidelines.

    Note:

    Results of reviews performed by CQRS/PAS/CPAS are not to be used in functional employee evaluations.

  4. The Quality Manager maintains the integrity and quality of the review system by monitoring and reviewing the quality analysts and clerical support. The Quality Manager should routinely monitor the quality sampling selection process. Refer to descriptions of the product lines later in this section for more information on sample universes and sampling guidelines. The Quality Manager should also perform a periodic (at least quarterly) spot check of the work to confirm the case volume matches the count provided by the operations. For example: verify the review content of a single folder to ensure that the volume of the work matches the number of cases in the folder. If these figures do not match, contact the operation to address the reason(s) for the discrepancy and to discuss ways to improve the process accuracy.

  5. The Quality Manager and staff determine the causes that adversely affect quality by using trend analysis to identify the most frequently made defects and root causes. The Quality Manager recommends corrective action and/or improvements to functional areas.

  6. Depending on local procedures, the Quality Manager identifies sample cases that require rework (including cases where FLASH was entered in the Feedback Summary Remarks field) and forwards them to the appropriate functional area. This includes reviews by CQRS where defects have been identified.

  7. The Quality Manager is responsible for working with the Training Coordinator to ensure that the CQRS/PAS/CPAS staff has the appropriate training.

21.10.1.2.7.7  (10-01-2013)
Quality Analyst/PAS/CPAS Analyst

  1. A Quality Analyst or a clerk is responsible for pulling the daily sample.

  2. The quality analyst should perform an unbiased, consistent, and accurate review of all work.

  3. The quality analyst provides the QAM or PAS/CPAS Manager with:

    • Any cases identified as FLASH for rework

    • Analysis of types of errors identified during review.

  4. The quality analyst should provide the QAM, QPM or PICA Product Line Quality Analyst, or National Support Staff, as applicable, with recommendations for corrections/improvements on:

    • IRM 21.10.1, Embedded Quality Program for Accounts Management, Compliance Services, Field Assistance and Tax Exempt/Government Entities.

    • Embedded Quality Review Training Material

    • The NQRS DCI

  5. The quality analyst will:

    1. Code the appropriate attribute for each action while considering the root cause of any defects; to avoid a trickle-down effect, a single action should not be coded "N" in multiple attributes.

    2. Review work using valid sampling techniques as approved by SOI.

    3. Record complete review results using the NQRS DCI.

    4. Measure performance against established quality standards in the functional IRM, publications, and other approved reference sources.

  6. The quality analyst reviewing the work will need to complete training that is appropriate for the SPRG they will review. The following is a list of suggested training topics:

    • Embedded Quality Review Training

    • Quality Review command codes

    • Automatic Data Processing (ADP)/Integrated Data Retrieval System (IDRS) training modules, including on-line accounts and adjustments, and Accounts Management and Compliance Services

    • Technical tax law (as appropriate)

    • Technical Probe and Response Guide

    • Interactive Tax Law Assistant (ITLA)

    • Accounts Collection Service Guide

    • Correspondence guidelines

    • Taxpayer Advocate Service guidelines and criteria

    • Oral statement guidelines

    • Procedural guidelines

    • Communication skills

    • Probing skills

    • Timeliness guidelines

    • e-help

    • TE/GE Specialized Systems

    • TE/GE Probe and Response Guide

21.10.1.2.8  (11-28-2011)
Process Improvement Specialist Roles and Responsibilities

  1. Campus Process Improvement Specialist duties are to:

    1. Identify potential new processes and procedural changes that will improve work processes, quality and level of service for the taxpayers.

    2. Ensure feasible recommendations are presented to enhance procedural, policy, and systemic work practices.

    3. Recommend changes to provide consistency, enhance productivity and efficiency.

    4. Elevate improvement process recommendations to Process Improvement (PI) Manager and Process Improvement Customer Accuracy (PICA) Tax Analysts.

    5. Attend and assist with Training on improvement methods, including the DMAIC process.

    6. Work with PICA to gather facts/data to justify procedural, systemic, and program changes to improve work practices, policies and procedures.

    7. Forward sound recommendations to Process and Program Management (PPM) with solid facts and figures to justify changes.

    8. Ensure approved recommendations are implemented in a timely manner.

    9. Identify and coordinate IRM procedures impacting quality (via PICA then to PPM).

    10. Lead cross-functional improvement discussions and teams at site.

    11. Request action plans (from program owner), when appropriate.

    12. Participate in at least 2 projects (per person) per year (HQ and/or campus), working with PICA.

    13. Report project status monthly (weekly when appropriate) to PICA and PPM while gathering information.

    14. Manage the improvement project when additional information must be gathered.

    15. Use at least half of your time (during the year) providing "expert" assistance to Campus P&A staff and Operations Chiefs.

    16. Communicate results, recommendations and related improvement procedures to other sites.

    17. Conduct calibration and consistency analysis, when appropriate.

    18. Establish improvement team to address hot topics during the filing season.

    19. Identify local procedures, job aids, and check sheets and ensure they are approved by Headquarters and used by each site for consistency.

    20. Ensure changes are based on the quality principles, which are:

      • Professionalism

      • Customer Accuracy

      • Procedural Accuracy

      • Regulatory Accuracy

      • Timeliness

    21. Use the following reports when analyzing quality issues:

      • NQRS Weighted Customer Accuracy Report

      • NQRS Customer Accuracy Driver Report

      • NQRS Top Defects/Successes by Site Report

      • NQRS Ad-hoc Report

      • EQRS Customer Accuracy Driver Report

      • EQRS Top Defects/Successes by Site Report

      • EQRS Ad-hoc Report

      • EQRS/NQRS Comparison Report

      • SLIM Report

      • ETD Report

21.10.1.2.9  (10-01-2013)
Strategy and Program Plans/W&I Operations Plans

  1. The Strategy and Program Plans/W&I Operations Plans are vehicles used to monitor, measure, and evaluate activities consistent with the quality goals of the Balanced Measures.

  2. The Strategy and Program Plans/W&I Operations Plans should include:

    1. Action items that support Balanced Measures initiatives.

    2. Measurement data from CQRS, PAS, CPAS, QR, local reviews, quality staffs, and managerial reviews.

      Note:

      Throughout each month, management must monitor/review the required minimum number of employee telephone calls/cases with taxpayers as set in the Site Quality Action Plan.

  3. The QAM or other local management will assist in the development of Site Level Business Plans which should include requirements to perform reviews to assess the quality of the work including:

    • Managerial reviews of employee work including telephone assistance and written work

    • Timeliness of completed work

    • Accounts Management Accounts and Tax Law work

    • Field Assistance contacts including Return Preparation

21.10.1.2.9.1  (10-01-2006)
Establishing the Strategy and Program Plans/W&I Operations Plans and Site Level Business Plans

  1. When creating the Site Level Business Plans, the QAM will:

    1. Review past Site Level Business Plans and the current year’s Balanced Measures recommendations to understand the program’s goals.

    2. Analyze statistical data, such as NQRS and EQRS data, to identify inconsistencies.

    3. Meet with management to discuss QA objectives.

    4. Establish QA objectives and set priorities using criteria provided by Headquarters and functional management. Local objectives can be added, if desired.

    5. Develop a schedule for implementing each objective of the plan and assign responsibilities.

    6. Work with management to prepare a draft of the plan for review.

    7. Consolidate suggested changes into a final draft for the Operations Chief’s concurrence, which certifies that the plan is operable.

    8. Communicate the applicable portions of the plan to all functional employees, ensuring they understand the plan’s objectives and their roles in the QA process.

  2. After implementation of the Site Level Business Plan, the QAM will monitor plan accomplishments and continue to analyze site data to determine if the operation is meeting objectives. The QAM is expected to recommend needed changes to the plan such as modification or discontinuance of certain objectives.

21.10.1.2.9.2  (10-01-2008)
Strategy and Program Plans/W&I Operations Plans and Site Level Business Plans Resources

  1. Following is a list of resources available to the QAM when creating the quality portion of the Site Level Business Plan. This list is in no particular order and is not meant to be all inclusive. It is designed to suggest the wide variety of data available for consideration.

    • Local work plans

    • NQRS data including local reports

    • EQRS data including local reports

    • Timeliness data

    • Prior year's Operating Guidelines, including plans for QR and managerial involvement

    • Headquarters and functional business plans and reviews

    • Field Assistance data

    • Accounts Management/Compliance Campus reports on functional activities

    • Alert information previously provided to employees

    • Staff feedback

    • Statisticians feedback or reports

    • Taxpayer Advocate Service staff

    • Treasury Inspector General for Tax Administration reports

    • Results of Improvement Projects

    • Focus testing reports

    • Government Accountability Office (GAO) reports

    • ASPECT telephone reports/data

    • Customer satisfaction survey results.

21.10.1.3  (10-01-2013)
Quality Review Research Tools

  1. The following paragraphs may not be all-inclusive, but they will provide a listing of the most frequently used research tools.

  2. A number of IRMs impact the work done by Accounts Management, Compliance Services, Field Assistance, Electronic Products and Services Support and TE/GE. IRM 21 will often cross-reference these other manuals. Examples include:

    • IRM 4, Examining Process

    • IRM 5, Collecting Process

    • IRM 20.1, Penalty Handbook

    • IRM 3.42, Electronic Tax Administration

    • IRM 13, Taxpayer Advocate Service

    • IRM 11.3, Disclosure of Official Information

    • IRM 2.3, IDRS Terminal Response

    • IRM 2.4, IDRS Terminal Input

    • IRM 2.8, Audit Information Management Systems (AIMS)

    • IRM 10.5.3, Identity Protection Program

    • IRM 20.2, Interest

    • IRM 21.1, Accounts Management and Compliance Services Operations

    • IRM 21.1.7, Campus Support

    • IRM 21.3, Taxpayer Contacts

    • IRM 21.3.7, Processing Third Party Authorizations onto Centralized Authorization File (CAF)

    • IRM 25.6, Statute of Limitations

    • IRM 1.4.10, Integrity & Verification Operation Managers Guide

    • IRM 1.4.11, Field Assistance Guide for Managers

    • IRM 1.4.16, Accounts Management Guide for Managers

    • IRM 1.4.17, Compliance Managers Guide

    • IRM 1.4.18, Electronic Products and Support Services Managers Guide

  3. A number of methods are used to communicate changes, clarifications or corrections to the IRMs and other published products. Among these are:

    • Servicewide Electronic Research Program (SERP)

    • IDRS Bulletins

    • IDRS Message File

    • IRM Update

    • Internal Revenue Bulletins (IRBs)

    • The EQ Website

    • The CQRS Website

    • Quick Alerts and EPSS Communications

    • Quality Alerts

    • Interim Guidance Memoranda

  4. Following are examples of additional sources of information available on SERP:

    • IRM 21, Customer Accounts Services

    • Other product line-specific IRMs (Parts 3, 4 and 5)

    • Probe and Response Guide (P&RG)

    • Interactive Tax Law Assistant (ITLA)

    • Publication Method Guide (PMG)

    • Electronic ACS Guide (EACSG)

    • IAT Tools

    • Technical Communication Documents (TCD)

    • AM Portal

    • TE/GE Probe and Response Guide

    • Telephone Transfer Guide (TTG)

    • Taxpayer Contact Outline

    • Taxpayer Information Publications

    • Forms, Schedules, Instructions

    • Correspondence Letters

    • CP Notices

    • Post-of-Duty Listing and Job Aid

    • Lockbox Addresses

    • Special Procedures Directory

    • State Unemployment Tax Agencies

    • Volunteer Income Tax Assistance (VITA)/Tax Counseling for the Elderly (TCE)/American Association of Retired Persons (AARP) Sites

    • On-Line Training Materials

    • Miscellaneous documents of local interest

    • EPSS SERP Portal

    • Document 6209

21.10.1.3.1  (10-01-2013)
Quality Review Exceptions and IRM Deviations

  1. Do not charge quality defects related to IRM or procedural changes until seven calendar days (ten business days for Compliance) after the SERP posting date of the IRM update/change.
    During the seven calendar day (ten business days for Compliance) grace period, national analysts will code "Y" if either the former or the new procedure is followed. Informational feedback from local and national reviews will be shared with functional areas prior to the seventh/tenth day.

    Note:

    For National Distribution Center (NDC), the seven calendar day period begins after the date on the Alert/Change.

    Note:

    For EPSS, the seven calendar day period begins on the date the EPSS Communication or Quick Alert e-mail was first issued.

    Note:

    For Accounts Management, the seven calendar day grace period for charging quality errors does not apply to SERP Alerts, but only to IRM Procedural Updates (IPUs).

  2. Because of the scope of paper reviews conducted in campuses, each QR/PAS function should establish or have access to a library of all necessary IRMs. Area offices (AOs) should establish a similar IRM library covering all the types of work performed in their site. In most cases, the IRMs can be found on-line through SERP.

  3. Any IRS publication can be cited as a reference source. IRS publications will often be the primary research tool for tax law issues. Procedural issues are often addressed in the publications, which can be located electronically on SERP.

  4. Responses to taxpayer questions may refer to specific forms and their instructions. They can be located electronically on SERP under Forms/Letters/Pubs to verify the quality of the response. If erroneous information has been given to the taxpayer, cite the form or instruction to substantiate the defect.

  5. The IRS Electronic Publishing Catalog contains a number of documents that can be used for research purposes. One of the most frequently used is Document 6209, IRS Processing Codes and Information. This document can also be found on-line through SERP.

    Note:

    If there is a discrepancy between the Document 6209 and a specific IRM, the IRM takes precedence.

  6. Use the Probe and Response Guide (P&RG), the Interactive Tax Law Assistant (ITLA), or the Publication Method Guide in conjunction with Publication 17 and other reference materials to answer customers’ tax law questions. Use of these tools is mandatory for review of the Tax Law product line.

  7. Various automated systems may be needed to conduct reviews. These items include, but are not limited to:

    • Automated Insolvency Systems (AIS)

    • Account Management System (AMS)

    • Automated Collection System (ACS)

    • Automated Underreporter (AUR)

    • Automated Lien System (ALS)

    • Integrated Collection System (ICS)

    • Automated Non-Master File (A-NMF)

    • On-Line Notice Review (OLNR)

    • Locator services, such as credit bureaus and state employment commissions

    • Automated Substitute for Return (ASFR)

    • Report Generation Software (RGS) used by Examination

    • Automated Offer in Compromise (AOIC)

    • Correspondence Imaging System (CIS)

    • Withholding Compliance System (WHCS)

    • e-help Support System (EHSS)

    • Third Party Data Store (TPDS)

    • EP/EO Determination System (EDS)

    • Letter and Information Network User Fee System (LINUS)

    • TE/GE Rulings and Agreements Control (TRAC)

  8. Training materials or locally developed job aids cannot be used to evaluate the quality of a contact or case.

  9. It is essential that all sites and functions follow the same guidelines for coding quality service. Deviations from the IRM (for example, local procedures or any other practices outside the IRM) must be approved by Headquarters (HQ). The procedure for preparing a deviation is as follows:

    • The request must be in memorandum format

    • The memo must state the reason for the deviation, what caused the situation to occur, what is being done to correct it, and the beginning and end date of the deviation (no longer than one year).

    • The memo must be forwarded for approval to the business unit directors.

    • The deviation memo must contain the signatures of all business unit directors that are impacted.

    • The signed deviation memo must be forwarded to HQ Policy, Campus Directors and Quality Performance Measurement (QPM) for implementation.

    Note:

    Deviations are only good for the time specified but never longer than one year. See IRM 1.11.2.2.2, Deviating from the IRM. Deviations are not retroactive and only become effective after obtaining the appropriate signature(s). During the deviation period the work will be reviewed based on the procedures outlined in the deviation.

    Note:

    These procedures do not apply to regular IRM updates processed through SERP IRM Procedural Updates, as these items officially replace the current manual. Also, manuals owned by Field Policy do not fall within this process, as they fall within the interim guidance procedures IRM 1.11.10, Interim Guidance Process.

  10. Interim guidance (e.g., a SERP IPU or an interim guidance memorandum) that deviates from the IRM or that establishes new practices for temporary procedures or pilot projects must receive prior approval from Headquarters program management. A formal deviation must be filed. See (9) above for instructions on filing a deviation.

21.10.1.3.2  (10-01-2013)
Quality Review Sampling Guidelines

  1. While CQRS, PAS, CPAS or sites perform reviews on product lines or SPRGs for the national measure, each site may also perform local reviews to help improve their quality. The Local button on the NQRS DCI is used to input local reviews.

    Note:

    Local reviews are not used in official measures of product lines and are not included in the calculations of Balanced Measures.

    • Resources for local review are allocated in a site's work plans. Some sites will have resources for both local and national reviews, while others will only have resources for local reviews. This will depend on where the site's national reviews are performed. Resources specifically allocated for local quality improvement reviews should be used to perform these reviews.

    • When performing local reviews for quality improvement, the overall local quality rates may be lower than the national quality rates since local analysts will focus on problem areas.

    • Using the local review option, other areas of work may be reviewed that are not covered under the Balanced Measures.

    • Local reviews for quality improvement are not necessarily expected to be statistically valid samples.

  2. A sample is a representation that displays characteristics of the whole population. Sampling allows the IRS to determine the quality of service without having to review the entire universe of work. Generally, the sample is determined by Specialized Product Review Groups (SPRGs).

  3. In order for a sample to be statistically valid, it must be randomly selected, so that every case in the population has an opportunity to be selected. We ensure the randomness of a sample by selecting the “Nth” case using a skip interval based on the number of required reviews and the population of the work.
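
    A minimal sketch of this skip-interval selection, assuming the population volume and the required number of reviews are known (the case identifiers and volumes below are hypothetical):

      import random

      def systematic_sample(population, sample_size):
          """Select every Nth case; a random start keeps the selection unbiased."""
          skip = max(1, len(population) // sample_size)  # the skip interval "N"
          start = random.randrange(skip)
          return population[start::skip][:sample_size]

      cases = [f"case-{i:04d}" for i in range(1, 1251)]  # e.g., 1,250 cases this period
      sample = systematic_sample(cases, 50)              # 50 required reviews, skip of 25
      print(len(sample), sample[:3])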

  4. An estimate made from a sample must be unbiased and consistent. An unbiased estimate is one whose average value is equal to the actual quality in the population. A consistent estimate is one that approaches the actual quality in the population as the sample size increases.

  5. Because a sample does not include all cases in the population, any estimate resulting from a sample will not equal the actual quality in the population and will have some variability associated with it. For an estimate to be meaningful, a measure of variability should be included with results. A precision margin and level of confidence can be used to express the variability of an estimate. When added to and subtracted from the estimate, a precision margin identifies the range of values where the actual quality in the population most likely falls. The confidence level indicates how much confidence one can have that the actual population value is within this range. Many IRS quality measures are designed to achieve estimates of 5% precision with 90% confidence. That is, there can be 90% confidence that the actual quality in the population is within plus or minus 5% of the sample estimate.

    • While quality measures are designed to achieve a certain precision (e.g., 5%), the actual precision of an estimate must be calculated using the actual results from the sample.

    • Precision is inversely related to sample size: as the sample size increases, the precision margin decreases and the estimate becomes more precise.

    • Precision also depends on the estimate of quality itself. The worst case scenario (precision-wise) is an estimate of 50%: of all possible samples of the same size, an estimate of 50% will have the largest precision margin (i.e., be the least precise). Estimates closer to either 0% or 100% will be more precise than estimates near 50%.

    • Precision margins should be taken into consideration when determining if a site met its goal; a computational sketch follows this list.

      Example:

      Assume a site has a goal of 85% and that their sample estimate is 82% with a precision of 4%. Applying the precision margin to the estimate implies that the actual quality in the population is between 78% and 86%. Because 85% lies in this range, the site cannot conclude that they did not meet their goal nor can they conclude that they did meet their goal. However, not taking the precision margin into account would have led the site to conclude that they did not meet their goal.

    • Precision margins should also be taken into consideration when comparing quality estimates between offices or different time periods from the same office.
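
    The calculation behind the example above can be sketched in a few lines of Python. This sketch is illustrative only and is not official IRM guidance; the sample size of 250 is an assumed figure, chosen because it reproduces roughly the 4% precision margin used in the example (z = 1.645 corresponds to 90% confidence).

      import math

      def precision_margin(estimate, sample_size, z=1.645):
          # Precision margin for a proportion; z = 1.645 gives 90% confidence.
          return z * math.sqrt(estimate * (1 - estimate) / sample_size)

      estimate = 0.82                                   # sample quality estimate
      goal = 0.85                                       # site goal
      margin = precision_margin(estimate, 250)          # roughly 0.04 (4%)
      low, high = estimate - margin, estimate + margin  # roughly 78% to 86%

      if low <= goal <= high:
          print("Goal falls inside the precision range; no conclusion can be drawn.")
      elif goal > high:
          print("The goal exceeds the upper bound; the goal was not met.")
      else:
          print("The lower bound exceeds the goal; the goal was met.")

    Because 85% falls between roughly 78% and 86%, the script reports that no conclusion can be drawn, matching the example.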

  6. Samples should be designed conservatively. One piece of information that is necessary when designing a sample is an estimate of the actual quality being measured. This estimate, often obtained from reviews from prior years, is used in the process for determining the sample size. An assumed quality of 50% will result in the largest sample size. Therefore, in order to get a more conservative sample size, either slightly increase or decrease the assumed quality rate so that it is closer to 50%. This will result in a sample size that should provide the desired precision.

    Example:

    If an office had an 88% quality rate for a certain SPRG during the prior year, then assuming an estimated quality rate of 80% will result in a sample size that is sufficiently large.
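
    As a rough illustration of this effect, the standard sample size formula for a proportion, n = z^2 * p * (1 - p) / d^2, yields a larger sample as the assumed quality rate p moves toward 50%. The sketch below is illustrative only and is not official IRM guidance; SOI performs the official calculations.

      import math

      def required_sample(p, d=0.05, z=1.645):
          # n = z^2 * p * (1 - p) / d^2 for 5% precision at 90% confidence.
          return math.ceil(z**2 * p * (1 - p) / d**2)

      print(required_sample(0.88))  # prior-year rate: about 115 cases
      print(required_sample(0.80))  # conservative assumption: about 174 cases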

  7. When designing a sample, an office must decide how often quality estimates are necessary. The individual needs of the office, as well as the resources assigned to the quality review, will help determine whether estimates should be made on a weekly, biweekly, monthly, quarterly, or annual basis. Making estimates on a daily basis is usually not recommended. In addition, in many cases, a weekly estimate requires a sample size that is not practically possible. Often, monthly is the shortest time period for which IRS quality estimates are both practical and recommended.

  8. Some offices may consider merging several types of similar work. There may be several items at the site that need to be tested. However, none are large enough to justify an individual sample, and all are too large to take a census. A merged sample, producing a composite estimate, would be a possible solution in this situation. Information describing the different types of work would have to be included with all estimates from these merged samples.

21.10.1.3.2.1  (10-01-2013)
Selecting the Quality Sample

  1. The following factors are used to determine a sample size (a computational sketch follows this list):

    1. How often quality estimates are necessary (e.g., quarterly, daily, weekly, monthly).

    2. The level of precision (e.g., 3%, 5%).

    3. The level of confidence (e.g., 90%, 95%, 99%); generally, a confidence level of 90% with a precision margin of 5% is used, and

    4. A hypothetical "best guess" of the quality rate expected. This guess should be conservative (see IRM 21.10.1.3.2) and can be obtained by reviewing historical quality rates for similar product lines or SPRGs.

      Note:

      If the resulting sample size is too large for the allocated resources, consider reducing the frequency of quality estimates. Contact a Headquarters quality analyst for assistance with this.
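
    As one illustration of how these four factors combine, the sketch below applies a standard sample size formula with a finite population correction. It is illustrative only and is not official IRM guidance; SOI calculates the official sample plans, and the 2,000-case monthly population is an assumed figure.

      import math

      def sample_size(population, p=0.80, d=0.05, z=1.645):
          # population: expected closures for the reporting period
          # p: conservative best guess of the quality rate
          # d: precision margin (5%); z: 1.645 for 90% confidence
          n0 = z**2 * p * (1 - p) / d**2             # infinite-population size
          return math.ceil(n0 / (1 + (n0 - 1) / population))

      print(sample_size(2000))   # about 160 reviews for a monthly estimate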

  2. Unless otherwise specified in IRM 21.10.1, all cases with a closing count must be made available for the National Quality Review System (NQRS) process. This includes cases reviewed by managers, On-the-Job Instructors (OJIs), and cases that are subject to 100% review.

  3. Sampling assumptions must be determined. Unclear or inappropriate assumptions could lead to a sample that is not random, resulting in estimates that are biased, unrepresentative of the population, or inconsistent. This could call the statistical validity of the estimate into question.

  4. Sites must provide sampling assumptions to the Headquarters Process Improvement Customer Accuracy (PICA) or Product Line Analyst (PLA) responsible for the product. The PICA and PLA will provide due dates for the sampling assumptions. The site must estimate the total volumes closed for the SPRG for each period. The PICA and the PLA will provide Statistics of Income (SOI) with the sampling assumptions. The SOI staff will calculate the sample size for most paper SPRGs. Samples are determined quarterly.

  5. Apply a skip interval (see below) to the population to select the sample; a brief computational sketch follows this list.

    1. The skip interval is equal to the population divided by the sample size.

    2. Calculate and use a random start number to select the first case (see IRM 21.10.1.3.2.1(6)). The random start is between 1 and the skip interval.

    3. Use the skip interval to select the rest of the sample; see IRM 21.10.1.3.2.1(7).

    4. If all of the cases are not available at the same time, then sample cases as the work arrives.

    5. If all of the cases are available at the same time and are stored electronically, then software should be used to sort the list of cases either randomly or by a variable related to the quality measure (e.g., time, case ID) prior to applying the skip interval.

    6. If used correctly, a skip interval will ensure that a sample is spread appropriately across the population with estimates that are relatively unbiased.
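
    A minimal sketch of the selection steps above, assuming a period volume of 2,000 cases and a required sample of 160 (both assumed figures; this is not official IRM guidance):

      population = 2000                   # assumed period volume
      required = 160                      # assumed required reviews
      skip = population // required       # skip interval: 12

      random_start = 7                    # from the SOI sample plan; always
                                          # between 1 and the skip interval
      selected = list(range(random_start, population + 1, skip))
      print(len(selected), selected[:5])  # 167 cases: 7, 19, 31, 43, 55

    Continuing the interval through the last case can yield slightly more than the required sample size, consistent with the instruction in (7) below to continue applying the skip interval through the last case in the population.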

  6. A random start is needed to apply the skip interval. The random start is provided as part of the sample plan determined by Statistics of Income (SOI).

  7. Use the following procedures to select a sample using a skip interval; a sketch of the day-to-day carryover follows this list.

    1. On the first day of the sampling period, use the random start number to identify and select the first sampled case.

    2. Use the skip interval to select the subsequent documents for review. In other words, select the ‘nth’ case after the random start case, and continue selecting every ‘nth’ case thereafter.

    3. If the population of cases spans more than one day, then the skip interval must continue between days. Begin each new day’s count with the number of cases remaining following the last document selected from the previous day.

      Example:

      Assume a skip interval of "8" and that there were 5 cases remaining after applying the skip interval over Monday’s entire population. Then, continuing the skip interval sequence of 8 into the next day, the case count would begin at 6 on Tuesday. Therefore, the first case selected on Tuesday would be case number 3, the second would be case number 11, the third would be case number 19, etc.

    4. A random start number is used only once per quarter, even if the skip interval changes during the quarter.

    5. If the required sample size has been met, continue applying the skip interval through the last case in the population. This will ensure that all work has an equal chance of being selected.
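
    The day-to-day carryover can be sketched as follows. This is illustrative only and is not official IRM guidance; the daily volumes (21 cases Monday, 24 cases Tuesday) and the random start of 8 are assumed figures chosen to reproduce the example above.

      def select_across_days(daily_volumes, skip, random_start):
          # Yields (day, case_number) for each selected case, carrying the
          # skip-interval count across day boundaries.
          countdown = random_start            # count until the next selection
          for day, volume in enumerate(daily_volumes, start=1):
              case = countdown
              while case <= volume:
                  yield day, case
                  case += skip
              countdown = case - volume       # remainder carried to the next day

      # Monday (21 cases): cases 8 and 16 are selected, leaving 5 counted cases
      # after case 16. Tuesday (24 cases): the count resumes, so cases 3, 11,
      # and 19 are selected, matching the example above.
      for day, case in select_across_days([21, 24], skip=8, random_start=8):
          print(day, case)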

  8. If the size of the population is not known, then a skip interval cannot be calculated and another method must be used to select the sample. One alternative is to manually spread the sample among the population, as follows.

    1. Establish the time period of the estimate and the sample size.

    2. Using the sample size and the number of business days in the sampling period, determine the average daily sample size.

    3. Depending on the flow of the work for this particular review, manually spread the sample appropriately among the days of the week. If you expect the work to be distributed evenly across the week and hours of the day, then divide the sample evenly among the days of the week. In addition, if you expect the work to be distributed evenly throughout individual days, then the sample can be split evenly between the morning and afternoon hours. If, on the other hand, it is known that certain types of work are more likely to occur on certain days (e.g., Mondays, Tuesdays) or during certain times of day (e.g., afternoons), then the sample can be shifted accordingly to follow the workload more accurately.

    4. Spread the sample appropriately among each member of the team or unit performing the type of work being reviewed.

    5. When selecting cases for review during designated dates and times, use one of the methods in IRM 21.10.1.3.2.1(5) to incorporate randomness; see the sketch following this list.

      Example:

      If a case must be selected from the 50 cases handled by a particular employee on Tuesday morning, then a random number table can be used to select a random number between 1 and 50.

    6. Document all decisions made and procedures used throughout the process of manually spreading the sample across the population.

      Note:

      Because sampled cases are selected without the use of a skip interval, it is not automatically ensured that the sample is spread appropriately across the time period being measured and among the assistors included in the review. It is also not ensured that all cases in the population will have an equal chance of being selected. For these reasons, samples selected using the above procedures will have some amount of bias. Selecting the "most random sample possible" given local resources will help minimize this bias.
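
    A minimal sketch of the random selection in the example above, using a pseudorandom number generator in place of a random number table (illustrative only, not official IRM guidance):

      import random

      cases_handled = 50                           # Tuesday morning volume
      selected = random.randint(1, cases_handled)  # random number from 1 to 50
      print(f"Review case number {selected} of {cases_handled}")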

21.10.1.3.2.2  (10-01-2013)
Revising the Quality Sample

  1. If the site experiences much higher or lower volumes than predicted, the site may change its skip interval within the quarter.

    1. The new skip interval may be implemented ONLY at the beginning of a sampling month, NEVER in the middle of a sampling month. Because NQRS generates weighted reports monthly, skip intervals must remain constant within any given month.

    2. Never simply grab extra cases, drop selected cases, seek out cases of special interest, or use different methods to select cases in the same sample. Each of these situations could lead to a sample that provides biased results.

    3. Contact the Headquarters quality analyst for assistance in determining the new skip interval.

21.10.1.3.2.3  (10-01-2007)
Weighted Sampling

  1. In sampling, every sampled case represents a certain number of cases in the population. The exact number of cases a sampled case represents will depend on both the sample size and the actual size of the population from which it was selected. When a quality estimate is a combination of two or more separate samples (e.g., a fiscal year report for a single SPRG for a single site), it is necessary to account for the fact that each sampled case included in the overall estimate may not represent the same number of cases in the overall population. Weighting is used to ensure that every sampled case has the appropriate amount of influence on the overall cumulative estimate.

    Example:

    A quality estimate for a single SPRG in a single site for a planning period may consist of three individual samples, one from each month. Therefore, the planning period quality estimate is weighted by the three individual monthly SPRG volumes. This will make certain that each month’s influence on the planning period estimate is directly related to the total number of cases handled during that month.
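
    The weighting described above amounts to a volume-weighted average of the monthly estimates. The sketch below is illustrative only and is not official IRM guidance; the monthly quality rates and volumes are assumed figures.

      months = [
          # (monthly sample quality estimate, total cases closed that month)
          (0.90, 12000),
          (0.84,  9000),
          (0.88, 15000),
      ]

      total_volume = sum(volume for _, volume in months)
      weighted = sum(rate * volume for rate, volume in months) / total_volume
      unweighted = sum(rate for rate, _ in months) / len(months)

      print(f"Weighted estimate:   {weighted:.1%}")    # 87.7%
      print(f"Unweighted estimate: {unweighted:.1%}")  # 87.3%

    The busiest month pulls the weighted estimate toward its own rate, which is exactly the influence on the cumulative estimate that weighting is designed to capture.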

  2. NQRS provides both weighted and unweighted estimates of quality.

  3. Unweighted estimates that combine more than one site, time period, or SPRG are not considered statistically valid. Such estimates should only be used internally. Their statistical limitations should be taken into consideration when basing business decisions on them.

21.10.1.3.3  (10-01-2012)
Quality Review Time Reporting

  1. See IRM 25.8.1, OFP Codes Overview, for appropriate Work Planning and Control (WP&C) Organization, Function, and Program (OFP) time reporting codes.

21.10.1.3.4  (10-01-2012)
Quality Review Records Retention

  1. IRM 1.15.21, Records Control Schedule for Strategic Planning Division, provides specific guidelines on the retention period for National Quality Review System (NQRS) records. National quality review printed reports may be destroyed when superseded or no longer needed. Source documentation relating to non-evaluative national and local product reviews may be destroyed after data input has been validated.

  2. EQRS records are systemically removed from the database after 5 years. NQRS records are removed after 7 years.

21.10.1.4  (10-01-2013)
Quality Review of Phone Calls

  1. CQRS monitors recorded contacts through the use of the Ultra 10 software system. These monitors are conducted for all Accounts Management toll-free phone SPRGs (except for National Taxpayer Advocate), ACS Phones for Compliance Services, e-help Phones for Electronic Products and Services, and all Field Assistance (CARE) product lines (except for Adjustments). CPAS monitors the AUR Phones, BMF AUR Phones, Exam Phones and Innocent Spouse Phones SPRGs. PAS monitors the remaining Compliance Services Phone SPRGs. The data from these reviews may be used for the Business Results of the Balanced Measures.

  2. Managerial reviews of these phone calls are not included in the Business Results calculations; they are used in employee evaluative documentation and to identify training issues.

  3. The Verint Ultra 10 Contact Recording (CR) system is a telephone tool used by Accounts Management, Electronic Products and Services Support, and Compliance Services to record incoming "toll free" telephone contacts, some of which may be selected for quality review. Incoming calls are answered with an additional announcement that states, "Your call may be monitored or recorded for quality purposes." The Verint Ultra system records the audio, and occasionally captures the screen, of all telephone calls coming into the Service via the ASPECT Communication voice response unit. See IRM 21.10.1.2 (7) for more information on CR.

  4. Managers and Quality Analysts use CR to perform required random reviews of incoming telephone contacts. CR allows for a more cost-effective review as there is no lag time between calls. The PAS/CPAS analysts use a Shared In-Box to retrieve their daily sample. Reviewers must include the CR identification number or Router Call Key on the Data Collection Instrument used to capture the call review. Calls recorded in this system are available for national review the next business day, and every effort should be made to complete the national review daily.

  5. The system stores data by Standard Employee Identifier (SEID) for 45 days on calls that are not reviewed and 60 days on calls that are reviewed; retention of up to a maximum of 18 months can be requested.

    Note:

    At this time the COIC Phones SPRG will not use the CR system because they are not using the ASPECT Communication voice response unit.

  6. On a call that was Contact Recorded, if the taxpayer requests that the recording be stopped (known as "Stop on Demand"), CQRS/PAS/CPAS will not review the call. If a "Stop on Demand" call is randomly selected for the national sample, it will be rejected and systemically replaced by CR.

  7. When performing a telephone review, the analyst will use the employee's identification number provided on the call:

    • If the analyst is unable to capture the employee's identification number on the call, the last name of the employee, as captured during the review, will be entered into the Employee’s Name field.

    • CQRS will capture the employee's SEID (as it appears on CR from the PBXID field on Ultra 10) as the identification number.

      Note:

      If an employee's workstation is not configured properly, their extension will show and will be captured instead of the SEID.

    • PAS/CPAS analysts will use the following identifiers in the Employee Name field if the situation warrants:

      1. U – If PAS/CPAS analysts are unable to determine both the ID # and the last name,

      2. N – If the employee does not give either an ID # or a name at any time during the conversation,

      3. I – If the PAS/CPAS analyst could not capture the full ID # or last name, but was able to get a portion of the ID # (less than 10 digits). An example for this situation would be 99999985XX (last 2 digits not captured). If the ID number is given and the analyst got the first two numbers, could not decipher the middle two numbers, but got the last 3, the analyst should indicate 99999XX521 in the employee name section.

  8. For a sample call to be counted as a phone review, the taxpayer does not have to remain on the line until all adjustment actions are complete. Even if the employee does not complete all work until the next business day, the call is still counted as part of the sample.

  9. If an Accounts Phone call subject becomes a Tax Law or ACS issue, or vice versa, code the complete call for Professionalism and Timeliness, and code any issue(s) addressed for Accuracy. If the call is transferred and no action was taken to resolve the taxpayer's issue, code the case for all applicable buckets except Customer Accuracy, which will be "not applicable." When this happens, code attribute 004 Call Transfer.

  10. The Master Attribute Job Aids (MAJA) for the phone Product Lines and SPRGs are located at http://eq.web.irs.gov.

21.10.1.4.1  (02-23-2012)
Accounts Phones Product Line

  1. The Accounts Phones Product Line consists of six Specialized Product Review Groups (SPRGs): Employer Identification Number (EIN), General Account Calls, International, National Taxpayer Advocate (NTA), Practitioner Priority Service (PPS), and Spanish calls.

  2. Reviewing Accounts Phones allows us to monitor and improve the quality of responses to a taxpayer's questions about his/her account.

  3. An incorrect/incomplete action (per IRM guidelines) that results in an incorrect calculation is charged as a defect in the national review only when the error exceeds a $5.00 threshold.

21.10.1.4.1.1  (10-01-2006)
Accounts Phones Measure

  1. Accounts Phones will be measured for Timeliness, Professionalism, Customer Accuracy, Regulatory/Statutory Accuracy and Procedural Accuracy. These are the measures that are available and may be reported under the Balanced Measurement System. See IRM 21.10.1.7.3 for more information on the measures.

21.10.1.4.1.2  (10-01-2003)
Definition of EIN Calls SPRG

  1. EIN calls include any questions received on the ASPECT EIN application(s), such as:

    1. Any call relating to a taxpayer’s request for an Employer Identification Number (EIN).

    2. Any call regarding procedural issues (how to complete Form SS-4, where to fax/mail Form SS-4 etc.).

21.10.1.4.1.3  (10-01-2006)
Sample Procedures for EIN Calls

  1. SOI develops sampling plans for EIN Phone calls monitored at CQRS. Samples from CQRS are valid at the site level on at least a quarterly basis and nationally on a monthly basis.

  2. CR and the associated screen shots may be used by managers or CQRS to evaluate the contact.

  3. A site may want to perform local reviews to aid in the quality improvement of the product line. Local review sampling guidelines have been included. See IRM 21.10.1.3.2 and IRM 21.10.1.3.2.1.

  4. Managerial reviews are not subject to a sampling plan.

21.10.1.4.1.4  (10-01-2006)
Definition of General Account Calls SPRG

  1. General Account Calls include any questions received on the ASPECT Accounts Phone Balance Due, Advanced Accounts or Procedural applications for General Account Calls. This does not include calls received on the designated Spanish or International Account applications. General Accounts calls include:

    1. Any call relating to a taxpayer's account (Individual Master File (IMF) or Business Master File (BMF)),

    2. Any call regarding entity information, the processing of a tax return, corrections to errors found during processing, or corrections resulting from adjustments or audit assessments,

    3. Any call regarding procedural issues (where to file a return, when and where to make payments, etc.).

21.10.1.4.1.5  (10-01-2008)
Sample Procedures for General Accounts Calls SPRG

  1. SOI develops sampling plans for Account Phone calls monitored at CQRS. Samples from CQRS are valid at the site level on a monthly basis.

  2. CR and the associated screen shots may be used by managers or CQRS to evaluate the contact.

  3. A site may want to perform local reviews to aid in the quality improvement of the product line. Local review sampling guidelines have been included. See IRM 21.10.1.3.2 and IRM 21.10.1.3.2.1.

  4. Managerial reviews are not subject to a sampling plan.

21.10.1.4.1.6  (10-01-2003)
Definition of International Calls SPRG

  1. International Calls include any questions received on the designated ASPECT applications for International, such as:

    1. Any international (foreign, non-resident, etc.) call relating to a taxpayer's account (IMF or BMF),

    2. Any international call regarding entity information, the processing of a tax return, corrections to errors found during processing or corrections resulting from adjustments or audit assessments,

    3. Any international call regarding procedural issues (where to file a return, when and where to make payments, etc.).

21.10.1.4.1.7  (10-01-2007)
Sample Procedures for International Calls

  1. SOI Staff develops a combined sample plan for International Tax Law and Accounts Phone calls monitored at CQRS. Sample size from CQRS is valid at the site level on a quarterly basis.

  2. CR and the associated screen shots may be used by managers or CQRS to evaluate the contact.

  3. A site may want to perform local reviews to aid in the quality improvement of the product line. Local review sampling guidelines have been included. See IRM 21.10.1.3.2 and IRM 21.10.1.3.2.1.

  4. Managerial reviews are not subject to a sampling plan.

21.10.1.4.1.8  (10-01-2003)
Definition of NTA Calls SPRG

  1. National Taxpayer Advocate calls include any calls relating to a taxpayer's account (IMF or BMF) received on the designated NTA ASPECT applications.

21.10.1.4.1.9  (10-01-2008)
Sample Procedures for NTA Calls

  1. CR and the associated screen shots may be used by managers to evaluate the contact.

  2. A site may want to perform local reviews to aid in the quality improvement of the product line. Local review sampling guidelines have been included. See IRM 21.10.1.3.2 and IRM 21.10.1.3.2.1.

  3. Managerial reviews are not subject to a sampling plan.

21.10.1.4.1.10  (10-01-2003)
Definition of PPS SPRG

  1. PPS Calls include any call from a tax practitioner relating to his or her client's/taxpayer's account (IMF or BMF) or any other questions received on the ASPECT applications for PPS.

21.10.1.4.1.11  (10-01-2006)
Sample Procedures for PPS

  1. SOI develops sampling plans for PPS calls monitored at CQRS. Samples from CQRS are valid at the site level on at least a quarterly basis and nationally on a monthly basis.

  2. CR and the associated screen shots may be used by managers or CQRS to evaluate the contact.

  3. A site may want to perform local reviews to aid in the quality improvement of the product line. Local review sampling guidelines have been included. See IRM 21.10.1.3.2 and IRM 21.10.1.3.2.1.

  4. Managerial reviews are not subject to a sampling plan.

21.10.1.4.1.12  (10-01-2003)
Definition of Spanish Tax Law and Account Calls SPRG

  1. Spanish Calls include any questions received on the ASPECT applications for Spanish Calls including:

    • Any call relating to a tax law question from the taxpayer

    • Any call relating to a taxpayer's account (IMF or BMF)

    • Any call regarding entity information, the processing of a tax return, corrections to errors found during processing, or corrections resulting from adjustments or audit assessments

    • Any call regarding procedural issues (where to file a return, when and where to make payments, etc.)

21.10.1.4.1.13  (10-01-2007)
Sample Procedures for Spanish Account Calls

  1. SOI Staff develops a combined sampling plan for Spanish Tax Law and Spanish Accounts calls monitored at CQRS. Samples from CQRS (which include Spanish Accounts and Spanish Tax Law combined) are valid at the national level on a quarterly basis. Statistical validity at the site level varies for Puerto Rico based on time of year. For all other sites combined, the statistical validity is quarterly.

  2. CR and the associated screen shots may be used by managers or CQRS to evaluate the contact.

  3. A site may want to perform local reviews to aid in the quality improvement of the product line. Local review sampling guidelines have been included. See IRM 21.10.1.3.2 and IRM 21.10.1.3.2.1.

  4. Managerial reviews are not subject to a sampling plan.

21.10.1.4.2  (10-01-2002)
ACS Phones Product Line

  1. Automated Collection System (ACS) Phone calls are reviewed to measure and improve the quality of our responses to taxpayer inquiries about balance due and return delinquency accounts.

21.10.1.4.2.1  (04-08-2008)
Definition of ACS Phones Product Line

  1. ACS is a computerized inventory system of balance due accounts and return delinquency accounts after normal notice routines occur.

  2. ACS Phones calls are defined as any call received on an IMF or BMF account in Status 22 or Taxpayer Delinquency Investigation (TDI) status assigned to ACS, and any other calls received on the ASPECT ACS application.

  3. An incorrect/incomplete action (per IRM guidelines) that results in an incorrect calculation is charged as a defect in the national review only when the error exceeds a $5.00 threshold.

21.10.1.4.2.2  (01-05-2011)
Sample Procedures for ACS Phones

  1. SOI Staff develops sampling plans for ACS Phones calls monitored at CQRS. Samples from CQRS are valid at the site level on a quarterly basis and nationally on a monthly basis.

  2. All ACS Phones calls will be included in the universe of calls subjected to sampling per the SOI algorithm. This includes cases reviewed by managers, On-the-Job Instructors (OJIs), and cases subjected to 100% review.

  3. Local reviews are not performed for the national measure of the ACS Phones product line and therefore, are not included in the sampling plan. However, if time and staffing permits, each site should also perform local reviews to aid in the quality improvement of the product line. See IRM 21.10.1.3.2 and IRM 21.10.1.3.2.1 for more information on local reviews.

  4. CR and the associated screen shots may be used by managers or CQRS to evaluate the contact.

  5. Managerial reviews are not subject to a sampling plan.

21.10.1.4.2.3  (10-01-2003)
ACS Phones Measures

  1. ACS Phones will be measured for Timeliness, Professionalism, Customer Accuracy, Regulatory/Statutory Accuracy, and Procedural Accuracy. These are the measures that are available and may be reported under the Balanced Measurement System. See IRM 21.10.1.7.3 for more information on the measures.

21.10.1.4.2.4  (10-01-2011)
Roles and Responsibilities of the ACS Phones Analyst

  1. The Centralized Quality Review System (CQRS) Analyst for ACS Phones will complete an unbiased, consistent, and accurate review of ACS Phones including follow-up actions taken once the taxpayer has hung up. Even if the call site employee does not complete all work until the next business day, the call is still counted as part of the sample.

  2. ACS Phones Quality analysts will complete an NQRS Data Collection Instrument (DCI) for each case reviewed. All appropriate fields will be completed to indicate whether quality standards were met. Analysts’ narratives will provide the basis for their findings and include applicable IRM references for coded defects. Whenever Attribute 715 is coded with a defect, the driver must be indicated in parentheses immediately following "715" in the Feedback/Summary Remarks on the DCI, i.e., N=715, Correct/Complete Response Resolution (003). NQRS reviews will provide a basis for defect analysis and reporting error trends.

    Note:

    Before Attribute 508 Appropriate Procedural Action/Answer is coded, the analyst should research to see if another attribute describes the action/answer given by the employee. Only use this attribute as a last resort. When it is decided to code Attribute 508 either "Y" or "N", clearly explain in the Feedback Summary section of the DCI why this attribute was selected. If a defect is charged, clearly describe the defect and provide an IRM reference to support the coding.

  3. Enter the word FLASH in the Feedback Summary Remarks section of the DCI to identify a defect that requires immediate (by the next business day) corrective action by the operation. For example, recalling a notice/letter before it is issued or correcting an adjustment to an account. See IRM 21.10.1.7.7, EQRS/NQRS Remarks Section for additional information.

    Note:

    Defects requiring correction that are not annotated with FLASH are to be completed by the Operation within five working days. See IRM 21.10.1.2.7.5.

  4. Review data will be input to NQRS within 24 hours of the review.

  5. Refer to the Embedded Quality Website http://eq.web.irs.gov weekly to glean updated information on the use of attributes in the ACS Phones review, obtain the latest Master Attribute Job Aid (MAJA) and Quality Grams, etc.

  6. Consult the ACS Phones Product Line Analyst for coding assistance or to interpret attribute usage, whenever necessary.

21.10.1.4.3  (10-01-2008)
ASFR Phones Product Line

  1. ASFR Phones are reviewed to measure and improve the quality of responses given to taxpayer inquiries received on the ASFR and ASFR Refund Hold toll-free lines.

21.10.1.4.3.1  (10-01-2013)
Roles and Responsibilities of the ASFR Phone Analyst

  1. The ASFR Phone PAS Analysts will complete an unbiased, consistent, and accurate review of ASFR and ASFR Refund Hold calls.

  2. ASFR Phones PAS Analysts will review the entire call to identify actions required. Analysts will ensure that appropriate actions are updated on IDRS, ASFR systems and/or AMS and the actions taken clearly support the disposition of the call as required by the procedural IRM of the SPRG.

  3. ASFR Phone PAS Analysts will complete an NQRS Data Collection Instrument (DCI) for each case reviewed, using the MAJA for the SPRG as guidance for coding. All appropriate fields will be completed to indicate whether quality standards were met. Analysts’ narratives will provide the basis for their findings and include applicable IRM references for coded defects. NQRS reviews will provide a basis for defect analysis and reporting error trends.

  4. Enter the word FLASH in the Feedback Summary Remarks section of the DCI to identify a defect that requires immediate (by the next business day) corrective action by the operation. For example, recalling a notice/letter before it is issued or correcting an adjustment to an account. See IRM 21.10.1.7.7, EQRS/NQRS Remarks Section for additional information.

    Note:

    Defects requiring correction that are not annotated with FLASH are to be completed by the Operation within five working days. See IRM 21.10.1.2.7.5.

  5. Review data will be input daily to NQRS, whenever possible.

  6. Refer to the Embedded Quality Website http://eq.web.irs.gov weekly to glean updated information on the use of attributes, obtain the latest Quality Job Aid, Quality Gram, etc.

  7. In a monthly report to be shared with the ASFR Operation, provide suggestions for improvement by:

    • Identifying most frequently occurring defects

    • Analyzing root causes of defects

    • Verifying sampling plans and guidelines

    • Reviewing methods used to capture needed information

  8. It is also recommended that NQRS analysts meet at least quarterly with the EQRS analysts to confer, compare and review attribute usage. These meetings should be used as a forum to discuss and agree on the use of each attribute in the ASFR Phones Smart DCI. The Product Line Analyst should be consulted for assistance with interpreting attribute usage when necessary.

