21.10.1  Embedded Quality (EQ) Program for Accounts Management, Campus Compliance Services, Field Assistance, Tax Exempt/Government Entities, Return Integrity and Correspondence Services (RICS) – Integrity and Verification Operations, and Electronic Products and Services Support

Manual Transmittal

September 26, 2014

Purpose

(1) This transmits revised IRM 21.10.1, Quality Assurance - Embedded Quality (EQ) Program for Accounts Management, Compliance Services, Field Assistance, Tax Exempt/Government Entities, Integrity and Verification Operations, and Electronic Products and Services Support.

Material Changes

(1) IPU 14U0465 issued 03-10-2014 IRM 21.10.1.2 Updated to add references to the AM Identity Theft Phones and Paper SPRGs and the Adjustments Paper Non-CIS SPRG.

(2) IPU 14U0465 issued 03-10-2014 IRM 21.10.1.2.7.7(6) Updated to add the Coding Consistency Course to the suggested quality analyst training.

(3) IPU 13U1624 issued 11-06-2013 IRM 21.10.1.3 - Moved several paragraphs from 21.10.1.3.1 to this section.

(4) IPU 14U0206 issued 01-29-2014 Updated IRM 21.10.1.3 (10) - editorial change.

(5) IPU 13U1624 issued 11-06-2013 IRM 21.10.1.3.1 - Moved several paragraphs from this section to 21.10.1.3. Switched paragraphs 2 and 3 and made a slight change to the new paragraph 2.

(6) IPU 13U1744 issued 12-13-2013 Updated IRM 21.10.1.3, 21.10.1.4.14.4 and 21.10.1.4.14.8 to include a reference to a change in what is in-scope for Accounts Management Tax Law beginning January 2nd.

(7) IPU 14U0465 issued 03-10-2014 IRM 21.10.1.4.1 Updated to include a reference to AM Identity Theft Phones.

(8) IPU 14U0465 issued 03-10-2014 IRM 21.10.1.4.1.14 Added new section, Definition of AM Identity Theft Phones.

(9) IPU 14U0465 issued 03-10-2014 IRM 21.10.1.4.1.15 Added new section, Sample Procedures for AM Identity Theft Phones.

(10) IPU 14U0206 issued 01-29-2014 Updated IRM 21.10.1.4.6 (2) - added reference to IRM 1.4.51.

(11) IPU 14U0206 issued 01-29-2014 Updated IRM 21.10.1.4.6.1 (3) and (4) - editorial change.

(12) IPU 14U0206 issued 01-29-2014 Updated IRM 21.10.1.4.10 (3) to remove reference to the Examination Toll Free Telephone Guide.

(13) IPU 14U0465 issued 03-10-2014 IRM 21.10.1.5.1, IRM 21.10.1.5.1.2 and IRM 21.10.1.5.1.3.2 Updated to include a reference to the AM Identity Theft Paper SPRG.

(14) IPU 14U0465 issued 03-10-2014 IRM 21.10.1.5.1.2 Revised list of items "excluded" from Accounts Paper.

(15) IPU 14U0465 issued 03-10-2014 IRM 21.10.1.5.1.3.3 Editorial change.

(16) IPU 14U0465 issued 03-10-2014 IRM 21.10.1.5.1.3.3.2 Updated to add a reference to AM Identity Theft Paper.

(17) IPU 14U0206 issued 01-29-2014 Updated IRM 21.10.1.5.9 (3) - editorial change.

(18) IPU 14U0206 issued 01-29-2014 Updated IRM 21.10.1.5.9.1 (6) and (7) - editorial change.

(19) IPU 14U0206 issued 01-29-2014 Updated IRM 21.10.1.5.9.4 (1) to remove the reference to a Professionalism measure.

(20) IPU 14U0569 issued 03-27-2014 IRM 21.10.1.7.7(2) - Updated to specifically reference including any missed letters in the remarks.

(21) IPU 14U0206 issued 01-29-2014 Updated IRM 21.10.1.8.6, Centralized PAS Rebuttal Procedures - Accounts Management.

(22) IPU 14U0926 issued 05-29-2014 IRM 21.10.1.8.6 - Revised Centralized PAS Rebuttal Procedures for Accounts Management.

(23) IPU 14U0465 issued 03-10-2014 IRM 21.10.1.9.3 Updated to add a reference to IRM 21.3.4.3.4 and removed sections 21.10.1.9.3.1, 21.10.1.9.3.1.2 and 21.10.1.9.3.1.3.

(24) IPU 14U0569 issued 03-27-2014 IRM 21.10.1.9.3 - Updated verbiage slightly.

(25) IPU 14U0706 issued 04-16-2014 IRM 21.10.1.9.5.1 - Updated to add PPS Customer Satisfaction Survey transfer numbers to new sites taking PPS calls.

(26) IPU 14U0206 issued 01-29-2014 Updated IRM 21.10.1.9.8.1 to add a Spanish Language version of the script.

(27) IPU 14U0569 issued 03-27-2014 IRM 21.10.1.9.8.1 - Updated to add "SPN" to the teleset display message for Spanish Language Survey Calls.

(28) IPU 13U1624 issued 11-06-2013 IRM 21.10.1.9.8.2 - Updated Toll-Free Transfer Numbers.

(29) IPU 14U0206 issued 01-29-2014 Updated IRM 21.10.1.9.4.2 (2), Telephone Customer Satisfaction Surveys, Spanish Language Script.

(30) IPU 14U0595 issued 04-01-2014 IRM 21.10.1.9.9 - Updated Customer Satisfaction survey telephone numbers for e-help phones.

(31) IPU 13U1744 issued 12-13-2013 Updated IRM 21.10.1.9.14, Injured Spouse Customer Satisfaction Surveys.

(32) IPU 14U1019 issued 06-18-2014 IRM 21.10.1.9.4 - Revised section.

(33) IPU 14U1019 issued 06-18-2014 IRM 21.10.1.9.4.1 - Revised section.

(34) IPU 14U1019 issued 06-18-2014 IRM 21.10.1.9.4.2 - Section removed.

(35) IPU 14U1019 issued 06-18-2014 IRM 21.10.1.9.5 - Removed Practitioner Priority Services (PPS) Customer Satisfaction Survey and combined in new section IRM 21.10.1.9.5, W&I CAS/ACS Telephone Customer Satisfaction Survey.

(36) IPU 14U1053 issued 06-25-2014 IRM 21.10.1.9.5 - Changed section title to include SB/SE ACS.

(37) IPU 14U1053 issued 06-25-2014 IRM 21.10.1.9.6 - Added Spanish Language script to the W&I AUR Telephone Customer Satisfaction Survey to be consistent with SB/SE.

(38) IPU 14U1019 issued 06-18-2014 IRM 21.10.1.9.6 - Removed TE/GE Telephones Customer Satisfaction Survey information and combined with section IRM 21.10.1.9.5 W&I CAS/ACS Telephone Customer Satisfaction Survey.

(39) IPU 14U1019 issued 06-18-2014 IRM 21.10.1.9.7 - Revised and renumbered to IRM 21.10.1.9.6.

(40) IPU 14U1019 issued 06-18-2014 IRM 21.10.1.9.7.1 - Section removed.

(41) IPU 14U1019 issued 06-18-2014 IRM 21.10.1.9.7.2 - Section removed.

(42) IPU 14U1019 issued 06-18-2014 IRM 21.10.1.9.8 - Revised and renumbered to IRM 21.10.1.9.7.

(43) IPU 14U1109 issued 07-07-2014 IRM 21.10.1.9.8 - Updated the e-help Phones Customer Satisfaction Survey procedures to include the speed dial numbers effective July 7, 2014.

(44) IPU 14U1019 issued 06-18-2014 IRM 21.10.1.9.8.1 - Section removed.

(45) IPU 14U1019 issued 06-18-2014 IRM 21.10.1.9.8.2 - Section removed.

(46) IPU 14U1019 issued 06-18-2014 IRM 21.10.1.9.9 - Renumbered to IRM 21.10.1.9.8.

(47) IPU 14U1019 issued 06-18-2014 IRM 21.10.1.9.10 - Revised and renumbered to IRM 21.10.1.9.9.

(48) IPU 14U1019 issued 06-18-2014 IRM 21.10.1.9.10.1 - Section removed.

(49) IPU 14U1019 issued 06-18-2014 IRM 21.10.1.9.10.2 - Section removed.

(50) IPU 14U1019 issued 06-18-2014 IRM 21.10.1.9.11 - Renumbered to IRM 21.10.1.9.10.

(51) IPU 14U1019 issued 06-18-2014 IRM 21.10.1.9.12 - Renumbered to IRM 21.10.1.9.11.

(52) IPU 14U1019 issued 06-18-2014 IRM 21.10.1.9.13 - Renumbered to IRM 21.10.1.9.12.

(53) IPU 14U1019 issued 06-18-2014 IRM 21.10.1.9.14 - Renumbered to IRM 21.10.1.9.13.

(54) IPU 14U0926 issued 05-29-2014 Exhibit 21.10.1-6 - Removed broken links to the Probe and Response Guide.

(55) IPU 14U0926 issued 05-29-2014 Exhibit 21.10.1-11 - Updated NQRS Month Ending Cutoff Dates - Cycle SPRGs table.

(56) IPU 14U0926 issued 05-29-2014 Exhibit 21.10.1-12 - Updated NQRS Month Ending Cutoff Dates - Calendar SPRGs table.

(57) IPU 13U1744 issued 12-13-2013 Exhibit 21.10.1-13 - Updated to correct typos and add Accounts Phones - AM Identity Theft Phones.

(58) IRM 21.10.1 - Editorial changes have been made throughout this IRM.

(59) IRM 21.10.1.2 - Removed ASFR Recon and ASFR Refund Hold.

(60) IRM 21.10.1.2(3) - Removed FA Accounts, FA Tax Law and FA Return Preparation SPRGs, added FA Contacts SPRG.

(61) IRM 21.10.1.2(3) - Removed Exam Phones Cold Calls, Exam Phones Outgoing Calls, IS Phones Cold Calls, IS Phones Outgoing SPRGs; Added Exam Phones and IS Phones SPRGs.

(62) IRM 21.10.1.2.1(2) - Removed the FA Return Preparation SPRG.

(63) IRM 21.10.1.2.7.6(6) - Expanded on Quality Manager responsibility cases that require rework.

(64) IRM 21.10.1.2.7.7(6) - Added list item for suggested training topics.

(65) IRM 21.10.1.2.9(3) - Removed FA Return Preparation SPRG.

(66) IRM 21.10.1.3 - Added references to IRM 10 and 25.

(67) IRM 21.10.1.3.1(1) - Clarified time period when quality defects can be charged.

(68) IRM 21.10.1.3.1(2) - Added statement that deviations are not retroactive.

(69) IRM 21.10.1.3.2(1) - Removed paragraph.

(70) IRM 21.10.1.3.2(6) - Clarified sample design.

(71) IRM 21.10.1.3.2(8) - Clarified information regarding local reviews.

(72) IRM 21.10.1.3.2.2 - Skip interval change requires HQ approval.

(73) IRM 21.10.1.3.4 - Changed reference to IRM 1.15.21 to Document 12990.

(74) IRM 21.10.1.4.1 - Clarified that the five-dollar threshold relates to Customer Accuracy defects.

(75) IRM 21.10.1.4.1.11 - Changed to show validity at the site level on a monthly basis.

(76) IRM 21.10.1.4.1.13 - Changed to show validity at the site level on a monthly basis.

(77) IRM 21.10.1.4.1.15 - Specified SOI sample here was for AM ID Theft Calls.

(78) IRM 21.10.1.4.1.3 - Changed to show validity on a bi-monthly basis.

(79) IRM 21.10.1.4.10 - Removed reference of two separate SPRGs.

(80) IRM 21.10.1.4.12.3 - Changed to show validity at the national level on a monthly basis.

(81) IRM 21.10.1.4.13 - Removed reference of two separate SPRGs.

(82) IRM 21.10.1.4.14.4 - Changed to indicate a national only measure beginning in FY14.

(83) IRM 21.10.1.4.14.8(1) - Changed to reflect that the sample for Accounts Phones Spanish is valid on a monthly basis.

(84) IRM 21.10.1.4.16.1(8) - Specified this relates to WHC and CSCO Phones.

(85) IRM 21.10.1.4.16.2(2) - Added paragraph on WHC Phones.

(86) IRM 21.10.1.4.2.2 - Changed to reflect that the sample for ACS Phones is valid monthly at the site and national level.

(87) IRM 21.10.1.4.5.5 - Added Regulatory and Procedural Accuracy.

(88) IRM 21.10.1.4.9.2 - Changed to show validity at the site level on a monthly basis.

(89) IRM 21.10.1.5.1 - Clarified that the five-dollar threshold relates to Customer Accuracy defects.

(90) IRM 21.10.1.5.1.2 - Updated information on received date determination on Accounts Paper.

(91) IRM 21.10.1.5.1.3.3 - Paragraph on CIS sampling added.

(92) IRM 21.10.1.5.1.3.3.2 - Section removed and CIS sampling included in 21.10.1.5.1.3.3.

(93) IRM 21.10.1.5.11(1) - Added new paragraph.

(94) IRM 21.10.1.5.11(2) - Added local review clarification.

(95) IRM 21.10.1.5.11.3(3) - Updated sample for Group Code 81.

(96) IRM 21.10.1.5.11.3(5) - Merged paragraph into paragraph 4.

(97) IRM 21.10.1.5.7.1(8) - Added paragraph about analyst meeting time frames.

(98) IRM 21.10.1.5.7.7 - Added Regulatory and Procedural Accuracy.

(99) IRM 21.10.1.6 - Removed FA Return Preparation SPRG and the term "counter".

(100) IRM 21.10.1.6.1.2(1) - Removed FA Return Preparation SPRG.

(101) IRM 21.10.1.6.2(1) - Redefined Field Assistance Product Line.

(102) IRM 21.10.1.6.2(1) a-d - Removed bullets.

(103) IRM 21.10.1.6.2(2) - Removed paragraph.

(104) IRM 21.10.1.7.12.2 - Updated the note about Accounts Paper volume input.

(105) IRM 21.10.1.7.12.2(4) - Added a bullet regarding Campus Compliance volume input.

(106) IRM 21.10.1.7.5 - Added paragraph 3 related to the review date field for Campus Compliance.

(107) IRM 21.10.1.7.7(2) - Expanded the Flash instructions in the Feedback Summary Remarks.

(108) IRM 21.10.1.7.7(3) - Expanded instructions where corrective action is needed.

(109) IRM 21.10.1.7.8 - Slight change to the verbiage in paragraph (1).

(110) IRM 21.10.1.8.2 (2) - Added a note in paragraphs (2), (5) and (8).

(111) IRM 21.10.1.8.2(8) - Slight change to 3rd level rebuttal verbiage for AM CQRS reviews.

(112) IRM 21.10.1.8.3(1) - Added statement about other header fields.

(113) IRM 21.10.1.8.3(3) - Removed paragraph.

(114) IRM 21.10.1.8.3(5) - Merged paragraph into paragraph 4.

(115) IRM 21.10.1.8.3(6) - Merged last two list items.

(116) IRM 21.10.1.8.6(10) - Removed third If/Then due to redundancy.

(117) IRM 21.10.1.9.11 - Revised section on Accounts Management Adjustments Customer Satisfaction Survey, added 1320X to EO.

(118) IRM 21.10.1.9.11.1 - Sites are no longer required to retain case data from the Adjustments Surveys of Correspondence.

(119) IRM 21.10.1.9.7(2) b - Removed "and the required call summarization" from the instructions.

(120) Exhibit 21.10.1-1 - Removed Exhibit and renumbered all others.

(121) Exhibit 21.10.1-7 - Added attributes 015 and 754.

(122) Exhibit 21.10.1-8 - Added IC, ID, IE, IP, IS, IU.

(123) Exhibit 21.10.1-14 - Removed FA Accounts, FA Tax Law and FA Return Preparation SPRGs, added FA Contacts SPRG.

(124) Exhibit 21.10.1-14 - Removed ASFR Recon and ASFR Refund Hold.

(125) Exhibit 21.10.1-14 - Changed Exam Phones Cold Calls, IS Phones Cold Calls to Exam Phones and IS Phones respectively.

(126) Exhibit 21.10.1-14 - Changed some Campus Compliance paper SPRGs from closed date to review date SPRGs.

(127) Exhibit 21.10.1-16 - Removed Exhibit.

Effect on Other Documents

IRM 21.10.1, dated September 20, 2013 (effective October 1, 2013), is superseded. The following IRM Procedural Updates (IPUs), issued from November 6, 2013 to July 7, 2014, have been incorporated into this IRM: IPU 13U1624, 13U1744, 14U0206, 14U0465, 14U0569, 14U0595, 14U0706, 14U0926, 14U1019, 14U1053, and 14U1109.

Audience

Accounts Management (AM), Campus Compliance Services in Small Business/Self-Employed (SB/SE) and Wage and Investment (W&I), Electronic Products and Services Support (EPSS), Field Assistance (FA), Return Integrity and Correspondence Services (RICS) – Integrity and Verification Operations, and Tax Exempt and Government Entities (TE/GE)

Effective Date

(10-01-2014)

Karen Michaels
Acting Director, Joint Operations Center
Wage and Investment Division

21.10.1.1  (10-01-2013)
Embedded Quality (EQ) Review Program Overview

  1. Embedded Quality is the system that is used by Accounts Management, Campus Compliance Services, Electronic Products and Services Support, Field Assistance, Return Integrity and Correspondence Services (RICS) – Integrity and Verification Operations, and Tax Exempt/Government Entities for their Embedded Quality Review Program.

  2. This section provides procedures for Campus Embedded Quality program level and site reviews, as well as front-line manager evaluative employee reviews of:

    • Telephone operations

    • Closed Paper case reviews

    • In-process case reviews

    • Responses to taxpayer correspondence

    • Outgoing correspondence and notices

    • Adjustment actions

    • Centralized processes

    • Email responses to IRS web site questions

    • Taxpayer Assistance Center walk-up contacts

  3. This section also provides procedures for:

    • Accessing, adding, editing, and correcting National Quality Review System (NQRS) and Embedded Quality Review System (EQRS) records

    • Generating reports available through NQRS and EQRS

    • Completing the IRS portion of the Customer Satisfaction Survey

21.10.1.2  (10-01-2014)
The Quality Review Process

  1. The Quality Review process provides a method to monitor, measure, and improve the quality of work. Quality Review data is used to provide quality statistics for the Service's Business Results portion of the Balanced Measures, and/or to identify trends, problem areas, training needs, and opportunities for process improvement.

  2. The Embedded Quality (EQ) effort is a way of doing business that builds commitment and capability among all individuals to continually improve customer service, employee satisfaction, and business results. The EQ effort is based on three components:

    • Improving the way quality is measured, calculated, and reported

    • Creating accountability by connecting employee reviews to quality measurement in a way that enables managers and employees to act on the results

    • Improving the design and deployment of the quality resources dedicated to review, analysis, and improvement

    The Embedded Quality System calculates the quality measurement as the percent of applicable coded attributes that are correct, based on the number of opportunities within each of five "buckets". The buckets are defined as follows:

    • Customer Accuracy: giving the correct answer with the correct resolution. "Correct" is measured based on the taxpayer receiving a correct response or resolution to the case or issue, and if appropriate, taking the necessary case actions or case disposition to provide this response or resolution. For the purpose of coding, additional IRS issues or procedures that do not directly impact the taxpayer's issue or case are not considered.

    • Regulatory Accuracy: adhering to statutory/regulatory process requirements when making determinations on taxpayer accounts/cases.

    • Procedural Accuracy: adhering to non-statutory/non-regulatory internal process requirements when making determinations on taxpayer accounts/cases.

    • Professionalism: promoting a positive image of the Service by using effective communication techniques.

    • Timeliness: resolving an issue in the most efficient manner through the use of proper workload management and time utilization techniques.
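
    For illustration only, the bucket calculation described above can be sketched in a few lines of code. This is a hypothetical sketch, not the NQRS implementation: the function name and the sample data are invented, and the only element carried over from the text is that each bucket's rate is the percent of applicable coded attributes that are correct.

```python
def bucket_accuracy(coded_attributes):
    """Percent of applicable attributes coded correct, per bucket.

    coded_attributes: list of (bucket, is_correct) pairs, one per
    applicable (coded) attribute across the reviews in the period.
    Attributes that were not applicable are simply not in the list,
    so they create no opportunity and cannot create a defect.
    """
    totals = {}
    for bucket, is_correct in coded_attributes:
        opportunities, correct = totals.get(bucket, (0, 0))
        totals[bucket] = (opportunities + 1, correct + (1 if is_correct else 0))
    return {bucket: round(100.0 * correct / opportunities, 1)
            for bucket, (opportunities, correct) in totals.items()}

# Hypothetical coding results from a handful of reviews:
sample = [("Customer Accuracy", True), ("Customer Accuracy", True),
          ("Customer Accuracy", False), ("Procedural Accuracy", True),
          ("Procedural Accuracy", True), ("Timeliness", True)]
rates = bucket_accuracy(sample)
# Customer Accuracy: 2 correct out of 3 opportunities -> 66.7 percent
```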

  3. A "product line" is a major grouping of similar work that is reportable and is measured in the Business Performance Review System (BPRS). Product lines listed below are further defined later in this IRM. The national and local quality reviews for these product lines will be entered into NQRS, and the managerial quality reviews for these product lines will be entered into EQRS. A "Specialized Product Review Group (SPRG)" is a subset of a product line that generally has a separate sample.

    • Accounts Paper: Accounts Paper Adjustments; Accounts Paper IEAR; AM Identity Theft Paper
    • Accounts Phones: Accounts Phones EIN; Accounts Phones General; Accounts Phones International; Accounts Phones NTA; Accounts Phones PPS; Accounts Phones Spanish; AM Identity Theft Phones
    • ACS Phones: ACS Phones
    • ACS Written: ACS Case Processing; ACS Support
    • AM Miscellaneous: Adjustment Paper Non-CIS
    • AM Routing: AM Routing Default Screener
    • AM Specialized Services: AMSS CAF/POA; AMSS EIN; AMSS International Specialized; AMSS Support Services; AMSS Technical Services
    • AUR Paper: AUR Paper; AUR Paper CAWR; AUR Paper FUTA; AUR Paper PMF; BMF AUR Paper
    • AUR Phones: AUR Phones; BMF AUR Phones
    • ASFR Paper: ASFR Paper
    • ASFR Phones: ASFR Phones (Including Refund Hold)
    • Centralized Case Processing Collection Paper: CLP Lien Paper; CCP GCP; CCP MMIA; CLP Lien Paper CRD
    • Centralized Case Processing Phones: CCP Lien Phones; CCP MMIA Phones
    • Centralized Case Processing Exam Paper: CCP Exam Paper
    • CIO Paper: CIO Paper
    • CIO Phones: CIO Phones
    • COIC Paper: COIC Paper
    • COIC Phones: COIC Offer Exam; COIC Process Exam
    • Collection Paper: Collection Paper; WHC Paper
    • Collection Phones: Collection Phones Withholding Compliance; Collection Phones CSCO; Collection Phones Combat Zone
    • CS Identity Theft Paper: CS Identity Theft Paper
    • CS Specialized Paper Services: Centralized Excise Tax Paper; Centralized Estate and Gift Tax Paper; Centralized Transfer Tax Technician
    • CS Specialized Phone Services: Centralized Excise Tax Phones; Centralized Estate and Gift Tax Phones
    • e-help Phones: e-help Phones
    • Exam Paper: Exam Paper Area Office Support; Exam Paper Classification; Exam Paper Discretionary Programs; Exam Paper EIC Programs; Exam Paper Flow Through Entities
    • Exam Phones: Exam Phones
    • Forms Order: Forms Order NDC
    • Innocent Spouse Paper: Innocent Spouse Paper
    • Innocent Spouse Phones: Innocent Spouse Phones
    • Integrity and Verification Operations: IVO Screening and Verification; IVO Case Resolution
    • Tax Law Phones: Tax Law Phones General; Tax Law Phones International; Tax Law Phones Rmail Callback; Tax Law Phones Spanish
    • Tax Law Written: Tax Law ETLA
    • TE/GE Phones: TE/GE Telephone
    • TE/GE Correspondence: TE/GE Correspondence
    • Field Assistance: FA Procedural; FA Contacts
    • AM Clerical: Campus Support; Image Control Team
  4. New product lines for Campus Compliance Services must be approved by the Quality Performance Measurement Operation, which has responsibility for updates and maintenance of the system, and the EQ National Support Staff, which is responsible for programming the database. Other functions need approval from the areas responsible for quality review and the EQ National Support Staff. The function must be prepared to provide the following information:

    • Is there an existing measure?

    • How many Full Time Equivalents (FTEs) will be needed to review the product line, and how will the function provide them?

    • Who will perform the review (Centralized Quality Review System, Program Analysis System or Centralized Program Analysis System, or site quality analysts)?

    • What is the volume of work?

    • Who is recommending this request?

    • Will Contact Recording be used?

  5. Reviews for the Quality Review process are completed by one of the following: Centralized Quality Review System (CQRS), Program Analysis System (PAS), Centralized Program Analysis System (CPAS), or sites (for some national quality measures, or for local reviews performed for quality improvement). The National Review is a review that measures the quality of the entire product. Review data is compiled using a Data Collection Instrument (DCI). Data from national reviews is entered into the National Quality Review System (NQRS), then rolled up to provide the business results for the Balanced Measures. These results are the quality ratings for Customer Accuracy, Professionalism, and Timeliness. Data from local reviews performed for quality improvement is entered into NQRS as a Local Review and is not rolled up into the national accuracy rates.

  6. The Managerial Review process creates accountability by connecting employee reviews to the balanced measures. Managers will use the system to track employee performance and training needs. Data from the managerial reviews is entered into the Embedded Quality Review System (EQRS), which maps to an employee's Critical Job Elements and Aspects. It is then rolled up to identify overall employee, team, department and operation scores for Accuracy, Professionalism, and Timeliness. It is not combined in any way with national or local accuracy rates.

  7. The Contact Recording (CR) system records telephone contacts between the Service and customers. The CR system records the complete conversation of every call and randomly selects approximately ten percent of the calls per site, to simultaneously capture both the voice and on-screen computer activity. If a call is one of the random ten percent, managers and quality reviewers will be able to hear and see the entire customer experience. Contact Recording technology allows managers to provide feedback and identify training needs for employees. Managers can access the recordings and allow employees to listen to their own interactions with customers. CR is available for managers and National quality review for most telephone and Field Assistance product lines. Selected telephone calls remain in the system for 60 days.

    Note:

    Periodically, there are systemic problems with Contact Recording. When this happens, reviews must be monitored live using the Aspect Toll-Free monitoring system.

  8. All calls handled by the operation are included in the sample and are selected by an algorithm that is programmed into the CR system to randomly select the calls for review.
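
    The random selection described above can be illustrated with a simple per-call draw. This is a hypothetical sketch (the CR system's actual selection algorithm is internal and not documented here); it only shows the idea that every call is eligible and each has roughly a ten percent chance of voice-and-screen capture.

```python
import random

def select_for_capture(call_ids, rate=0.10, seed=None):
    """Independently select each call with probability `rate` (hypothetical helper)."""
    rng = random.Random(seed)  # seeded only so this example is reproducible
    return [c for c in call_ids if rng.random() < rate]

# Of 10,000 hypothetical calls, roughly 1,000 end up selected for capture.
picked = select_for_capture(range(10_000), seed=42)
```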

21.10.1.2.1  (10-01-2014)
Centralized Quality Review System (CQRS)

  1. Centralized Quality Review System (CQRS) is operated by the Joint Operations Center (JOC) to provide independent quality review services for a number of product lines.

  2. CQRS measures the quality of:

    • Tax Law, Accounts, National Distribution Center, Default Screeners, e-help, Tax Exempt/Government Entities (TE/GE) and Automated Collection System (ACS) calls answered by assistors in all sites

    • E-Mail (Electronic Tax Law Assistance – ETLA) responses to questions received through the IRS Website

    • Field Assistance Tax Law, Accounts and Procedural contacts

21.10.1.2.2  (10-01-2013)
Program Analysis System (PAS) and Site Reviews for the National Measure

  1. PAS and Site Quality Review measures the quality of:

    • Accounts Paper

    • ACS Support

    • Accounts Management Clerical - Campus Support

    • Accounts Management Specialized Services

    • Automated Substitute For Return (ASFR) Paper and Phones

    • Automated Underreporter (AUR) Paper

    • Campus Compliance Services (CCS) Identity Theft Paper

    • Centralized Case Processing Collection (CCP) Paper and Phones

    • Centralized Case Processing (CCP) Exam

    • Centralized Insolvency Operation (CIO) Paper and Phones

    • Centralized Offer In Compromise (COIC) Paper and Phones

    • Collection Paper and Phones

    • Compliance Services (CS) Specialized Paper and Phones Services

    • Exam Paper

    • Innocent Spouse Paper

    • Return Integrity and Correspondence Services - Integrity and Verification Operations

21.10.1.2.3  (10-01-2013)
Centralized Program Analysis System (CPAS) Reviews for the National Measure

  1. CPAS measures the quality of:

    • AUR Phones

    • Exam Phones

    • Innocent Spouse Phones

21.10.1.2.4  (10-01-2013)
Local Reviews for Quality Improvement

  1. Local reviews may be performed to focus attention on areas that require improvement. The local quality reviews are performed by staffs reporting to the Quality Assurance Manager, PAS/CPAS Manager, and/or other units that have quality assurance duties. Local quality reviews may also be used for employee development and on-the-job instruction. Accounts Management and Compliance Services functions may also request that local quality reviews be performed on processes that are not subject to the national quality review.

  2. Generally, local reviews should be used for one year when a new product is being implemented. This process is considered a “baseline” period. Often the new procedures being implemented are unstable and need to be adjusted. During this time changes are made to the functional IRMs and training is provided to the employees. The baseline period allows time for the operation to perfect their processes and procedures while receiving feedback on performance.

21.10.1.2.5  (10-01-2013)
Managerial Reviews

  1. Managerial reviews, which are prepared on EQRS, measure employee performance.

    Note:

    Managerial reviews are performed independently from national and local quality reviews. National and local quality review results are never used for evaluation of individual employees.

21.10.1.2.6  (10-01-2013)
Objectives of Quality Review

  1. Quality Review data is used by management to provide a basis for measuring and improving program effectiveness by identifying:

    1. Defects resulting from site or systemic action(s) or inaction(s),

    2. Drivers of Customer Accuracy,

    3. Reason(s) for defect occurrence,

    4. Defect trends, and

    5. Recommendation for corrective action

  2. Quality review also provides:

    • A way to ensure the corrective action was effective

    • A vehicle for input to balanced measures, and

    • Assistance to management in efforts to improve quality of service

  3. Managerial review data is used by management for some or all of the purposes listed above as well as:

    1. Tracking employee performance and providing input into employee appraisals,

    2. Identifying training needs for individual employees and for groups of employees, and

    3. Planning workload distribution.

    Note:

    Managerial reviews evaluate individual employee performance and are performed independently from national and local quality reviews. National and local quality review results are never used for evaluation of individual employees. See IRM 1.4.16 (Accounts Management Guide for Managers), IRM 1.4.20 (Filing and Payment Compliance Managers Handbook), IRM 1.4.17 (Compliance Managers Handbook), IRM 1.4.11 (Field Assistance Guide for Managers), and IRM 1.4.18 (Electronic Products and Support Services Managers Guide) for more information on manager responsibilities in conducting managerial reviews.

21.10.1.2.7  (10-01-2013)
Quality Review Roles and Responsibilities

  1. The Quality Review process relies on the teamwork of all of the following:

    • Headquarters Business Operating Divisions (BOD) Quality Analysts, Headquarters Quality Performance Measurement (QPM) Product Line Analysts (PLAs), Process Improvement and Customer Accuracy (PICA) and EQ National Support Staff

    • CQRS - part of the Joint Operations Center (JOC)

    • PAS - operated by Planning and Analysis in each campus

    • CPAS - operated by Quality Performance Measurement (QPM) and reports to the program manager, QPM

    • Planning and Analysis (P&A) Staffs

    • Accounts Management, Campus Compliance Services, Electronic Products and Services Support, Return Integrity and Correspondence Services, Field Assistance and TE/GE Operations

    • Quality Assurance Managers (QAMs) and Field Improvement Managers and Specialists or other quality staffs

    • PAS Managers

21.10.1.2.7.1  (10-01-2013)
Headquarters Business Operating Divisions (BOD)

  1. The Business Operating divisions issue program goals based on the Balanced Measures.

  2. The BOD reviews Quality Assurance programs as part of periodic reviews of Accounts Management, Compliance Services, Electronic Products and Services Support, Return Integrity and Correspondence Services - Integrity and Verification Operations and Field Assistance programs.

21.10.1.2.7.2  (10-01-2013)
Quality Performance Measurement (QPM)

  1. The mission of the Campus Compliance Services Quality Performance Measurement (QPM) Operation is to provide unbiased, cross-BOD administration of the Embedded Quality Review System. QPM spans both SB/SE and W&I Campus Compliance operations and has an enterprise objective to measure performance. By design it is independent from other functions to ensure impartiality and maintain data integrity. QPM regularly interacts with the headquarters’ business owners and all campus/remote office operations to work quality issues, including:

    • Providing oversight and coordination of the cross-functional Embedded Quality (EQ) Program.

    • Monitoring EQ program adherence including proper coding of reviews and all aspects of the sampling process.

    • Sponsoring on-going cross-campus discussions to identify trends and issuing guidance and direction to correct global problems.

    • Maintaining the EQ system, which includes establishing workarounds for problems identified through coding.

    • Updating EQ tools including job aids, system fields, attribute definitions, and monitoring and resolving Contact Recording issues.

    • Establishing regulations and procedures and providing support for attribute coding for all Campus Compliance reviews.

    • Making all final-authority determinations on elevated rebuttals or adjustments.

    • Serving as the Campus Compliance Services liaison to Accounts Management including coordinating system adjustments to ensure uniform and cohesive data collection.

    • Sponsoring and supporting new Campus Compliance product lines for the system

21.10.1.2.7.3  (10-01-2013)
Embedded Quality (EQ) National Support Staff

  1. The EQ National Support Staff is part of the Joint Operations Center (JOC) and provides support to all EQRS and NQRS users in all functions at all levels. The responsibilities include but are not limited to:

    • Serve as liaison between JOC and various Business Units in the Enterprise on program and quality issues

    • Coordinate with Statistics Of Income (SOI) on weighted report calculations and related programming

    • Support business quality training initiatives

    • Work with business units to develop and define new Specialized Product Review Groups (SPRGs)

    • Work with the business units to develop new reports as needed

    • Work with various business Product Line Analysts to maintain Data Collection Instruments (DCIs) for current SPRGs

    • Support the EQ Summits between business units and NTEU, as required by the contract

    • Process Reports Request Central (RRC) submissions for EQRS/NQRS data

    • Issue system alerts to customer base

    • Ensure official Quality measures (Customer Accuracy, Timeliness, Professionalism) as reported by the application are timely and accurate

    • Ensure managerial performance reviews as reported by the application are accurate and reliable

    • Create/submit annual EQRS-C (Campus) enhancement UWR

    • Create/submit annual EQRS-C Maintenance UWR and other UWRs as needed

    • Work with IT (Applications Development, Embedded Quality Section) on all EQRS-C system-related issues

    • Coordinate with IT in completing annual UWR submissions

    • Maintain EQRS-C database reference tables via OL-Matrix web application

    • Serve as Application Point of Contact for all FISMA-related activities and requirements

    • Maintain the Embedded Quality Website at http://eq.web.irs.gov

21.10.1.2.7.4  (10-01-2013)
Accounts Management/Compliance Services Operations

  1. The P&A Chiefs for Accounts Management and Compliance Services are responsible for the Site Level Business Plans.

  2. The P&A Chiefs are responsible for site reviews of product lines that provide data for the Balanced Measures.

  3. The operations managers are responsible for evaluative reviews on their employees separate and apart from the National Review process on the EQRS side of the database.

  4. If feasible, the QAM and Process Improvement Specialist/Field Improvement Specialist should report directly to the P&A Chief. This eliminates potential conflicts of interest which may occur when more than one department manager is accountable for a product or service. If a formal position is not designated within P&A, the responsibilities should be assigned to one or more individuals in the operation to ensure that the obligation is met.

21.10.1.2.7.5  (10-01-2013)
Quality Assurance Manager (QAM)

  1. This section only applies to remote and campus locations with functional areas where there is an existing QAM position. In offices/functional areas where no position exists, or in Campus Compliance Services, management will need to ensure these duties are appropriately addressed.

  2. The QAM is responsible for the overall planning, administration, and evaluation of the quality-related sections of the Site Level Business Plans. The QAM will identify problems and work with management to solve them.

  3. The Site Level Business Plans outline procedures for the review of all areas of responsibility. This review process, when combined with CQRS/PAS/CPAS data and other functional data, will help with evaluating the overall quality of operations and making recommendations for improvement.

  4. The QAM will serve as the Quality Assurance (QA) manager for the Operation, ensuring that designated quality resources are used to focus on quality improvement efforts.

    Note:

    Results of reviews performed by CQRS/PAS/CPAS staff are not to be used in employee evaluations.

  5. By using trend analysis, the QAM will determine the causes that adversely affect quality. The QAM will assist the management team in initiating processes for employees to improve their quality of service. It is important that lines of communication remain open among the QAM, the QR team, and management in order to identify problem areas, take appropriate corrective actions, and re-evaluate quality to ensure corrective actions result in improved quality.

  6. The QAM or other designated person within the operation will log and date completed review records that require rework, including reviews by CQRS/PAS/CPAS where defects have been identified and reviews where FLASH was annotated in the Feedback Summary Remarks field. Managers must return corrected work to the QAM within five working days of receipt. See IRM 21.10.1.7.7(3) for additional instructions on FLASH remarks. The QAM will enter the completion date in the log and will monitor corrected work to ensure the timeliness and quality of responses to taxpayers.

  7. The QAM will act as the liaison between the CQRS/PAS/CPAS staff and management and is responsible for communicating quality information to all managers in the operation.

  8. The QAM will identify training needs and recommend to the Training Coordinator and/or management, the type of training needed (e.g., on-the-job training or classroom instruction), and assist in the development of additional training exercises and workshops to meet those needs.

  9. The QAM and the CQRS/PAS/CPAS staff will be responsible for the protection of NQRS DCIs and any supporting documentation from legacy systems. All documents and information (including taxpayer information) seen, heard, or handled must remain secure and confidential.

  10. The QAM will serve as the Embedded Quality liaison with managers.

  11. The QAM will work with managers to recognize and reward quality at all levels (e.g., employee-of-the-month, Quality Honor Roll, etc.), using the National Agreement for guidance.

  12. The QAM will be responsible for working with the Training Coordinator and/or management to ensure that EQ training is made available to all who need it. The QAM is responsible for training the quality staff.

  13. The Field Improvement Specialists (FIS) will work closely with Headquarters and the PICA team to manage the portfolio of national projects and best practice evaluation. The expectations for the FIS positions include, but are not limited to, the following.

    1. A regular solicitation for projects will be made to ensure there is no overlap between various sites or functions.

    2. Field Directors who recommend projects will be in charge of those selected.

    3. Each Field Director will conduct at least one project with national impact each year.

    4. Field Directors will have the latitude to use FIS resources for local projects.

    5. All projects will be linked to the organization’s strategic goals and direction.

21.10.1.2.7.6  (10-01-2014)
Quality Review/Process Improvement Managers

  1. Throughout this IRM section, the term "Quality Managers" will be used to include the duties of CQRS/PAS/CPAS Managers or other functional Quality Review Managers who perform site reviews for the national quality measure and/or local reviews for quality improvement.

  2. Quality Managers are responsible for ensuring the completion of the national and local reviews.

  3. Quality Managers ensure that all applicable work is sampled and reviewed within sample plan guidelines.

    Note:

    Results of reviews performed by CQRS/PAS/CPAS are not to be used in functional employee evaluations.

  4. The Quality Manager maintains the integrity and quality of the review system by monitoring and reviewing the quality analysts and clerical support. The Quality Manager should routinely monitor the quality sampling selection process. Refer to descriptions of the product lines later in this section for more information on sample universes and sampling guidelines. The Quality Manager should also perform a periodic (at least quarterly) spot check of the work to confirm the case volume matches the count provided by the operations. For example: verify the review content of a single folder to ensure that the volume of the work matches the number of cases in the folder. If these figures do not match, contact the operation to address the reason(s) for the discrepancy and to discuss ways to improve the process accuracy.

  5. The Quality Manager and staff determine the causes that adversely affect quality by using trend analysis to identify the most frequently made defects and root causes. The Quality Manager recommends corrective action and/or improvements to functional areas.

  6. The Quality Manager identifies sample cases that require rework (including cases where FLASH was entered in the Feedback Summary Remarks field) and forwards them to the appropriate functional area. This includes reviews by CQRS where defects have been identified. All defects must be reworked to ensure each taxpayer receives the correct information and all appropriate adjustments are made to their account(s). The manager of the employee who created the defect is responsible for ensuring the defect is corrected within the applicable time frame after the two-day consistency period; see IRM 21.10.1.8.6(2). Flash defects must be corrected by the next business day. All other corrective actions not marked as Flash defects must be corrected within five business days.

  7. The Quality Manager is responsible for working with the Training Coordinator to ensure that the CQRS/PAS/CPAS staff has the appropriate training.

21.10.1.2.7.7  (10-01-2014)
Quality Analyst/PAS/CPAS Analyst

  1. A Quality Analyst or a clerk is responsible for pulling the daily sample.

  2. The quality analyst should perform an unbiased, consistent, and accurate review of all work.

  3. The quality analyst provides the QAM or PAS/CPAS Manager with:

    • Any cases identified as Flash for rework

    • Analysis of types of errors identified during review.

  4. The quality analyst should provide the QAM, QPM or PICA Product Line Quality Analyst, or National Support Staff, as applicable, with recommendations for corrections/improvements on:

    • IRM 21.10.1, Quality Assurance - Embedded Quality (EQ) Program for Accounts Management, Compliance Services, Field Assistance, Tax Exempt/Government Entities, Return Integrity and Compliance Services (RICS) - Integrity and Verification Operations and Electronic Products and Services Support

    • Embedded Quality Review Training Material

    • The NQRS DCI

  5. The quality analyst will:

    1. Code the appropriate attribute for each action while considering the root cause of any defects. To avoid a trickle-down effect, a single action should not be coded "N" in multiple attributes.

    2. Review work using valid sampling techniques as approved by SOI.

    3. Record complete review results using the NQRS DCI.

    4. Measure performance against established quality standards in the functional IRM, publications, and other approved reference sources.

  6. The quality analyst reviewing the work will need to complete training that is appropriate for the SPRG they will review. The following is a list of suggested training topics:

    • Embedded Quality Review Training

    • Quality Review command codes

    • Automatic Data Processing (ADP)/Integrated Data Retrieval System (IDRS) training modules, including on-line accounts and adjustments, and Accounts Management and Compliance Services

    • Coding Consistency Course

    • Technical tax law (as appropriate)

    • Technical Probe and Response Guide

    • Interactive Tax Law Assistant (ITLA)

    • Accounts Collection Service Guide

    • Correspondence guidelines

    • Taxpayer Advocate Service guidelines and criteria

    • Oral statement guidelines

    • Procedural guidelines

    • Communication skills

    • Probing skills

    • Timeliness guidelines

    • e-help

    • TE/GE Specialized Systems

    • TE/GE Probe and Response Guide

    • Continuing Professional Education (CPE) classes related to their assigned SPRG when available

21.10.1.2.8  (11-28-2011)
Process Improvement Specialist Roles and Responsibilities

  1. Campus Process Improvement Specialist duties are to:

    1. Identify potential new processes and procedural changes that will improve work processes, quality and level of service for the taxpayers.

    2. Ensure feasible recommendations are presented to enhance procedural, policy, and systemic work practices.

    3. Recommend changes to provide consistency, enhance productivity and efficiency.

    4. Elevate process improvement recommendations to the Process Improvement (PI) Manager and Process Improvement Customer Accuracy (PICA) Tax Analysts.

    5. Attend and assist with training on improvement methods, including the DMAIC process.

    6. Work with PICA to gather facts/data to justify procedural, systemic, and program changes to improve work practices, policies and procedures.

    7. Forward sound recommendations to Process and Program Management (PPM) with solid facts and figures to justify changes.

    8. Ensure approved recommendations are implemented in a timely manner.

    9. Identify and coordinate IRM procedures impacting quality (via PICA then to PPM).

    10. Lead cross-functional improvement discussions and teams at site.

    11. Request action plans (from program owner), when appropriate.

    12. Participate in at least 2 projects (per person) per year (HQ and/or campus), working with PICA.

    13. Report project status monthly (weekly when appropriate) to PICA and PPM while gathering information.

    14. Manage the improvement project when additional information must be gathered.

    15. Spend at least half of their time during the year providing "expert" assistance to Campus P&A staff and Operations Chiefs.

    16. Communicate results, recommendations and related improvement procedures to other sites.

    17. Conduct calibration and consistency analysis, when appropriate.

    18. Establish improvement teams to address hot topics during the filing season.

    19. Identify local procedures, job aids, and check sheets and ensure they are approved by Headquarters and used by each site for consistency.

    20. Ensure changes are based on the quality principles which are:
      • Professionalism
      • Customer Accuracy
      • Procedural Accuracy
      • Regulatory Accuracy
      • Timeliness

    21. Use the following reports when analyzing quality issues:
      • NQRS Weighted Customer Accuracy Report
      • NQRS Customer Accuracy Driver Report
      • NQRS Top Defects/Successes by Site Report
      • NQRS Ad-hoc Report
      • EQRS Customer Accuracy Driver Report
      • EQRS Top Defects/Successes by Site Report
      • EQRS Ad-hoc Report
      • EQRS/NQRS Comparison Report
      • SLIM Report
      • ETD Report

21.10.1.2.9  (10-01-2014)
Strategy and Program Plans/W&I Operations Plans

  1. The Strategy and Program Plans/W&I Operations Plans are vehicles used to monitor, measure, and evaluate activities consistent with the quality goals of the Balanced Measures.

  2. The Strategy and Program Plans/W&I Operations Plans should include:

    1. Action items that support Balanced Measures initiatives.

    2. Measurement data from CQRS, PAS, CPAS, QR, local reviews, quality staffs, and managerial reviews.

      Note:

      Throughout each month, management must monitor/review the required minimum number of employee telephone calls/cases with taxpayers as set in the Site Quality Action Plan.

  3. The QAM or other local management will assist in the development of Site Level Business Plans which should include requirements to perform reviews to assess the quality of the work including:

    • Managerial reviews of employee work including telephone assistance and written work

    • Timeliness of completed work

    • Accounts Management Accounts and Tax Law work

    • Field Assistance contacts

21.10.1.2.9.1  (10-01-2006)
Establishing the Strategy and Program Plans/W&I Operations Plans and Site Level Business Plans

  1. When creating the Site Level Business Plans, the QAM will:

    1. Review past Site Level Business Plans and the current year’s Balanced Measures recommendations to understand the program’s goals.

    2. Analyze statistical data, such as NQRS and EQRS data, to identify inconsistencies.

    3. Meet with management to discuss QA objectives.

    4. Establish QA objectives and set priorities using criteria provided by Headquarters and functional management. Local objectives can be added, if desired.

    5. Develop a schedule for implementing each objective of the plan and assign responsibilities.

    6. Work with management to prepare a draft of the plan for review.

    7. Consolidate suggested changes into a final draft for the Operations Chief’s concurrence, which certifies that the plan is operable.

    8. Communicate the applicable portions of the plan to all functional employees, ensuring they understand the plan’s objectives and their roles in the QA process.

  2. After implementation of the Site Level Business Plan, the QAM will monitor plan accomplishments and continue to analyze site data to determine if the operation is meeting objectives. The QAM is expected to recommend needed changes to the plan such as modification or discontinuance of certain objectives.

21.10.1.2.9.2  (10-01-2008)
Strategy and Program Plans/W&I Operations Plans and Site Level Business Plans Resources

  1. Following is a list of resources available to the QAM when creating the quality portion of the Site Level Business Plan. This list is in no particular order and is not meant to be all inclusive. It is designed to suggest the wide variety of data available for consideration.

    • Local work plans

    • NQRS data including local reports

    • EQRS data including local reports

    • Timeliness data

    • Prior year's Operating Guidelines, including plans for QR and managerial involvement

    • Headquarters and functional business plans and reviews

    • Field Assistance data

    • Accounts Management/Compliance Campus reports on functional activities

    • Alert information previously provided to employees

    • Staff feedback

    • Statisticians' feedback or reports

    • Taxpayer Advocate Service staff

    • Treasury Inspector General for Tax Administration reports

    • Results of Improvement Projects

    • Focus testing reports

    • General Accounting Office reports

    • ASPECT telephone reports/data

    • Customer satisfaction survey results.

21.10.1.3  (10-01-2014)
Quality Review Research Tools

  1. The following paragraphs are not all-inclusive, but they list the most frequently used research tools.

  2. A number of IRMs impact the work done by Accounts Management, Compliance Services, Field Assistance, Electronic Products and Services Support, and TE/GE. IRM 21 will often cross-reference these other manuals. Examples include:

    • IRM 4, Examining Process

    • IRM 5, Collecting Process

    • IRM 20.1, Penalty Handbook

    • IRM 3.42, Electronic Tax Administration

    • IRM 13, Taxpayer Advocate Service

    • IRM 11.3, Disclosure of Official Information

    • IRM 2.3, IDRS Terminal Response

    • IRM 2.4, IDRS Terminal Input

    • IRM 2.8, Audit Information Management Systems (AIMS)

    • IRM 10.5.3, Identity Protection Program

    • IRM 20.2, Interest

    • IRM 21.1, Accounts Management and Compliance Services Operations

    • IRM 21.1.7, Campus Support

    • IRM 21.3, Taxpayer Contacts

    • IRM 21.3.7, Processing Third Party Authorizations onto Centralized Authorization File (CAF)

    • IRM 25.6, Statute of Limitations

    • IRM 1.4.10, Integrity and Verification Operation Managers Guide

    • IRM 1.4.11, Field Assistance Guide for Managers

    • IRM 1.4.16, Accounts Management Guide for Managers

    • IRM 1.4.17, Compliance Managers Guide

    • IRM 1.4.18, Electronic Products and Support Services Managers Guide

  3. A number of methods are used to communicate changes, clarifications or corrections to the IRMs and other published products. Among these are:

    • Servicewide Electronic Research Program (SERP)

    • IDRS Bulletins

    • IDRS Message File

    • IRM Update

    • Internal Revenue Bulletins (IRBs)

    • The EQ Website

    • The CQRS Website

    • Quick Alerts and EPSS Communications

    • Quality Alerts

    • Interim Guidance Memoranda

  4. Following are examples of additional sources of information available on SERP:

    • IRM 21, Customer Accounts Services

    • Other product line-specific IRMs (IRM 3, IRM 4, and IRM 5)

    • IRM 10, Security, Privacy and Assurance and IRM 25, Special Topics

    • Interactive Tax Law Assistant (ITLA)

    • Publication Method Guide (PMG)

    • Electronic ACS Guide (EACSG)

    • IAT Tools

    • Technical Communication Documents (TCD)

    • AM Portal

    • TE/GE Probe and Response Guide

    • Telephone Transfer Guide (TTG)

    • Taxpayer Contact Outline

    • Taxpayer Information Publications

    • Forms, Schedules, Instructions

    • Correspondence Letters

    • SNIP

    • Post-of-Duty Listing and Job Aid

    • Lockbox Addresses

    • Special Procedures Directory

    • State Unemployment Tax Agencies

    • Volunteer Income Tax Assistance (VITA)/Tax Counseling for the Elderly (TCE)/American Association of Retired Persons (AARP) Sites

    • On-Line Training Materials

    • Miscellaneous documents of local interest

    • EPSS SERP Portal

    • Document 6209

  5. Because of the scope of paper reviews conducted in campuses, each QR/PAS function should establish or have access to a library of all necessary IRMs. Area offices (AOs) should establish a similar IRM library covering all the types of work performed in their site. In most cases, the IRMs can be found on-line through SERP.

  6. Any IRS publication can be cited as a reference source. IRS publications will often be the primary research tool for tax law issues. Procedural issues are often addressed in the publication which can be located electronically on SERP.

  7. Responses to taxpayer questions may refer to specific forms and their instructions. They can be located electronically on SERP under Forms/Letters/Pubs to verify the quality of the response. If erroneous information has been given to the taxpayer, cite the form or instruction to substantiate the defect.

  8. The IRS Electronic Publishing Catalog contains a number of documents that can be used for research purposes. One of the most frequently used is Document 6209, IRS Processing Codes and Information. This document can also be found on-line through SERP.

    Note:

    If there is a discrepancy between the Document 6209 and a specific IRM, the IRM takes precedence.

  9. Use the Probe and Response Guide (P&RG), the Interactive Tax Law Assistant (ITLA), or the Publication Method Guide in conjunction with Publication 17 and other reference materials to answer customers’ tax law questions. Use of these tools is mandatory for review of the Tax Law product line.

    Note:

    Starting January 2, 2014 see IRM 21.1.1.6(5), (6) and (7), Customer Service Representative (CSR) Duties for new tax law procedures.

  10. Various automated systems may be needed to conduct reviews. These items include, but are not limited to:

    • Automated Insolvency System (AIS)

    • Account Management System (AMS)

    • Automated Collection System (ACS)

    • Automated Underreporter (AUR)

    • Automated Lien System (ALS)

    • Integrated Collection System (ICS)

    • Automated Non-Master File (A-NMF)

    • On-Line Notice Review (OLNR)

    • Locator services, such as credit bureaus and state employment commissions

    • Automated Substitute for Return (ASFR)

    • Report Generation Software (RGS) used by Examination

    • Automated Offer in Compromise (AOIC)

    • Correspondence Imaging System (CIS)

    • Withholding Compliance System (WHCS)

    • e-help Support System (EHSS)

    • Third Party Data Store (TPDS)

    • EP/EO Determination System (EDS)

    • Letter and Information Network User Fee System (LINUS)

    • TE/GE Rulings and Agreements Control (TRAC)

  11. Training materials or locally developed job aids cannot be used to evaluate the quality of a contact or case.

21.10.1.3.1  (10-01-2014)
Quality Review Exceptions and IRM Deviations

  1. Quality defects related to IRM or procedural changes will be charged seven calendar days (ten business days for Compliance) after the SERP posting date of the IRM update/change.
    During the seven-calendar-day grace period (ten business days for Compliance), national analysts will code "Y" if either the former or the new procedure is followed. Informational feedback from local and national reviews will be shared with functional areas prior to the seventh/tenth day.

    Note:

    For National Distribution Center (NDC), the seven calendar day period begins after the date on the Alert/Change.

    Note:

    For EPSS, the seven calendar days begins on the date the EPSS Communication or Quick Alert e-mail was first issued.

    Note:

    For Accounts Management, the seven calendar day grace period for charging quality errors does not apply to SERP Alerts, but to IRM Procedural Updates (IPUs).

  2. Any guidance that deviates from the IRM or that establishes new practices for temporary procedures or pilot projects must receive prior approval from Headquarters program management. A formal deviation must be filed; see (3) below for instructions on filing a deviation. Deviations are not retroactive.

  3. It is essential that all sites and functions follow the same guidelines for coding quality service. Deviations from the IRM (for example, local procedures or any other practices outside the IRM) must be approved by Headquarters (HQ). The procedure for preparing a deviation is as follows:

    • The request must be in memorandum format

    • The memo must state the reason for the deviation, what caused the situation to occur, what is being done to correct it, and the beginning and end date of the deviation (no longer than one year).

    • The memo must be forwarded for approval to the business unit directors.

    • The deviation memo must contain the signatures of all business unit directors that are impacted.

    • The signed deviation memo must be forwarded to HQ Policy, Campus Directors and Quality Performance Measurement (QPM) for implementation.

    Note:

    Deviations are only good for the time specified, but never longer than one year. See IRM 1.11.2.2.4, When Procedures Deviate from the IRM. Deviations are not retroactive and only become effective after obtaining the appropriate signature(s). During the deviation period, the work will be reviewed based on the procedures outlined in the deviation.

    Note:

    These procedures do not apply to regular IRM updates processed through SERP IRM Procedural Updates, as these items officially replace the current manual. Also, manuals owned by Field Policy do not fall within this process, as they fall within the interim guidance procedures IRM 1.11.10, Interim Guidance Process.

21.10.1.3.2  (10-01-2014)
Quality Review Sampling Guidelines

  1. A sample is a representation that displays characteristics of the whole population. Sampling allows the IRS to determine the quality of service without having to review the entire universe of work. Generally, the sample is determined by Specialized Product Review Groups (SPRGs).

  2. In order for a sample to be statistically valid, it must be randomly selected. Random selection gives every case in the population an opportunity to be selected. The randomness of the sample is ensured by selecting every “Nth” case using a skip interval based on the number of required reviews and the population of the work.
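    As an illustrative sketch only (not part of any official SOI methodology), the skip-interval selection described above can be expressed in Python. The function name and the volumes in the example are hypothetical:

    ```python
    import random

    def systematic_sample(population_size, required_reviews):
        """Select every Nth case using a skip interval, with a random start.

        Hypothetical illustration of the skip-interval method: the interval
        is derived from the required number of reviews and the population,
        and a random starting point keeps the selection unbiased.
        """
        if required_reviews <= 0 or required_reviews > population_size:
            raise ValueError("required_reviews must be between 1 and the population size")
        skip = population_size // required_reviews   # the "Nth" interval
        start = random.randrange(skip)               # random start within the first interval
        return list(range(start, population_size, skip))[:required_reviews]

    # Example: select 25 reviews from a hypothetical closure volume of 1,000 cases
    selected = systematic_sample(1000, 25)
    print(len(selected))  # 25
    ```

    In practice the skip interval for each SPRG comes from the approved sample plan; this sketch only shows why every case in the population has a chance of selection.
    
    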

  3. A sample estimate must be unbiased and consistent. An unbiased estimate is one whose average value is equal to the actual quality in the population. A consistent estimate is one that approaches the actual quality in the population as the sample size increases.

  4. Because a sample does not include all cases in the population, any estimate resulting from a sample will not equal the actual quality in the population and will have some variability associated with it. For an estimate to be meaningful, a measure of variability should be included with results. A precision margin and level of confidence can be used to express the variability of an estimate. When added to and subtracted from the estimate, a precision margin identifies the range of values where the actual quality in the population most likely falls. The confidence level indicates how much confidence one can have that the actual population value is within this range. Many IRS quality measures are designed to achieve estimates of 5 percent precision with 90 percent confidence. That is, there can be 90 percent confidence that the actual quality in the population is within plus or minus 5 percent of the sample estimate.

    • While quality measures are designed to achieve a certain precision (e.g., 5 percent), the actual precision of an estimate must be calculated using the actual results from the sample.

    • Precision is inversely related to sample size. As sample size increases, the precision margin decreases and the estimate becomes more precise.

    • Precision is also related to the estimate of quality. The worst-case scenario for an estimate (precision-wise) is 50 percent; an estimate of 50 percent will have the largest precision margin of all possible samples of the same sample size. Estimates that are closer to either 0 percent or 100 percent will be more precise than estimates near 50 percent.

    • Precision margins should be taken into consideration when determining if a site met its goal.

      Example:

      Assume a site has a goal of 85 percent and that their sample estimate is 82 percent with a precision of 4 percent. Applying the precision margin to the estimate implies that the actual quality in the population is between 78 percent and 86 percent. Because 85 percent lies in this range, the site cannot conclude that they did not meet their goal nor can they conclude that they did meet their goal. However, not taking the precision margin into account would have led the site to conclude that they did not meet their goal.

    • Precision margins should also be taken into consideration when comparing quality estimates between offices or different time periods from the same office.
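    The goal comparison in the example above can be sketched in Python. This is a simplified illustration using the standard normal-approximation margin for a proportion (z = 1.645 for 90 percent confidence); the function names are hypothetical, and SOI's actual calculations may apply weighting or finite population corrections:

    ```python
    import math

    def precision_margin(p_hat, n, z=1.645):
        """Normal-approximation precision margin for a proportion estimate.

        Simplified sketch: z = 1.645 corresponds to 90 percent confidence.
        """
        return z * math.sqrt(p_hat * (1 - p_hat) / n)

    def goal_conclusion(p_hat, margin, goal):
        """Judge a quality goal as met, not met, or inconclusive once the
        precision margin is applied to the sample estimate."""
        low, high = p_hat - margin, p_hat + margin
        if low >= goal:
            return "met"
        if high < goal:
            return "not met"
        return "inconclusive"

    # The example above: an 82% estimate with a 4% margin against an 85% goal
    print(goal_conclusion(0.82, 0.04, 0.85))  # inconclusive
    ```

    Ignoring the margin (comparing 82 percent directly to 85 percent) would have wrongly concluded the goal was missed, which is the point the example makes.
    
    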

  5. Samples are designed conservatively by Statistics of Income (SOI). One piece of information that is necessary when designing a sample is an estimate of the actual quality being measured. This estimate, often obtained from reviews from prior years, is used in the process for determining the sample size. An assumed quality of 50 percent will result in the largest sample size. Therefore, in order to get a more conservative sample size, either slightly increase or decrease the assumed quality rate so that it is closer to 50 percent. This will result in a sample size that should provide the desired precision.

    Example:

    If an office had an 88 percent quality rate for a certain SPRG during the prior year, then assuming an estimated quality rate of 80 percent will result in a larger sample size.
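    The effect of a conservative quality assumption can be sketched with the standard sample-size formula for a proportion. This is an illustration only, not SOI's actual procedure (which may use different formulas or corrections); it shows why assuming 80 percent instead of 88 percent, as in the example above, yields a larger sample:

    ```python
    import math

    def sample_size(p_assumed, precision=0.05, z=1.645):
        """Required sample size for a proportion estimate at the given
        precision and confidence (z = 1.645 for 90 percent confidence).

        Simplified sketch with no finite population correction. Because
        p * (1 - p) peaks at p = 0.5, assumed quality rates closer to
        50 percent produce larger, more conservative sample sizes.
        """
        return math.ceil(z**2 * p_assumed * (1 - p_assumed) / precision**2)

    # Assuming 80% quality (closer to 50%) demands more reviews than 88%:
    print(sample_size(0.88))  # 115
    print(sample_size(0.80))  # 174
    ```

    An assumed rate of exactly 50 percent gives the largest sample of all (271 under these assumptions), matching the statement above that 50 percent results in the largest sample size.
    
    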

  6. When designing a sample, a decision must be made on how often quality estimates are necessary. The individual needs of the SPRG, as well as the resources assigned to the quality review, will help determine whether estimates should be made on a daily, weekly, biweekly, monthly, quarterly, or annual basis.

  7. Some SPRGs may consider merging several types of similar work. There may be several items at the site that need to be tested. However, none are large enough to justify an individual sample. A merged sample, producing a composite estimate, would be a possible solution in this situation. Information describing the different types of work would have to be included with all estimates from these merged samples.

  8. While CQRS, PAS, or CPAS perform reviews on product lines or SPRGs for the national measure, each site may also perform local reviews to help improve their quality. There are two ways to designate reviews as local. You can use the local button on the DCI in NQRS or you can use the group code “BL” (baseline) (Exhibit 21.10.1-6).

    Note:

    Local reviews are not used in official measures of the product line or SPRG.

    • When performing local reviews for quality improvement, the overall local quality rates may be lower than the national quality rates since local analysts will focus on problem areas.

    • Local reviews for quality improvement are not necessarily expected to be statistically valid samples.

21.10.1.3.2.1  (10-01-2014)
Selecting the Quality Sample

  1. The following factors are used to determine a sample size:

    1. How often quality estimates are necessary (e.g., daily, weekly, monthly, quarterly).

    2. The level of precision (e.g., 3 percent, 5 percent).

    3. The level of confidence (e.g., 90 percent, 95 percent, 99 percent). Generally, a confidence level of 90 percent with a precision margin of 5 percent is used.

    4. A conservative "best guess" of the expected quality rate (see IRM 21.10.1.3.2), which can be obtained by reviewing historical quality rates for similar product lines or SPRGs.

      Note:

      If the resulting sample size is too large for the allocated resources, consider making quality estimates less frequently. Contact a Headquarters quality analyst for assistance with this.
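The factors above can be sketched with the standard sample-size formula for estimating a proportion. This is an illustration only; the official sample plans are produced by SOI, and the function name and default parameters here are assumptions for the example.

```python
import math

def sample_size(p, margin=0.05, z=1.645):
    """Approximate sample size for estimating a proportion p at a given
    confidence (z = 1.645 for 90 percent) and precision (margin).
    Illustrative only; SOI produces the official sample plans."""
    return math.ceil(z**2 * p * (1 - p) / margin**2)

# A "best guess" closer to 50 percent is conservative: it yields a larger sample.
n_prior = sample_size(0.88)         # based on the prior-year quality rate
n_conservative = sample_size(0.80)  # more conservative assumption, larger n
```

Under these assumed parameters, moving the assumed quality rate from 88 percent toward 50 percent increases the required sample size, consistent with the guidance in IRM 21.10.1.3.2.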

  2. Unless otherwise specified in IRM 21.10.1, all cases with a closing count must be made available for the National Quality Review System (NQRS) process. This includes cases reviewed by managers, On-the-Job Instructors (OJIs), and cases that are subject to 100 percent review.

  3. Sampling assumptions must be determined. Unclear or inappropriate assumptions could lead to a sample that is not random, resulting in estimates that are biased, unrepresentative of the population, or inconsistent. This could call the statistical validity of the estimate into question.

  4. Sites must provide sampling assumptions to the Headquarters Process Improvement Customer Accuracy (PICA) or Product Line Analyst (PLA) responsible for the product. The PICA and PLA will provide due dates for the sampling assumptions. The site must estimate the total volumes closed for the SPRG for each period. The PICA and the PLA will provide Statistics of Income (SOI) with the sampling assumptions. The SOI staff will calculate the sample size for most paper SPRGs. Samples are determined quarterly.

  5. Apply a skip interval (see below) to the population to select the sample:

    1. The skip interval is equal to the population, divided by the sample size.

    2. Calculate and use a random start number to select the first case (see IRM 21.10.1.3.2.1(6)). The random start is between 1 and the skip interval.

    3. Use the skip interval to select the rest of the sample (see IRM 21.10.1.3.2.1(7)).

    4. If all of the cases are not available at the same time, then sample cases as the work arrives.

    5. If all of the cases are available at the same time and are stored electronically, then software should be used to sort the list of cases either randomly or by a variable related to the quality measure (e.g., time, case ID) prior to applying the skip interval.

    6. If used correctly, a skip interval will ensure that a sample is spread appropriately across the population with estimates that are relatively unbiased.
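As a rough sketch, the skip-interval selection described above could be implemented as follows. The function name is hypothetical, and in practice the random start comes from the SOI sample plan rather than being generated locally.

```python
import random

def skip_interval_sample(population_size, sample_size):
    """Select case positions from a population using a skip interval.
    Sketch only: in practice SOI supplies the random start."""
    interval = population_size // sample_size   # skip interval = population / sample size
    start = random.randint(1, interval)         # random start between 1 and the interval
    return list(range(start, population_size + 1, interval))

# Population of 400 cases, sample of 50: interval of 8, start somewhere in 1-8.
positions = skip_interval_sample(400, 50)
```

Sorting the case list randomly (or by a quality-related variable such as time) before applying the interval, as step 5 suggests, helps keep the selection unbiased when all cases are available at once.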

  6. A random start is needed to apply the skip interval. The random start is provided as part of the sample plan determined by Statistics of Income (SOI).

  7. Use the following procedures to select a sample using a skip interval:

    1. On the first day of the sampling period, use the random start number to identify and select the first sampled case.

    2. Use the skip interval to select the subsequent documents for review. In other words, select the ‘nth’ case after the random start case, and continue selecting every ‘nth’ case thereafter.

    3. If the population of cases spans more than one day, then the skip interval must continue between days. Begin each new day’s count with the number of cases remaining following the last document selected from the previous day.

      Example:

      Assume a skip interval of "8" and that there were 5 cases remaining after applying the skip interval over Monday’s entire population. Then, continuing the skip interval sequence of 8 into the next day, the case count would begin at 6 on Tuesday. Therefore, the first case selected on Tuesday would be case number 3, the second would be case number 11, the third would be case number 19, etc.

    4. A random start number is used only once per quarter, even if the skip interval changes during the quarter.

    5. If the required sample size has been met, continue applying the skip interval through the last case in the population. This will ensure that all work has an equal chance of being selected.
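The carry-over rule in the example above can be checked with a small sketch (the function name and case volumes are illustrative):

```python
def continue_skip(interval, remaining_after_prev_day, todays_volume):
    """Today's selected case numbers when the skip interval carries over
    from the previous day (illustrative sketch of the rule above)."""
    first = interval - remaining_after_prev_day   # e.g., 8 - 5 = 3
    return list(range(first, todays_volume + 1, interval))

# Skip interval of 8, 5 cases left after Monday's last selection, 24 cases Tuesday:
tuesday_picks = continue_skip(8, 5, 24)           # cases 3, 11, 19
```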

  8. If the size of the population is not known, then a skip interval cannot be calculated and another method must be used to select the sample. One alternative is to manually spread the sample among the population as follows:

    1. Establish the time period of the estimate and the sample size.

    2. Using the sample size and the number of business days in the sampling period, determine the average daily sample size.

    3. Depending on the flow of the work for this particular review, manually spread the sample appropriately among the days of the week. If you expect the work to be distributed evenly across the week and the hours of the day, divide the sample evenly among the days of the week. If you also expect the work to be distributed evenly throughout individual days, the sample can be split evenly between the morning and afternoon hours. If, on the other hand, certain types of work are known to be more likely on certain days (e.g., Mondays, Tuesdays) or during certain times of day (e.g., afternoons), shift the sample accordingly to follow the workload more accurately.

    4. Spread the sample appropriately among each member of the team or unit performing the type of work being reviewed.

    5. When selecting cases for review during designated dates and times, use one of the methods in IRM 21.10.1.3.2.1(5) to incorporate randomness.

      Example:

      If a case must be selected from the 50 cases handled by a particular employee on Tuesday morning, then a random number table can be used to select a random number between 1 and 50.

    6. Document all decisions made and procedures used throughout the process of manually spreading the sample across the population.

      Note:

      Because sampled cases are selected without the use of a skip interval, it is not automatically ensured that the sample is spread appropriately across the time period being measured and among the assistors included in the review. It is also not ensured that all cases in the population will have an equal chance of being selected. For these reasons, samples selected using the above procedures will have some amount of bias. Selecting the "most random sample possible" given local resources will help minimize this bias.
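Under the assumption of evenly distributed work, the manual-spread steps above might look like the following sketch (both helper names are hypothetical):

```python
import random

def daily_targets(sample_size, business_days):
    """Spread a period's sample as evenly as possible across business days."""
    base, extra = divmod(sample_size, business_days)
    return [base + 1 if day < extra else base for day in range(business_days)]

def pick_case(cases_in_block):
    """Randomly select one case number from a block of cases, e.g., the 50
    cases one employee handled on a Tuesday morning."""
    return random.randint(1, cases_in_block)

targets = daily_targets(23, 5)   # 23 cases over 5 days -> [5, 5, 5, 4, 4]
```

If the workload is known to be uneven (for example, heavier on Mondays), the daily targets would be shifted accordingly rather than split evenly, as the steps above describe.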

21.10.1.3.2.2  (10-01-2014)
Revising the Quality Sample

  1. If the site experiences much higher or lower volumes than predicted, the site may change its skip interval within the quarter.

    1. The new skip interval may be implemented ONLY at the beginning of a sampling month, NEVER in the middle of a sampling month and only with the approval of HQ and under the guidance of an SOI statistician. Because NQRS generates weighted reports monthly, skip intervals must remain constant within any given month.

    2. Never simply grab extra cases, drop selected cases, seek out cases of special interest, or use different methods to select cases in the same sample. Each of these situations could lead to a sample that provides biased results.

    3. Contact the Headquarters quality analyst for assistance in determining the new skip interval.

21.10.1.3.2.3  (10-01-2007)
Weighted Sampling

  1. In sampling, every sampled case represents a certain number of cases in the population. The exact number of cases a sampled case represents will depend on both the sample size and the actual size of the population from which it was selected. When a quality estimate is a combination of two or more separate samples (e.g., a fiscal year report for a single SPRG for a single site), it is necessary to account for the fact that each sampled case included in the overall estimate may not represent the same number of cases in the overall population. Weighting is used to ensure that every sampled case has the appropriate amount of influence on the overall cumulative estimate.

    Example:

    A quality estimate for a single SPRG in a single site for a planning period may consist of three individual samples, one from each month. Therefore, the planning period quality estimate is weighted by the three individual monthly SPRG volumes. This will make certain that each month’s influence on the planning period estimate is directly related to the total number of cases handled during that month.

  2. NQRS provides both weighted and unweighted estimates of quality.

  3. Unweighted estimates that combine more than one site, time period, or SPRG are not considered statistically valid. Such estimates should only be used internally. Their statistical limitations should be taken into consideration when basing business decisions on them.
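The weighting idea can be sketched as a volume-weighted average. NQRS performs the official calculation; the function name and monthly figures below are made up for illustration.

```python
def weighted_quality(volumes, rates):
    """Volume-weighted quality estimate across samples (e.g., three months).
    Each month's influence is proportional to its closed-case volume."""
    return sum(v * r for v, r in zip(volumes, rates)) / sum(volumes)

# Three monthly samples with different closed-case volumes:
monthly_volumes = [1000, 2000, 3000]
monthly_rates = [0.90, 0.80, 0.85]
estimate = weighted_quality(monthly_volumes, monthly_rates)
# An unweighted mean of the three rates (0.85) would overweight the smallest month.
```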

21.10.1.3.3  (10-01-2012)
Quality Review Time Reporting

  1. See IRM 25.8.1, OFP Codes Overview, for appropriate Work Planning and Control (WP&C) Organization, Function, and Program (OFP) time reporting codes.

21.10.1.3.4  (10-01-2014)
Quality Review Records Retention

  1. Document 12990, Records Control Schedules, provides specific guidelines on the retention period for National Quality Review System (NQRS) records. National quality review printed reports may be destroyed when superseded or no longer needed. Source documentation relating to non-evaluative national and local product reviews may be destroyed after data input has been validated.

  2. EQRS records are systemically removed from the database after 5 years. NQRS records are removed after 7 years.

21.10.1.4  (10-01-2014)
Quality Review of Phone Calls

  1. CQRS monitors recorded contacts through the use of the Ultra 10 software system. These monitors are conducted for all Accounts Management toll-free phone SPRGs (except for National Taxpayer Advocate), ACS Phones for Compliance Services, e-help Phones for Electronic Products and Services Support and all Field Assistance (CARE) product lines (except for Adjustments). CPAS monitors the AUR Phones, BMF AUR Phones, Exam Phones and Innocent Spouse Phones SPRGs. PAS monitors the remaining Compliance Services Phone SPRGs. The data from these reviews may be used for the Business Results of the Balanced Measures.

  2. Managerial reviews of these phone calls are not included in the Business Results calculations; they are used in employee evaluative documentation and to identify training issues.

  3. The Verint Ultra 10 Contact Recording (CR) system is a telephone tool used by Accounts Management, Electronic Products and Services Support, and Compliance Services to record incoming "toll free" telephone contacts, some of which may be selected for quality review. Incoming calls are answered with an additional announcement that states, "Your call may be monitored or recorded for quality purposes." The Verint Ultra system records the audio, and occasionally captures the screen, of all telephone calls coming into the Service via the ASPECT Communication voice response unit. See IRM 21.10.1.2 (7) for more information on CR.

  4. Managers and Quality Analysts use CR to perform required random reviews of incoming telephone contacts. CR allows for a more cost-effective review as there is no lag time between calls. The PAS/CPAS analysts use a Shared In-Box to retrieve their daily sample. Reviewers must include the CR identification number or Router Call Key on the Data Collection Instrument used to capture the call review. Calls recorded in this system are available for National Review the next business day, and every effort should be made to complete the National Review daily.

  5. The system stores data by Standard Employee Identifier (SEID) for 45 days for calls that are not reviewed and 60 days for calls that are reviewed; retention of up to a maximum of 18 months can be requested.

    Note:

    At this time the COIC Phones SPRG will not use the CR system because they are not using the ASPECT Communication voice response unit.

  6. On a call that was Contact Recorded, if the taxpayer requests that the recording be stopped (known as "Stop on Demand"), CQRS/PAS/CPAS will not review the call. If a "Stop on Demand" call is randomly selected for the national sample, it will be rejected and systemically replaced by CR.

  7. When performing a telephone review, the analyst will use the employee's identification number provided on the call:

    • If the analyst is unable to capture the employee's identification number on the call, the last name of the employee, as captured during the review, will be entered into the Employee’s Name field.

    • CQRS will capture the employee's SEID (as it appears on CR from the PBXID field on Ultra 10) as the identification number.

      Note:

      If an employee's workstation is not configured properly, their extension will show and will be captured instead of the SEID.

    • PAS/CPAS analysts will use the following identifiers in the Employee Name field if the situation warrants:
      1. U – If the PAS/CPAS analyst is unable to determine both the ID number and the last name,
      2. N – If the employee does not give either an ID number or a name at any time during the conversation,
      3. I – If the PAS/CPAS analyst could not capture the full ID number or last name, but was able to get a portion of the ID number (fewer than 10 digits). For example, enter 99999985XX if the last two digits were not captured, or 99999XX521 if the middle two digits could not be deciphered.

  8. For a sample call to be counted as a phone review, the taxpayer does not have to remain on the line until all adjustment actions are complete. Even if the employee does not complete all work until the next business day, the call is still counted as part of the sample.

  9. If an Accounts Phone call subject becomes a Tax Law or ACS issue, or vice versa, code the complete call for Professionalism and Timeliness, and code any issue(s) addressed for Accuracy. If the call is transferred and no action was taken to resolve the taxpayer's issue, code the case for all applicable buckets except Customer Accuracy, which will be "not applicable." When this happens, code attribute 004 Call Transfer.

  10. The Master Attribute Job Aids (MAJAs) for the phone Product Lines and SPRGs are located at http://eq.web.irs.gov.

21.10.1.4.1  (10-01-2014)
Accounts Phones Product Line

  1. The Accounts Phones Product Line consists of seven Specialized Product Review Groups (SPRGs): Employer Identification Number (EIN), General Account Calls, International, National Taxpayer Advocate (NTA), Priority Practitioner Support (PPS), AM Identity Theft Phones, and Spanish Calls.

  2. Reviewing Accounts Phones allows us to monitor and improve the quality of responses to a taxpayer's questions about his/her account.

  3. Incorrect/incomplete action (per IRM guidelines) which results in incorrect calculations must exceed a $5.00 threshold before charging a Customer Accuracy defect for the national review. This should be charged as a Procedural defect only.

21.10.1.4.1.1  (10-01-2006)
Accounts Phones Measure

  1. Accounts Phones will be measured for Timeliness, Professionalism, Customer Accuracy, Regulatory/Statutory Accuracy and Procedural Accuracy. These are the measures that are available and may be reported under the Balanced Measurement System. See IRM 21.10.1.7.3 for more information on the measures.

21.10.1.4.1.2  (10-01-2003)
Definition of EIN Calls SPRG

  1. EIN calls include any questions received on the ASPECT EIN application(s), such as:

    1. Any call relating to a taxpayer’s request for an Employer Identification Number (EIN)

    2. Any call regarding procedural issues (how to complete Form SS-4, where to fax/mail Form SS-4 etc.)

21.10.1.4.1.3  (10-01-2014)
Sample Procedures for EIN Calls

  1. SOI develops sampling plans for EIN Phone calls monitored at CQRS. Samples from CQRS are valid at the site level on a bi-monthly basis.

  2. CR and the associated screen shots may be used by managers or CQRS to evaluate the contact.

  3. A site may want to perform local reviews to aid in the quality improvement of the product line. Local review sampling guidelines have been included. See IRM 21.10.1.3.2 and IRM 21.10.1.3.2.1.

  4. Managerial reviews are not subject to a sampling plan.

21.10.1.4.1.4  (10-01-2006)
Definition of General Account Calls SPRG

  1. General Account Calls include any questions received on the ASPECT Accounts Phone Balance Due, Advanced Accounts or Procedural applications for General Account Calls. This does not include calls received on the designated Spanish or International Account applications. General Accounts calls include:

    1. Any call relating to a taxpayer's account (Individual Master File (IMF) or Business Master File (BMF))

    2. Any call regarding entity information, the processing of a tax return, corrections to errors found during processing, or corrections resulting from adjustments or audit assessments

    3. Any call regarding procedural issues (where to file a return, when and where to make payments, etc.)

21.10.1.4.1.5  (10-01-2008)
Sample Procedures for General Accounts Calls SPRG

  1. SOI develops sampling for Account Phone calls monitored at CQRS. Samples from CQRS are valid at the site level on a monthly basis.

  2. CR and the associated screen shots may be used by managers or CQRS to evaluate the contact.

  3. A site may want to perform local reviews to aid in the quality improvement of the product line. Local review sampling guidelines have been included. See IRM 21.10.1.3.2 and IRM 21.10.1.3.2.1.

  4. Managerial reviews are not subject to a sampling plan.

21.10.1.4.1.6  (10-01-2003)
Definition of International Calls SPRG

  1. International Calls include any questions received on the designated ASPECT applications for International, such as:

    1. Any international (foreign, non-resident, etc.) call relating to a taxpayer's account (IMF or BMF)

    2. Any international call regarding entity information, the processing of a tax return, corrections to errors found during processing or corrections resulting from adjustments or audit assessments

    3. Any international call regarding procedural issues (where to file a return, when and where to make payments, etc.)

21.10.1.4.1.7  (10-01-2014)
Sample Procedures for International Calls

  1. SOI Staff develops a combined sample plan for International Tax Law and Accounts Phone calls monitored at CQRS. Samples from CQRS are valid at the site level on a monthly basis.

  2. CR and the associated screen shots may be used by managers or CQRS to evaluate the contact.

  3. A site may want to perform local reviews to aid in the quality improvement of the product line. Local review sampling guidelines have been included. See IRM 21.10.1.3.2 and IRM 21.10.1.3.2.1.

  4. Managerial reviews are not subject to a sampling plan.

21.10.1.4.1.8  (10-01-2014)
Definition of NTA Calls SPRG

  1. National Taxpayer Advocate (NTA) calls include any calls relating to a taxpayer's account (IMF or BMF) received on the designated NTA ASPECT applications.

21.10.1.4.1.9  (10-01-2008)
Sample Procedures for NTA Calls

  1. CR and the associated screen shots may be used by managers to evaluate the contact.

  2. A site may want to perform local reviews to aid in the quality improvement of the product line. Local review sampling guidelines have been included. See IRM 21.10.1.3.2 and IRM 21.10.1.3.2.1.

  3. Managerial reviews are not subject to a sampling plan.

21.10.1.4.1.10  (10-01-2003)
Definition of PPS SPRG

  1. PPS Calls include any call from a tax practitioner relating to his or her client's/taxpayer's account (IMF or BMF) or any other questions received on the ASPECT applications for PPS.

21.10.1.4.1.11  (10-01-2014)
Sample Procedures for PPS

  1. SOI develops sampling plans for PPS calls monitored at CQRS. Samples from CQRS are valid at the site and national level on a monthly basis.

  2. CR and the associated screen shots may be used by managers or CQRS to evaluate the contact.

  3. A site may want to perform local reviews to aid in the quality improvement of the product line. Local review sampling guidelines have been included. See IRM 21.10.1.3.2 and IRM 21.10.1.3.2.1.

  4. Managerial reviews are not subject to a sampling plan.

21.10.1.4.1.12  (10-01-2003)
Definition of Spanish Tax Law and Account Calls SPRG

  1. Spanish Calls include any questions received on the ASPECT applications for Spanish Calls including:

    • Any call relating to a tax law question from the taxpayer

    • Any call relating to a taxpayer's account (IMF or BMF)

    • Any call regarding entity information, the processing of a tax return, corrections to errors found during processing, or corrections resulting from adjustments or audit assessments

    • Any call regarding procedural issues (where to file a return, when and where to make payments, etc.)

21.10.1.4.1.13  (10-01-2014)
Sample Procedures for Spanish Account Calls

  1. SOI Staff develops a combined sampling plan for Spanish Tax Law and Spanish Accounts calls monitored at CQRS. Samples from CQRS (which include Spanish Accounts and Spanish Tax Law combined) are valid at the site level on a monthly basis. Statistical validity at the site level varies for Puerto Rico based on the time of year. For all other sites combined, the statistical validity is monthly.

  2. CR and the associated screen shots may be used by managers or CQRS to evaluate the contact.

  3. A site may want to perform local reviews to aid in the quality improvement of the product line. Local review sampling guidelines have been included. See IRM 21.10.1.3.2 and IRM 21.10.1.3.2.1.

  4. Managerial reviews are not subject to a sampling plan.

21.10.1.4.1.14  (10-01-2014)
Definition of AM Identity Theft Phones

  1. Accounts Management (AM) Identity Theft Calls include any call from an individual to report that their SSN or ITIN has been misused to obtain goods or services, to report other complaints of identity theft, and/or to request protection of their tax account information, as well as any questions received on the ASPECT ID Theft Phone application(s). See IRM 21.9.2.1, Identity Theft - General Information, and IRM 21.9.2.2, Identity Theft - Expanded Procedures, for additional guidance.

21.10.1.4.1.15  (03-10-2014)
Sample Procedures for AM Identity Theft Phones

  1. SOI develops sampling for ID Theft Phone calls monitored at CQRS. Samples from CQRS are valid at the site level on a monthly basis.

  2. Ultra (Contact Recording) and the associated screen shots may be used by managers or CQRS to evaluate the contact.

  3. A site may want to perform local reviews to aid in the quality improvement of the product line. Local review sampling guidelines have been included. See IRM 21.10.1.3.2 and IRM 21.10.1.3.2.1.

  4. Managerial reviews are not subject to a sampling plan.

21.10.1.4.2  (10-01-2002)
ACS Phones Product Line

  1. Automated Collection System (ACS) Phone calls are reviewed to measure and improve the quality of our responses to taxpayer inquiries about balance due and return delinquency accounts.

21.10.1.4.2.1  (04-08-2008)
Definition of ACS Phones Product Line

  1. ACS is a computerized inventory system of balance due accounts and return delinquency accounts after normal notice routines occur.

  2. ACS Phones calls are defined as any call received on an IMF or BMF account in Status 22 or Taxpayer Delinquency Investigation (TDI) status assigned to ACS, and any other calls received on the ASPECT ACS application.

  3. Incorrect/incomplete action (per IRM guidelines) which results in incorrect calculations must exceed a $5.00 threshold before charging a defect for the national review.

21.10.1.4.2.2  (10-01-2014)
Sample Procedures for ACS Phones

  1. SOI Staff develops sampling plans for ACS Phones' calls monitored at CQRS. Samples from CQRS are valid at the site and national level on a monthly basis.

  2. All ACS Phones’ calls will be included in the universe of calls subjected to sampling per the SOI algorithm. This includes cases reviewed by managers, On-the-Job Instructors (OJIs), and cases subjected to 100 percent review.

  3. Local reviews are not performed for the national measure of the ACS Phones product line and therefore, are not included in the sampling plan. However, if time and staffing permits, each site should also perform local reviews to aid in the quality improvement of the product line. See IRM 21.10.1.3.2 and IRM 21.10.1.3.2.1 for more information on local reviews.

  4. CR and the associated screen shots may be used by managers or CQRS to evaluate the contact.

  5. Managerial reviews are not subject to a sampling plan.

21.10.1.4.2.3  (10-01-2003)
ACS Phones Measures

  1. ACS Phones will be measured for Timeliness, Professionalism, Customer Accuracy, Regulatory/Statutory Accuracy, and Procedural Accuracy. These are the measures that are available and may be reported under the Balanced Measurement System. See IRM 21.10.1.7.3 for more information on the measures.

21.10.1.4.2.4  (10-01-2011)
Roles and Responsibilities of the ACS Phones Analyst

  1. The Centralized Quality Review System (CQRS) Analyst for ACS Phones will complete an unbiased, consistent, and accurate review of ACS Phones including follow-up actions taken once the taxpayer has hung up. Even if the call site employee does not complete all work until the next business day, the call is still counted as part of the sample.

  2. ACS Phones Quality analysts will complete an NQRS Data Collection Instrument (DCI) for each case reviewed. All appropriate fields will be completed to indicate quality standards having been met or not met. Analysts’ narratives will provide the basis for their findings and include applicable IRM references for coded defects. Whenever Attribute 715 is coded with a defect, the driver must be indicated in parenthesis immediately following "715" in the Feedback/Summary Remarks on the DCI, i.e., N=715, Correct/Complete Response Resolution (003). NQRS reviews will provide a basis for defect analysis and reporting error trends.

    Note:

    Before Attribute 508 Appropriate Procedural Action/Answer is coded, the analyst should research to see if another attribute describes the action/answer given by the employee. Only use this attribute as a last resort. When it is decided to code Attribute 508 either "Y" or "N" , clearly explain in the Feedback Summary section of the DCI why this attribute was selected. If a defect is charged, clearly describe the defect and provide an IRM reference to support the coding.

  3. Enter the word FLASH in the Feedback Summary Remarks section of the DCI to identify a defect that requires immediate (by the next business day) corrective action by the operation. For example, recalling a notice/letter before it is issued or correcting an adjustment to an account. See IRM 21.10.1.7.7, EQRS/NQRS Remarks Section for additional information.

    Note:

    Defects requiring correction that are not annotated with FLASH are to be completed by the Operation within five working days. See IRM 21.10.1.2.7.5.

  4. Review data will be input to NQRS within 24 hours of the review.

  5. Refer to the Embedded Quality Website (http://eq.web.irs.gov) weekly to glean updated information on the use of attributes in the ACS Phones review, obtain the latest Master Attribute Job Aid (MAJA) and Quality Grams, etc.

  6. Consult the ACS Phones Product Line Analyst for coding assistance or to interpret attribute usage, whenever necessary.

21.10.1.4.3  (10-01-2014)
ASFR Phones Product Line

  1. Automated Substitute for Returns (ASFR) Phones are reviewed to measure and improve the quality of responses given to taxpayer inquiries received on the ASFR and ASFR Refund Hold toll-free lines.

21.10.1.4.3.1  (10-01-2013)
Roles and Responsibilities of the ASFR Phone Analyst

  1. The ASFR Phone PAS Analysts will complete an unbiased, consistent, and accurate review of ASFR and ASFR Refund Hold calls.

  2. ASFR Phones PAS Analysts will review the entire call to identify actions required. Analysts will ensure that appropriate actions are updated on IDRS, ASFR systems and/or AMS and the actions taken clearly support the disposition of the call as required by the procedural IRM of the SPRG.

  3. ASFR Phone PAS Analysts will complete an NQRS Data Collection Instrument (DCI) for each case reviewed using the MAJA for the SPRG as guidance for coding. All appropriate fields will be completed to indicate quality standards having been met or not met. Analysts’ narratives will provide the basis for their findings and include applicable IRM references for coded defects. NQRS reviews will provide a basis for defect analysis and reporting error trends.

  4. Enter the word FLASH in the Feedback Summary Remarks section of the DCI to identify a defect that requires immediate (by the next business day) corrective action by the operation. For example, recalling a notice/letter before it is issued or correcting an adjustment to an account. See IRM 21.10.1.7.7, EQRS/NQRS Remarks Section for additional information.

    Note:

    Defects requiring correction that are not annotated with FLASH are to be completed by the Operation within five working days. See IRM 21.10.1.2.7.5.

  5. Review data will be input daily to NQRS, whenever possible.

  6. Refer to the Embedded Quality Website (http://eq.web.irs.gov) weekly to glean updated information on the use of attributes, obtain the latest Quality Job Aid, Quality Gram, etc.

  7. In a monthly report to be shared with the ASFR Operation, provide suggestions for improvement by:

    • Identifying most frequently occurring defects

    • Analyzing root causes of defects

    • Verifying sampling plans and guidelines

    • Reviewing methods used to capture needed information

  8. It is also recommended that NQRS analysts meet at least quarterly with the EQRS analysts to confer, compare and review attribute usage. These meetings should be used as a forum to discuss and agree on the use of each attribute in the ASFR Phones Smart DCI. The Product Line Analyst should be consulted for assistance with interpreting attribute usage when necessary.

21.10.1.4.3.2  (04-08-2008)
Definition of ASFR Phone Product Line

  1. ASFR Phones are defined as any call received as a result of ASFR-issued 30-day (2566) and 90-day (3219) letters or Refund Hold (CP63) letters generated from IDRS.

  2. CR and the associated screen shots (when available) may be used by managers or PAS to evaluate the contact.

  3. Incorrect/incomplete action (per IRM guidelines) which results in incorrect calculations must exceed a $5.00 threshold before charging a defect for the national review.

21.10.1.4.3.3  (10-01-2013)
Sample Procedures for ASFR Phones

  1. SOI develops sampling plans for ASFR Phones monitored by the sites at the request of the Quality Performance Measurement staff (QPM). Samples from the sites are weighted daily and are valid at the site level and national level on a quarterly basis.

  2. The National Review will consist of a daily random sample of calls covering the entire telephone operational day. See IRM 21.10.1.3.2 and IRM 21.10.1.3.2.1 for sampling guidelines.

  3. The PAS Analyst must sample and review telephone calls while the caller is on-line or by using contact recording (CR). Verify that all input actions are completed within two business days from the day the call was received.

  4. CR and the associated screen shots should be used when evaluating the contact.

  5. The PAS Analyst must monitor a complete call, from start to finish, for the call to be considered a valid part of the sample.

  6. EQRS reviews are not subject to a sample plan.
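The daily random sample described in paragraph 2 can be sketched as a simple random draw from the day's calls. This is a minimal illustration only: the actual SOI sampling plans are statistically weighted and are not reproduced here, and the function name and seed parameter are assumptions for the sketch.

```python
import random

def daily_random_sample(call_ids, sample_size, seed=None):
    """Illustrative sketch: draw a simple random sample of calls logged
    across the entire telephone operational day. The real SOI plan
    applies daily weighting, which this sketch omits."""
    rng = random.Random(seed)  # seeded for repeatability in this sketch
    calls = list(call_ids)
    if sample_size >= len(calls):
        return calls
    return rng.sample(calls, sample_size)
```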

21.10.1.4.3.4  (10-01-2007)
ASFR Phones Measures

  1. ASFR Phones will be measured for Timeliness, Professionalism, Customer Accuracy, Regulatory/Statutory Accuracy and Procedural Accuracy. These are the measures that are available and may be reported under the Balanced Measurement System. See IRM 21.10.1.7.3 for more information on the measures.

21.10.1.4.4  (10-01-2014)
AUR Phones Product Line

  1. Automated Underreporter (AUR) Phones is a separate product line in NQRS and is reviewed within the Embedded Quality system. There are two (2) Specialized Product Review Groups (SPRGs) under the AUR Phones Product Line.

    • AUR Phones

    • BMF AUR Phones

  2. The AUR Phones Product Line is reviewed to measure and improve the quality of responses given to taxpayer/caller inquiries received on the AUR toll-free telephone lines.

  3. CPAS Analysts and managers will primarily use IRM 4.19.3, IMF Automated Underreporter, IRM 4.119.4, BMF Underreporter (BMF-AUR), IRM 21.10.1 and the Attribute Matrix Job Aids during the review of AUR Phone Calls.

21.10.1.4.4.1  (10-01-2012)
Roles and Responsibilities of the AUR Phones Product Line Analyst

  1. The CPAS Analysts will complete an unbiased, consistent, and accurate review of AUR Phones and BMF AUR Phones cases, including follow-up actions taken once the taxpayer has hung up. Even if the employee does not complete all work until the next business day, the call is still counted as part of the sample.

  2. For each call reviewed, complete an NQRS DCI indicating whether or not the quality standards were met. Provide as much information as possible, including all IRM references for coding defects. This review should identify defects as outlined in IRS Regulations and IRM guidelines and procedures, and provide a basis for error analysis and error trends.

  3. Enter the word FLASH in the Feedback Summary Remarks section of the DCI to identify a defect that requires immediate (by the next business day) corrective action by the operation. For example, recalling a notice/letter before it is issued or correcting an adjustment to an account. See IRM 21.10.1.7.7, EQRS/NQRS Remarks Section for additional information.

    Note:

    Defects requiring correction that are not annotated with FLASH are to be completed by the Operation within five working days. See IRM 21.10.1.2.7.5.

  4. Review data will be input daily to NQRS.

  5. Refer to the Embedded Quality Website (http://eq.web.irs.gov) weekly to obtain updated information on the use of attributes in the Operations, the latest Quality Grams, etc.

  6. In a quarterly report, which is to be shared with the Operations, provide suggestions for improvement by:

    • Identifying the most frequently occurring defects

    • Analyzing root causes of defects

    • Reviewing guidelines or sampling procedures

    • Reviewing methods used to capture needed information
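The correction deadlines in paragraph 3 above (FLASH defects due the next business day; all other defects due within five working days) can be sketched as a due-date calculation. This is a hypothetical illustration: the function names are assumptions, and the sketch counts only weekends as non-working days, ignoring federal holidays.

```python
from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    """Count forward the given number of working days (Mon-Fri),
    skipping weekends. Federal holidays are ignored in this sketch."""
    current = start
    remaining = days
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 .. Friday=4
            remaining -= 1
    return current

def correction_due_date(feedback_date: date, is_flash: bool) -> date:
    """FLASH defects require corrective action by the next business day;
    all other defects are due within five working days."""
    return add_business_days(feedback_date, 1 if is_flash else 5)
```

For example, a FLASH defect fed back on a Friday would be due the following Monday, while a non-FLASH defect fed back the same day would be due the Friday after.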

21.10.1.4.4.2  (04-08-2008)
Definition of AUR Phones SPRG

  1. AUR Phones is defined as any phone call received on the AUR toll-free line through the ASPECT system on an IMF account by the Automated Underreporter Operation. These include both open AUR cases and closed AUR cases in the Reconsideration process, as well as any immediate account actions the AUR employee must take once the taxpayer terminates the call. The call is still counted as part of the sample even if the employee does not complete all work until the next business day.

  2. An incorrect or incomplete action (per IRM guidelines) that results in an incorrect calculation is charged as a defect for the national review only when the calculation error exceeds the $5.00 threshold.
