4.26.18  Embedded Quality

Manual Transmittal

February 07, 2013


(1) This transmits revised text for IRM 4.26.18, Bank Secrecy Act, Embedded Quality.

Material Changes

(1) Terms and titles were updated to current usage.

(2) Citations were renumbered from 31 CFR Part 103 to 31 CFR Chapter X, effective March 1, 2011.

Effect on Other Documents

This supersedes IRM 4.26.18, dated July 1, 2007. This section includes the contents of the BSA National Quality Review Staff desk guide.


The intended audience is employees of the Bank Secrecy Act Program in the Small Business/Self-Employed (SB/SE) Division.

Effective Date

(02-07-2013)

William P. Marshall
Director, Fraud/BSA
Small Business/Self-Employed

Introduction to Embedded Quality (07-01-2007)

  1. Embedded Quality is an automated quality review system which allows reviewers to rate case quality using quality standards called "attributes." The use of the same attributes for both managerial (EQRS) and national (NQRS) reviews provides consistency and comparability.

  2. The Embedded Quality Review System (EQRS) is used by managers to document all case-related reviews of employees, including closed case reviews, 4502 reviews, workload reviews, etc. Attributes in EQRS are linked to Critical Job Elements (CJEs) to assist managers in performance evaluation. Procedures for EQRS may be found in the BSA Manager's Tool Kit and in BSA EQRS training course #19054.

  3. The National Quality Review System (NQRS) is used by BSA national reviewers to review both sample and mandatory review cases. Sample cases are collected using a statistically valid sample so results are representative of the entire BSA case universe.

  4. NQRS data is used by management to assess program performance and to identify training and educational needs to improve work quality.

NQRS Responsibilities (02-07-2013)

  1. The NQRS program is administered within the Workload Identification, Selection, Delivery, and Monitoring (WISDM) section of BSA Policy.

  2. BSA national quality reviewers, supervised by a WISDM Section Chief, are responsible for the following:

    1. Review of a statistically valid sample of Title 31 closed cases from the Enterprise Computing Center – Detroit (ECC-DET);

    2. Review of a statistically valid sample of Form 8300 closed cases from the ECC-DET;

    3. Review of Title 31 Letter 1112 follow-up cases as part of the statistical sampling process and as assigned by the WISDM Section Chief;

    4. Review of Form 8300 follow-up penalty cases as part of the statistical sampling process and as assigned by the WISDM Section Chief; and

    5. Review of mandatory review cases including intentional disregard penalty cases for routing to Appeals as assigned by the WISDM Section Chief.

  3. BSA national quality reviewers are responsible for other duties and assignments including:

    1. Conducting case quality discussions, reviews, and presentations at the territory and group levels, as requested;

    2. Assisting with territory operational reviews, as requested;

    3. Providing feedback to the territories by way of NQRS Newsletters and Quality Alerts on significant issues noted in NQRS case reviews; and

    4. Reviewing monthly EQRS/NQRS quality reports for trends and patterns.

  4. To accomplish these objectives, BSA national reviewers should ensure:

    • Timely completion of NQRS case reviews;

    • Accurate and consistent application of the attributes;

    • Accuracy of NQRS database input; and

    • Timely responses to the groups and territories.

Attributes (02-07-2013)

  1. An attribute is defined as a specific aspect of case work that the review measures. "Attribute" is synonymous with "quality standard." Attributes correspond to, and replace, the quality standards formerly used by SB/SE.

  2. Attributes are concise statements of SB/SE BSA's expectations for quality examinations and are guidelines to assist examiners to fulfill their professional responsibilities. Each attribute is defined by elements representing components that are present in a quality examination.

  3. Attributes provide objective criteria against which case quality is assessed and are grouped into the following five quality measurement categories:

    • Timeliness,

    • Professionalism,

    • Regulatory accuracy,

    • Procedural accuracy, and

    • Customer accuracy.

  4. Attributes directly link to the examiner's critical job elements (CJEs) in EQRS.

  5. Attributes are divided into six categories for ease of data input, as follows:

    • Exam planning (100s),

    • Investigative/examination techniques (400s),

    • Timeliness (500s),

    • Customer relations/professionalism (600s),

    • Documentation/reports (700s), and

    • Customer accuracy (800).
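
The series numbering above amounts to a simple range lookup. The sketch below is illustrative only and is not part of the NQRS software; it maps a three-digit attribute number to the data-input category series listed above.

```python
# Illustrative sketch only; not part of the NQRS software.
# Maps a three-digit attribute number to the data-input category
# series listed above (100s, 400s, 500s, 600s, 700s, 800).
ATTRIBUTE_SERIES = {
    1: "Exam planning",
    4: "Investigative/examination techniques",
    5: "Timeliness",
    6: "Customer relations/professionalism",
    7: "Documentation/reports",
    8: "Customer accuracy",
}

def input_category(attribute_number: int) -> str:
    """Return the data-input category for an attribute number such as 512."""
    series = attribute_number // 100  # leading digit of a three-digit code
    if series not in ATTRIBUTE_SERIES:
        raise ValueError(f"unknown attribute series for {attribute_number}")
    return ATTRIBUTE_SERIES[series]

print(input_category(512))  # Timeliness
```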

  6. The BSA SharePoint site and the EQ Home page provide reference materials to aid the reviewer in rating cases. These reference materials include:

    1. The BSA NQRS Embedded Quality Job Aide, which contains attribute definitions, available reason codes, points to consider, case examples, IRM references, attribute rating guides, and applicable measurement categories;

    2. The BSA NQRS Attribute Reference Guide, which has similar information as the Job Aide, condensed into chart format but adds references to BSA reengineering work papers; and

    3. CJE to Attribute Crosswalks, which match the attributes to the critical job elements for BSA Revenue Agent and BSA Compliance Officer positions.

Scoring System (02-07-2013)

  1. Each attribute and aspect is worded as a question, a format that minimizes the need for interpretation. Each is scored as:

    • A "Yes" meaning met,

    • A "No" meaning not met, or

    • An "N/A" meaning not applicable.
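
As a simple illustration of how "Yes"/"No"/"N/A" ratings could roll up into a quality figure, the sketch below computes the percent of applicable attributes met, with "N/A" ratings excluded. This is an assumption for illustration only; the actual EQ report formulas are defined by the EQ system.

```python
# Illustrative sketch only; the actual EQ score formulas are defined
# by the EQ system and are not reproduced here.
def percent_met(ratings: dict[str, str]) -> float:
    """Percent of applicable attributes rated "Yes"; "N/A" is excluded."""
    applicable = [r for r in ratings.values() if r in ("Yes", "No")]
    if not applicable:
        return 0.0  # nothing applicable was rated
    return 100.0 * applicable.count("Yes") / len(applicable)

# Hypothetical attribute codes and ratings for one reviewed case.
ratings = {"415": "Yes", "508": "No", "617": "Yes", "702": "N/A"}
print(round(percent_met(ratings), 1))  # 66.7
```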

  2. This scoring system provides maximum flexibility for reviewers, while emphasizing select key elements and aspects that represent requirements for a quality examination.

Key Elements (02-07-2013)

  1. Key elements and aspects are those selected items that are prerequisites of a quality case and are essential to IRS goals.

  2. Every attribute contains one or more key elements/aspects.

Case Selection (02-07-2013)

  1. CTR Operations at the ECC-DET selects a valid sample of closures for NQRS review. The sample size is determined statistically to provide a valid sample at the national level. The sampling plan is determined from the number of closures per the annual BSA work plan.

  2. CTR Operations separately selects Title 31 cases and Form 8300 cases using the statistically valid sample selection rate determined for that type of case.

  3. Survey cases are included in the closures from which the sample is drawn.

  4. CTR Operations sends all selected cases to the WISDM Section Chief. The Section Chief assigns the cases to the reviewers.

Other Cases Subject to NQRS Review (02-07-2013)

  1. In addition to the closed case statistical sampling from CTR Operations, the following Title 31 cases are also subject to NQRS review:

    • Letter 1112 follow-up cases based on the statistically valid sample for Title 31 cases and as assigned by the WISDM Section Chief; and

    • Special project cases, as designated.

  2. In addition to the closed case statistical sampling from CTR Operations, the following Form 8300 cases are also subject to NQRS review:

    • Mandatory review of all Form 8300 intentional disregard penalty cases being sent to Chief, Appeals;

    • Form 8300 follow-up penalty cases based on the statistically valid sample for Form 8300 cases and as assigned by the WISDM Section Chief; and

    • Special project cases, as designated.

  3. Mandatory reviews of Form 5104 referral cases are currently conducted by BSA headquarters analysts.

  4. Mandatory reviews of Title 31 and Form 8300 headquarters examination cases are currently conducted by BSA headquarters analysts.

  5. Mandatory review cases are to be sent by the group manager with a Special Handling Notice.

Case Review Procedures (02-07-2013)

  1. The first step in the Embedded Quality process is the review of the case. The purpose of the national BSA quality review process is to provide management with measures on how effectively work processes are carried out and how these processes can be improved.

  2. National reviewers are responsible for reviewing BSA cases for quality using five measurement categories:

    • Timeliness (Efficient issue resolution using workload management and time utilization techniques);

    • Professionalism (Effective communication techniques);

    • Regulatory accuracy (Adhering to statutory/regulatory process requirements);

    • Procedural accuracy (Non-statutory/regulatory internal process requirements); and

    • Customer accuracy (Correct and complete case resolution with no adverse impact on the customer).

  3. It is important that reviewers have an in-depth understanding of the EQ attributes and process measures, as well as a solid understanding of BSA law, requirements, and procedures. Furthermore, it is important that reviews are complete, accurate, and consistent to yield a true and valid sample that management can rely on to improve examination processes. See "Consistency of Reviews" below for details.

  4. Closed case review results are entered into a web-based Data Collection Instrument (DCI) on the National Quality Review System (NQRS). Detailed input procedures are provided below.

  5. Generating Reports: Once the review is completed, the results are included in the raw data from which the EQ system generates reports. Reports for Title 31 and Form 8300 case reviews are generated separately to ensure separation of the Title 31 and Title 26 data. See the discussion of reports below.

  6. For help in reviewing cases, reviewers can access reference materials on the BSA SharePoint Drive and the Embedded Quality (EQ) home page. Available reference materials include:

    1. The BSA NQRS Job Aide

    2. Attribute Reference Guides and Crosswalks

    3. Link to the BSA IRM 4.26

    4. 31 CFR Chapter X

    5. Link to FinCEN's website

    6. Link to the Embedded Quality Home Page

    7. Embedded Quality Users Guide

NQRS Reviewers Guide (02-07-2013)

  1. The Embedded Quality Job Aide serves as a guide for completing the NQRS database input screens, including headers and attributes. A separate job aide has been developed for EQRS.

  2. The NQRS Job Aide provides operational and job aide definitions, points to consider, reason codes (if available), rating guide explanations, attribute examples, and IRM references for each attribute in the guide.

Introduction to NQRS Database (02-07-2013)

  1. Case reviews are input into the web-based system known as the National Quality Review System (NQRS).

  2. Each case review is input using a Data Collection Instrument (DCI). The DCI guides the review process and documents the reviewer's comments. This documentation can be extracted from the system through statistical and informational analysis reports.

  3. The NQRS database is accessed through the EQ home page at http://mysbse.web.irs.gov/mgrsact/eq/default.aspx.

  4. Reviewers must complete an Online Form 5081 to obtain a password to access the NQRS system.

Completion of Database Input (07-01-2007)

  1. The DCI provides the principal documentation for the reviewer's case evaluation and conclusions. A DCI is completed for each case reviewed in the NQRS system.

  2. Header input procedures: The first input section of the DCI is the data input of headers that capture basic closed case information, such as "Who, What, When, Where, and Why." This information is categorized into four groupings:

    • Review information: considers important review data

    • Case information: considers basic case information

    • Process measures: considers case actions taken by the examiner

    • Special Use: considers special tracking for local or national purposes


      Note: Header fields in bold are mandatory fields that must be filled in to complete the DCI.

  3. Evaluating and Coding the Attributes: The next step in the case review process is the evaluation of the case by rating attributes that are specific to a Specialized Product Review Group (SPRG). BSA is one of the SPRGs. Attributes on the DCI are either required or optional. All required attributes (in bold) must be rated "Yes" or "No" in order for the DCI to be recorded as complete. Other attribute fields should be completed if information is available to ensure an accurate quality case review.

  4. Attribute reason codes: Most of the attributes have reason codes. Once the attributes are rated, at least one reason code must be selected (where reason codes are present) for each attribute rated as not met ("N"). The reason code that best applies should be selected. If none apply, "Other" may be selected and a narrative should be entered describing the reason.

  5. Reason Code Narratives: In addition to selecting a reason code on all attributes rated "N," the reviewer is required to write a narrative in the attribute narrative box describing specifically why a case failed a particular attribute. Positive comments may also be entered for attributes that have been met and are encouraged in order to provide a complete and balanced case review.
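
The completion rules above (required attributes rated "Yes" or "No," plus a reason code and narrative for every "N" rating) can be expressed as a small validation routine. The sketch below is hypothetical: the field and function names are invented for illustration and this is not the actual NQRS implementation.

```python
# Hypothetical sketch of the DCI completion checks described above.
# Field and function names are invented for illustration; this is
# not the actual NQRS implementation.
def dci_completion_errors(attributes: list[dict]) -> list[str]:
    """Return the list of problems blocking completion of a DCI."""
    errors = []
    for attr in attributes:
        code = attr["code"]
        rating = attr.get("rating", "N/A")
        # Required attributes must be rated "Yes" or "No".
        if attr.get("required") and rating not in ("Yes", "No"):
            errors.append(f"Attribute {code}: required, must be rated Yes or No")
        # Every "No" rating needs at least one reason code and a narrative.
        if rating == "No":
            if not attr.get("reason_codes"):
                errors.append(f"Attribute {code}: select at least one reason code")
            if not attr.get("narrative"):
                errors.append(f"Attribute {code}: narrative required for a No rating")
    return errors

dci = [
    {"code": "415", "required": True, "rating": "Yes"},
    {"code": "508", "required": True, "rating": "No",
     "reason_codes": [], "narrative": ""},
]
for problem in dci_completion_errors(dci):
    print(problem)
```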

  6. Narrative Guidelines: NQRS reviewers need to be thorough in documenting case performance by providing clear, concise, and specific descriptions of any errors in order to identify problem areas. Positive actions should also be described in specifics. Citing IRM or other references adds validity to the narratives. Narrative remarks that are too general may not offer sufficient detail to allow either specific recommendations for improvement or recognition of good performance.

  7. Completing the DCI: Once the headers, attributes, and narratives are completed, the reviewer should select the "Complete Review" screen. At this point, the EQ system notifies the reviewer if there are any required items needed to complete the case review and/or provides suggestions of recommended items. The reviewer will not be able to complete the case review until all required input is addressed.

Consistency of Reviews (07-01-2007)

  1. Since the attribute ratings in NQRS form the basis of quality performance measurement for BSA, it is important that attribute ratings are complete and accurate. Reviewers must strive to be consistent over time in rating similar case actions on all reviews.

  2. Incomplete or inaccurate national quality reviews can lead to a distortion of Business Results - Quality Performance. Further, general narrative remarks may not provide sufficient detail to allow for meaningful drivers for improvement.

  3. Guidelines for consistent reviews:

    • Rate all attributes that apply to the case being reviewed.

    • Rate good performance as well as errors/opportunities for improvement.

    • There is no need to rate actions that are not applicable to the case being reviewed; the system will automatically rate them as "N/A".

    • Multiple reason codes may be selected for multiple errors.

    • Input specific narrative comments for both positive and negative actions to ensure more accurate reviews.

Consistency Checks (02-07-2013)

  1. Consistency checks of national reviews can be performed in several ways:

    1. Reviewers can independently evaluate the same case. Subsequent comparison of the ratings and discussion can reveal inconsistent treatment.

    2. The WISDM Section Chief can critique completed case reviews on a regular basis and provide feedback to reinforce expectations of the reviews. Also, available management information reports can be used to evaluate consistency.

    3. To facilitate consistency between NQRS and EQRS, reviewers can lead group meeting discussions on specific elements and case scenarios to develop consensus.
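
As a rough illustration of the first check (two reviewers independently evaluating the same case), percent agreement between their attribute ratings could be computed as follows. This is an illustrative sketch only, not an actual NQRS consistency report.

```python
# Illustrative sketch only; not an actual NQRS consistency report.
def percent_agreement(ratings_a: dict[str, str], ratings_b: dict[str, str]) -> float:
    """Percent of attributes rated by both reviewers on which they agree."""
    common = set(ratings_a) & set(ratings_b)
    if not common:
        return 0.0
    matches = sum(1 for code in common if ratings_a[code] == ratings_b[code])
    return 100.0 * matches / len(common)

# Hypothetical ratings of the same case by two reviewers.
reviewer_a = {"415": "Yes", "508": "No", "617": "Yes", "702": "N/A"}
reviewer_b = {"415": "Yes", "508": "Yes", "617": "Yes", "702": "N/A"}
print(round(percent_agreement(reviewer_a, reviewer_b), 1))  # 75.0
```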

  2. Consistency checks of EQRS are to be conducted by Territory Managers on a regular basis. National reviewers may be called in to assist in this process.

NQRS Reports (07-01-2007)

  1. The NQRS database includes basic pro forma reports designed to assist in analyzing the data collected through case reviews. These reports include:

    • DCI Reports (Shows attribute scores and narratives);

    • Organizational Reports (Embedded Quality Score, Interim Quality Score, Measurement Category Score, Attribute Results, Top Ten Defects/Successes, and Customer Drivers);

    • Attribute Narrative Report; and

    • Ad hoc Reports.

  2. Reports are generated separately for Title 31 and Form 8300 case work.

  3. NQRS reports can be generated periodically, as needed. At a minimum, these reports are generated and analyzed on a quarterly basis.

  4. The NQRS Reviewer's Guide should be used when interpreting or analyzing NQRS reports. The definitions of attributes and process measures are narrow, and the rules for ratings are specific to the individual elements.

Management Use of NQRS Data (07-01-2007)

  1. NQRS data is used by management to assess program quality performance and to identify training and education needs.

  2. NQRS data cannot be used to evaluate individual employee performance. This is the role of EQRS, where attributes are linked to Critical Job Elements (CJEs).

Disposition of Cases After Review (02-07-2013)

  1. Statistically sampled review cases from CTR Operations at the ECC-DET are returned to ECC-DET by the NQRS reviewers upon completion of the case reviews.

  2. Statistically sampled Title 31 Letter 1112 follow-up case reviews are returned to the originating WISDM group.

  3. Statistically sampled Form 8300 follow-up penalty case reviews are returned to the originating WISDM group.

  4. Mandatory review cases are returned as directed by the WISDM Section Chief.
