3.30.28  Individual Review

3.30.28.1  (03-01-2007)
Overview

  1. The purpose of this Section is to define the scope and quality requirements, identify the quality processes and responsibilities, and explain the terms used throughout this IRM.

3.30.28.1.1  (03-01-2007)
Quality Assurance Program (QAP)

  1. The Individual Review Manual provides general instructions for maintaining the Quality Assurance Program (QAP) on campuses, and specific instructions for doing individual review.

    Note:

    Time spent performing 100 percent reviews must not be charged to function 880 and program 00XXX.

  2. The QAP provides information, gathered through individual review, for campuses to examine broader quality issues. The information forms the basis for identifying:

    1. The nature and causes of certain production bottlenecks.

    2. New or additional training requirements.

    3. Procedural, programming, or systemic deficiencies.

    4. Error trends.

  3. The QAP reports provide the results of Individual Review only.

  4. The provisions in this Section relating to "Measurable" work, individual review, and the Production and Control Accounting (PCA)/Total Evaluation Performance System (TEPS) are applicable to all campuses.

  5. PCA continues to operate as a parallel system to TEPS. Corrections to all data in the Organization, Function, and Program Codes (OFP) Consistency File and the PCA Employee Master File are made through present PCA procedures (see IRM 3.30.50, Performance Evaluation Reporting System).

  6. Work performed by personnel in the Examination and Customer Service Operations is not measured on an individual basis and is not included in PCA/TEPS. See 26 CFR part 801 and the provisions of the Restructuring and Reform Act of 1998 (RRA 98), section 1204.

  7. Reference to Quality Assurance Operations (QAO) throughout this manual also refers to quality review personnel in decentralized campuses.

  8. All administrative information has been removed from this IRM. It can be found in IRM 3.0.273, Administrative Reference Guide.

3.30.28.1.2  (03-01-2007)
Program Objectives

  1. The quality program’s primary objective is to prevent problems and eliminate the potential for errors in our processing system. QAO fulfills its mission to further the efficient and effective performance of all campus operations by achieving the following quality objectives:

    1. Manage an on-going, balanced QAP to provide information to managers on the quality of their operations.

    2. Analyze operations to identify systemic, procedural, or personnel related problems, and recommend action as appropriate.

    3. Engage in specific problem analyses at the request of management or upon identification of persistent problems.

    4. Conduct individual review (on some campuses) to improve the skill level and productivity of campus employees.

3.30.28.1.3  (03-01-2007)
Quality Review (QR) Processes and Responsibilities

  1. There are many types of review and programs to provide management with data about the quality of work. The Quality Program consists of the following quality processes:

    • The Quality Improvement Process (QIP).

    • Quality Analysis.

    • The Program Analysis System (PAS).

    • Process Review.

    • Management Review.

    • Individual Review.

    • Unit or Group Review.

    • Systems Process Analysis.

    • CAPR (Computer Assisted Pipeline Review).

    • OFP/Product Review—TEPS High Quality Work (HQW).

  2. The individual review process is designed primarily to randomly sample an individual’s work. The results provide feedback for performance evaluations, ratings, and rankings for promotions, awards, release/recall, and incentive pay.

  3. The following systems record the individual review results:

    • The Total Evaluation Performance System (TEPS).

    • The Performance Evaluation Reporting System (PERS).

    • The Campus Quality Assurance Program (QAP).

3.30.28.1.3.1  (03-01-2007)
SC Duties

  1. The Quality Assurance Operations (or designated area) is responsible for the overall implementation of the Campus Quality Assurance Program. The QAO is responsible for:

    1. Editing Form 3927, Quality Assurance Defect List, to establish, update, or delete the codes on the Quality Assurance Defect Code File.

    2. Delivering the forms timely to the Batching Unit for ISRP (Integrated Submission and Remittance Processing) input.

    3. Updating and adding approved defect codes to the Automated (Auto) 3926 System.

    4. Coordinating the numbering and batching of Forms 3926, Quality Assurance Review Records, with the designated operation for ISRP input of changes which were not keyed into the Auto 3926 System.

    5. Coordinating the update and maintenance of the OFP Consistency File and Employee Master File via PCA/TEPS.

    6. Ensuring that the OFP codes used for Forms 3081 (Employee Time Report) reporting are identical to those on the Quality Assurance Cumulative Data File by description and number.

    7. Receiving and preparing adjustments for input of the quality data into PCA/TEPS.

    8. Providing training necessary to review and analyze data collected through the QAP.

    9. Reviewing locally proposed defect codes.

  2. Sections within the Quality Assurance Operations are responsible for:

    1. Coordinating and directing sampling reviews of data collected.

    2. Keying review data into the Auto 3926 System.

    3. Conducting technical review of locally developed defect codes for relevance of defect code language to error conditions identified in the performed reviews.

    4. Coordinating with unit supervisors/TEPS coordinators for input of adjustments to the quality portion of the Individual Performance Report (IPR) from PCA/TEPS.

    5. Preparing and submitting IPR quality adjustments to the Reports Unit for processing.

    6. Ensuring that all elements of the QAP are implemented as prescribed.

    7. Coordinating all of the above responsibilities with the Quality Assurance Operations for final approval.

  3. Data Conversion is responsible for:

    1. Inputting Forms 3926 timely each week.

    2. Inputting Forms 3927 for establishing or updating the Defect Code File as required.

  4. Computer Services (or Information Systems and Accounting) Division is responsible for producing the QAP reports (PCA and PCE runs) timely.

  5. The Reports Unit or designated area is responsible for updating and distributing the PCA/TEPS reports timely.

  6. Campus management determines the frequency and extent to which defects/defective documents identified during the review process are returned to the originating employees for correction.

3.30.28.1.3.2  (03-01-2007)
N.O. Duties

  1. Headquarters (MSPC–QA:M) is responsible for the following:

    1. Monitoring the effectiveness of the QAP.

    2. Maintaining the QAP procedures.

    3. Coordinating systemic and procedural changes.

3.30.28.1.3.3  (03-01-2007)
Internal Monitor Program

  1. The Internal Monitor Program, maintained by Internal Audit, is used to extract data on returns/documents in order to determine the magnitude of potential problems.

  2. As the need arises to use this program, coordinate with Internal Audit to arrange scheduling of the computer runs.

3.30.28.1.4  (03-01-2007)
Glossary

  1. Operation Quarterly Numerical Performance Standards for Employees Report—shows the Quality Program Clusters (QPC) for the program and functions for Individual Review and Process Review.

  2. Completed Work—work on which all required actions or decisions have been taken and which is ready to be released by the employee to another function or disposed of as prescribed.

  3. Confidence Level—indicates the degree of trust in the results of the sampling process. For example, if a confidence level of 90% is chosen, then if we took 100 more samples of the same size from the same work, we would expect the Confidence Intervals computed from about 90 of those 100 additional samples to contain the true accuracy rate.

  4. Cumulative Data File—a file which contains a numeric listing of OFP’s, sample sizes, number of defects, defect percent, quality standards, postings, adjustment dates, and documents processed.

  5. Defect—any error condition on a document/case/return.

  6. Defect Code—a three digit numeric code designated on the Defect List which pertains to a specific error condition.

  7. Defective Document/Case/Return—a document/case/return which contains one or more defects.

  8. Defect List—the list containing the defect codes established by the use of Form 3927 and/or issued in the Campus Quality Assurance Defect List. See Exhibit 3.30.28-17. Local defect codes can be added to the Headquarters defect list.

  9. Defect Code File—a computer file established by Headquarters for campus use. It contains all the defect codes and descriptions pertaining to specific error conditions.

  10. High Quality Standard (HQS)—the minimum acceptable accuracy rate of High Quality Work.

  11. High Quality Work (HQW)—an OFP/Grade which management has determined contributes toward a measured rating for Quality and which historically had an average accuracy rate equal to or greater than the HQS. Rather than performing Individual Review on these OFP/Grades, OFP/Product Review is performed to determine whether the work remains HQW. A minimum of 230 documents per HQW OFP is to be reviewed during the first eight (8) weeks of each quarter. Samples are to be pulled at least four (4) times per week. Employees working the HQW will systemically receive the minimum employee effectiveness score required to achieve the five (5) rating for that OFP/Grade.

  12. Individual Performance Report (IPR)—this weekly report provides the most recent information about each program worked by an employee for the previous week ending. It includes hours per OFP, volume, documents reviewed, percent accurate, and adjustments.

  13. Individual Performance Summary Report (IPSR)—this report is generated quarterly and provides cumulative data about each program worked by an employee. It includes hours, volume, documents reviewed, percent accurate, employee indexes (quality), effectiveness (efficiency) and overall rating for the prior twelve month period.

  14. Individual Review—a process designed to sample an individual’s work for the purpose of providing feedback to the employee and to provide the basis for evaluation of the employee’s performance.

  15. Learning Curve—hours provided to an employee to learn the required work of a specific OFP/Grade. Employees report volume for those hours but are not held accountable for the Numerical Performance Standards for the OFP/Grade.

  16. Local Defect Codes—these codes are established by each campus where additional information is desired. These codes are not to be used for individual evaluation.

  17. Management Review—review conducted by the manager on completed work performed by trained employees to determine effectiveness of training and readiness of the employee for individual review. For SCRIPS Managerial review, see the Supervisor Handbook for the Service Center Recognition/Image Processing System, Chapter 12.

  18. Measurable—for the employee’s work to be measurable in quality the employee must:

    1. Have 60 days on a measured performance plan.

    2. Work OFP’s with numerical performance standards.

    3. Achieve at least a 90% confidence level with a varying precision in the accuracy rate on the IPSR.

    4. Have completed the Learning Curve for the OFP/Grade.

  19. Numerical Quality Performance Standard—a rate of performance expressed as a percent accurate. Numerical Quality Performance Standards can be established for all measurable Organization Function Program (OFP)/grade combinations.

  20. Numerical Performance Standard—a value against which employee performance is measured. It is expressed as the number of documents per hour (quantity) or as a percent accurate (quality). This value is established by management.

  21. OFP Consistency File—a file that contains valid OFP information which is used to validate OFP codes used in PCA/TEPS, the QAP, and Work Planning and Control (WP&C) related projects.

  22. OFP/Product Review—a small random sample of work to determine whether the percent accurate has fallen below the High Quality Standard (HQS). Review is on OFP rather than OFP/Grade. The criteria for this review are located in IRM 3.43.401, Total Evaluation Performance System (TEPS). See IRM 3.30.165, Incentive Pay System for procedures to process HQW for Incentive Pay programs.

  23. Organization Function Program (OFP)—the Organization, a five position number, represents (from left to right) the Division, Operation, Section, Unit or Group. The Function, a three position number, represents a work action, group of actions, or specific action. The Program, a five position number, represents a specific program or operation.

  24. Percent Accurate—a figure representing the accuracy rate achieved by the employee per OFP/grade combination.

  25. Production and Control Accounting (PCA)—the system which reflects quantity and quality data as recorded on Forms 3081 and Forms 3926. This data is fed into TEPS. (The employee evaluation aspect of PCA has been replaced by TEPS.)

  26. Precision—the degree of accuracy of sampled results.

  27. Procedural/Systemic or Organizational Errors—these errors are caused by faulty instructions, training or other systemic deficiencies beyond the control of the employee. These errors are charged to the unit/group as applicable. The frequency and/or volume of errors assessed might be indicative of unit/group effectiveness in meeting the unit/group objectives.

  28. Process Review—a work in process review method designed to provide immediate feedback of error trends for any process. The objective is to locate and stop the causes of errors "up front," and measure the accuracy and "fitness for use" of products output by any function.

  29. Program Analysis System (PAS)—a highly structured method of analysis that identifies systemic deficiencies and provides a basis for evaluating program effectiveness. This analysis, conducted by personnel within the Quality Assurance Operations, identifies taxpayer and processing errors, assesses the reasons for the error occurrences and recommends solutions.

  30. Quality Analysis—an appraisal of an operation or process to determine and report on whether that operation/process is functioning properly and whether it is conforming to established procedures, or to determine the core cause of quality problems.

  31. Quality Improvement—the process for breaking through to superior, unprecedented levels of performance. By identifying problems, areas of chronic waste, and areas of improvement, and approaching them with a methodical problem-solving process, project teams can determine true causes and reach solutions which reduce the level of waste and error. The result is an improved system or process which ultimately saves resources for the Service.

  32. Quality Program Cluster (QPC)—OFP/Grades with similar quality accuracy rates.

  33. Random Numbers Table—a table of numbers created by generating the digits "Zero" through "Nine" one after another in such a way that the order of the digits cannot be predicted.

  34. Rateable—an employee is rateable when he or she:

    1. Spends at least 25% of his/her total time on measurable work.

    2. Spends at least 40% of his/her direct time on measurable work.

  35. Recommended Annual Sample (RAS)—the sample size for a 52-week rating period for the QPC. The recommended sample size for all QPCs is 260 documents per annual rating period.

  36. Sample Size—the number of units selected for individual review.

  37. Sampling Accomplishment/Request Report (SAR)—a weekly report that monitors the amount of review (sampling) being performed on the employee’s work. It shows the amount of work sampled by QPC on a periodic and cumulative basis. It also shows the RAS, numbers of reviews remaining in the employee’s rating period, errors, percent accurate, and the volume of items produced by an employee on a periodic and cumulative basis.

  38. System/Process Analysis—a structured approach to reviewing a system/process to determine how people, material, equipment, methods and environment impact output, if the system is delivering as it was intended, if the system is stable and predictable and to identify improvement opportunities and recommend improvements.

  39. Total Evaluation Performance System (TEPS)—provides for the evaluation of campus bargaining unit employees under the numerical performance standards concept as mandated by the National Agreement.

  40. Unit or Group Review—a review of a sample of a particular type of work completed within a unit or group.
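The Confidence Level entry (3) above can be illustrated with a short simulation. This sketch is illustrative only and is not IRM procedure: the true accuracy rate, the number of trials, and the normal-approximation interval are assumptions chosen for the example, with the 260-document sample size taken from the Recommended Annual Sample entry.

```python
import random

# Illustrative simulation of a 90% confidence level (not IRM procedure).
# Assumption: an employee's true accuracy rate is 94%. We repeatedly draw
# 260-document samples (the RAS) and check how often the interval computed
# from each sample contains the true rate.

TRUE_RATE = 0.94       # assumed true accuracy rate (illustrative)
SAMPLE_SIZE = 260      # RAS from the glossary
Z_90 = 1.645           # z-value for a 90% confidence level
TRIALS = 2000

random.seed(1)
covered = 0
for _ in range(TRIALS):
    correct = sum(random.random() < TRUE_RATE for _ in range(SAMPLE_SIZE))
    p_hat = correct / SAMPLE_SIZE
    # Normal-approximation margin of error for a proportion
    margin = Z_90 * (p_hat * (1 - p_hat) / SAMPLE_SIZE) ** 0.5
    if p_hat - margin <= TRUE_RATE <= p_hat + margin:
        covered += 1

coverage = covered / TRIALS
print(f"{coverage:.0%} of intervals contained the true rate")
```

With a 90% confidence level, roughly 90 of every 100 such intervals should contain the true rate, which is the behavior the glossary definition describes.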

3.30.28.2  (03-01-2007)
Individual Review Sampling

  1. This section provides procedures and guidelines for sampling work, recording the selected samples, and processing unacceptable samples.

3.30.28.2.1  (03-01-2007)
Sampling Guidelines

  1. IRM 25.8.1 provides a list of valid organizations, functions, and programs used with Integrated Management Planning Information System (IMPIS). The 5995A database contains detailed exhibits of valid OFP combinations.

  2. All work processed during each week must be equally available for selection. This does not mean that all units will be reviewed. It means that the reviewer must have an opportunity to randomly select any unit for review after processing is completed.

  3. All work must be made available for review before being released to the next function. (Exception: sampling and review of the Error Register can be accomplished after direct data input.) Personnel who supervise the work being processed are responsible for routing and making available all completed work to reviewers in sufficient time for the required sampling and review to be performed.

  4. Intermediate deadlines which provide adequate time for review must be established for each activity.

  5. Processing personnel should not be aware of when the sample is taken or which units are selected. The sample is selected by the quality reviewer/clerk (preferably clerks) after the items have been released by the employee.

  6. Sampled work will be counted as specified in IRM 25.8.1 and the 5995A database.

  7. To ensure each employee’s work is identified for sampling, the employee’s work must have the date the work was completed and one of the following:

    1. The employee’s name.

    2. The employee’s SSN (Social Security Number).

    3. The employee’s IDRS examiner number.

    4. Other identification number.

  8. To facilitate sample selection and eliminate errors, employees must batch similar OFP work together, identifying the OFP and volume of each batch.

  9. A minimum of four (4) sample pulls is to be done each week for every employee.

3.30.28.2.2  (03-01-2007)
Sample Selection

  1. The process of selecting a portion of completed work for review is called sampling. A sample is defined as a part of a larger group of work from which conclusions about the total work can be drawn.

  2. Check the Reviews Requested column on the SAR to determine how many reviews are needed for the QPC’s shown in the QPC column (see Figure 3.30.28–1).

    Figure 3.30.28-1

    Sampling Accomplishment/Request Report

  3. Locate the specified QPC’s on the Operations Quarterly Numerical Performance Standards for Employees Report (see Figure 3.30.28–2).

  4. Eliminate OFPs which are not included under the specified QPC from items to be sampled.

  5. Use the following sampling method to obtain the most valid sample:

    1. Select a portion of the sample every day rather than selecting the complete sample on any one day of the week. A minimum of four sample pulls per week is to be performed on each employee.

    2. Select these daily portions of the sample at different hours of the day in order to keep from establishing a regular pattern of selection.

    3. Some OFP’s will not have work spread out over the entire work week (e.g., cycled work). These OFP’s will have to be sampled when the work is available, even if the entire sample must be taken at one time.

    4. Select the sample randomly. See the Table of Random Numbers in IRM 3.30.28.2.2.2.

    Note:

    It is imperative that the sample is pulled randomly. If only one program is reviewed within the QPC, the accuracy rate for the one program can be applied to all programs in the QPC. Do not pull specific OFPs in an attempt to sample all programs worked by an employee.

    Figure 3.30.28-2

    Operation Quarterly Numerical Performance Standards for Employees Report

3.30.28.2.2.1  (03-01-2007)
Recommended Annual Sample (RAS)

  1. The goal of the sampling system is to achieve a confidence level of ninety (90) percent in an employee’s overall effectiveness score, with varying precision.

  2. The RAS is the recommended number of reviews to be completed for each QPC that the employee works during the employee’s annual rating period. All QPCs have a RAS of 260 documents. (The only exception to this is the HQW QPC).

  3. It is the manager’s responsibility to establish weekly sample sizes, and to provide this information in advance to the Quality Review function. The SAR or local form (which contains the required elements) is used for this purpose.

  4. Changes to sample sizes resulting from gross errors in estimates of weeks worked and/or volume can be requested during the quarter. However, consistent treatment of employees must be ensured and changes must be approved by the employee’s Section Chief.

  5. When work is not consistently sampled, the following will result:

    1. Under or over sampling.

    2. Large weekly samples.

    3. Unmeasured evaluations may increase. An employee could be measured one quarter and unmeasured the next quarter in quality.

    4. The sample may not adequately represent the employee’s actual performance.

  6. The employee's manager should ensure the sample size is consistent with the information provided on the Sampling Accomplishment Report. When an employee has a low percent accurate, it does not indicate a higher sample is needed. When this occurs, the employee's manager should consider other methods (e.g., 100% review, OJT coach, additional training, etc.) to improve the employee's accuracy rate. Employees should not be removed from individual review after being released on a program for individual review. If an employee is placed on 100% review, managers should forward the employee's work to the Quality Assurance Manager for the individual review sample to be pulled. The Quality Assurance Manager should return the folder(s) to the employee's manager for 100% review of the remaining work.

  7. The employee’s manager will provide a list of employees with corresponding SSN’s to quality review to ensure compatibility between the SSN’s entered on Form(s) 3081 (or local form) and Form(s) 3926. The employee’s manager, at a minimum, furnishes Quality Review with:

    1. The employee’s name or SSN (or both) and Grade.

    2. The function/program for QPCs to be reviewed from the Operations Quarterly Numerical Performance Standards for Employees Report.

    3. The recommended sample size and QPC from the SAR/local form.

  8. The functional area managers are required to keep QAO informed of any changes to employee review availability.

  9. The quality review function should contact the employee’s manager for a sample size on any QPC’s worked by an employee if a sample size has not been requested.

  10. If resources are not available to review a sample size, the manager of the quality review function and the manager of the employee need to reach an agreement on the volume to be sampled.

  11. Do not increase the sample size to make up for under sampling in prior weeks.
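The weekly sample sizes in this subsection are set by the employee’s manager rather than by a prescribed formula. As an illustrative sketch only (the even-apportionment rule below is an assumption, not IRM procedure), a manager might spread the 260-document RAS over the weeks the employee is expected to work the QPC:

```python
import math

def weekly_sample_size(ras: int, weeks_expected: int) -> int:
    """Evenly apportion the Recommended Annual Sample over the weeks
    the employee is expected to work the QPC.

    Illustrative assumption only: the IRM leaves weekly sample sizes
    to the manager's judgment.
    """
    if weeks_expected <= 0:
        raise ValueError("weeks_expected must be positive")
    # Round up so the full RAS is reached by the end of the period
    return math.ceil(ras / weeks_expected)

# Example: the 260-document RAS over a full 52-week rating period
print(weekly_sample_size(260, 52))  # 5 documents per week
```

A gross error in the weeks estimate changes the weekly size substantially, which is why the IRM permits sample-size changes during the quarter with Section Chief approval.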

3.30.28.2.2.2  (03-01-2007)
Table of Random Numbers

  1. A table of random numbers may be used as a guide in sampling completed work from numbered/unnumbered documents and printouts.

  2. The table of random numbers is created by generating the digits Zero through Nine, one after the other, in such a way that the order of the digits cannot be predicted.

  3. The two-digit table of random numbers is used in the following situations:

    • Work groups of 50 items or less. See Exhibit 3.30.28-1.

    • Work groups of 100 items or less. See Exhibit 3.30.28-2.

  4. The three-digit table of random numbers is used for work groups greater than 100 that do not exceed 1,000, for example, batches of 200. See Exhibit 3.30.28-3.

  5. Determine whether to use a two-digit or three-digit random number table. Other tables of random numbers can be obtained from the QAO or the Campus Statistician.

  6. Use the sample size recommended by the employee’s manager.

  7. When making selections, the start number and direction of selection should be changed at least weekly, to avoid setting a predictable pattern. Ensure daily sample selection consistency when using the random number table.

  8. The following are the guidelines for use of the two-digit Table of Random Numbers for work groups of 100 items or less (see Figure 3.30.28–3).

    Figure 3.30.28-3

    Table of Random Numbers-Work Groups of 100 Items or Less

                                           
    81 56 14 62 82 45 65 80 36 02 76 55 63 46 96 85 77 27 92 86
    39 06 63 60 51 02 07 16 75 12 90 41 16 44 19 15 32 63 55 87
    05 80 19 27 47 15 76 51 58 67 06 80 54 34 39 80 62 24 33 81
    76 30 26 72 33 69 92 51 95 23 26 85 76 74 97 80 30 65 07 71
    84 90 20 20 50 87 74 93 51 62 10 23 30 22 14 61 60 86 38 33
                                           
    95 41 20 89 48 98 27 38 81 33 83 82 94 40 03 96 40 03 47 24
    46 65 69 91 50 73 75 92 90 56 82 93 24 52 33 76 44 56 15 47
    78 21 65 65 88 45 82 44 78 93 22 78 09 37 59 20 40 93 17 82
    35 04 88 79 83 53 19 13 91 59 81 81 87 11 02 55 57 48 84 74
    83 30 46 15 90 26 51 73 66 34 99 40 60 10 33 79 26 34 54 71
                                           
    27 56 19 80 76 32 53 95 07 53 09 61 98 67 59 28 25 47 89 11
    92 57 66 59 64 16 48 39 26 94 54 66 40 93 50 75 20 09 18 54
    59 71 55 99 24 88 31 41 00 73 13 80 62 24 43 23 72 80 64 34
    71 11 43 00 15 10 12 35 09 11 00 89 05 39 91 63 18 38 27 10
    57 08 93 09 69 87 83 07 46 39 50 37 85 74 62 19 67 54 18 28
                                           
    51 18 07 41 02 39 79 14 40 68 10 01 61 91 03 35 60 81 16 61
    26 31 11 44 28 58 99 47 83 21 35 22 88 42 57 66 76 72 91 03
    48 68 08 90 89 63 87 00 06 18 63 21 91 06 36 63 06 15 03 72
    91 14 51 22 15 48 67 52 09 40 34 60 85 92 70 96 70 89 80 87
    55 81 36 11 88 68 32 43 08 14 78 05 34 91 08 88 53 52 13 04
                                           
    21 74 84 13 56 41 90 96 30 04 19 68 73 68 85 97 74 47 53 90
    90 84 24 91 75 36 14 83 86 22 70 86 89 59 54 13 09 13 80 42
    53 38 78 65 87 44 91 93 91 62 76 09 20 39 18 32 69 33 46 58
    40 57 56 54 42 35 40 93 55 82 08 78 87 67 43 31 09 12 60 19
    06 66 82 71 28 36 45 31 99 01 03 35 76 61 75 37 19 56 90 75
                                           
    46 23 65 71 69 20 89 12 16 56 61 70 41 78 10 91 11 00 63 19
    25 18 23 23 56 24 03 86 11 06 46 10 23 93 23 71 58 09 78 08
    77 89 28 17 77 15 52 44 15 30 35 12 75 37 55 48 82 63 89 92
    27 66 19 53 52 49 98 45 12 12 06 00 32 62 13 11 71 17 23 29
    20 84 30 02 03 62 68 58 38 04 06 89 94 29 89 97 47 03 13 20
                                           
    32 84 82 64 97 13 69 86 20 09 80 46 75 16 94 85 82 89 07 17
    90 50 38 93 84 32 28 96 03 65 70 90 12 04 93 10 59 75 12 98
    26 94 51 40 51 53 36 39 77 69 06 25 07 95 71 43 68 97 18 85
    48 94 60 65 06 63 71 06 19 35 05 32 56 86 05 39 14 35 48 68
    20 28 22 62 97 59 62 13 41 72 70 71 07 59 30 60 10 41 31 00
                                           
    16 65 12 81 56 43 54 14 63 37 74 97 59 05 45 35 40 54 03 98
    91 78 04 97 98 80 20 04 38 93 13 92 30 71 85 17 74 66 27 85
    92 57 22 68 98 79 16 23 53 56 56 07 47 80 20 32 80 98 00 40
    55 36 95 57 25 25 77 05 38 05 62 57 77 13 50 78 02 73 39 66
    04 33 49 38 47 57 61 87 15 39 43 87 00 67 92 65 41 45 36 77
                                           


    1. Determine the direction you will move on the random number table: Left to Right, Right to Left, Top to Bottom, or Bottom to Top.

    2. Determine a random number starting point by placing the pencil anywhere on the table of random numbers.

    3. The number nearest the pencil point in the direction you decided upon is the first item in the sample. Continue selecting the next numbers in the column until the sample size is selected.

  9. This is an example of the above procedure. There are 100 documents in the batch, and the daily sample size is six. You have decided to select the sample moving from top to bottom on the chart. You have placed your pencil on "27," in column 7 row 6 of the table.

    1. Write down the next six numbers that fall between 00–99. In this case, moving from top to bottom, the numbers would be 75, 82, 19, 51, 53, and 48.

    2. Rearrange the numbers in numerical order (19, 48, 51, 53, 75, 82) to simplify sample selection. Ignore duplicate numbers.

    3. Mark the stopping point when the sixth number is reached. The number sequence should be used for that day only. Resume selection at this point for the next day’s sample.

    4. Pull the items for the sample using the numbers selected from the table. Follow this procedure each day when selecting samples for review.

  10. Change the direction of selection at least weekly to avoid setting a predictable pattern.
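The selection steps in (8) and (9) above can be sketched in code. This is an illustrative sketch only: a pseudorandom generator stands in for the printed table, and item positions are treated as the two-digit values 0 through 99.

```python
import random

def select_sample(batch_size: int, sample_size: int,
                  rng: random.Random) -> list[int]:
    """Select item positions for review, mirroring the two-digit
    random number table procedure: draw numbers, keep those that
    fall within the batch, ignore duplicates, then arrange the
    result in numerical order to simplify pulling the items.

    A pseudorandom generator stands in for the printed table
    (an illustrative substitution, not IRM procedure).
    """
    if sample_size > batch_size:
        raise ValueError("sample size cannot exceed batch size")
    selected: set[int] = set()
    while len(selected) < sample_size:
        n = rng.randint(0, 99)    # a two-digit number, 00-99
        if n < batch_size:        # keep only numbers within the batch
            selected.add(n)       # duplicates are ignored
    return sorted(selected)

# Example: a batch of 100 documents, daily sample size of six
rng = random.Random(42)
print(select_sample(100, 6, rng))  # six positions in numerical order
```

As in the worked example (19, 48, 51, 53, 75, 82), the output is a deduplicated, sorted list of positions; in practice the reviewer would also note the stopping point and resume there the next day.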

3.30.28.2.2.3  (03-01-2007)
IDRS Sampling

  1. The Integrated Data Retrieval System (IDRS) Quality Assurance Review File (QARF) will be sampled periodically in all functional areas for which IDRS transactions are input. This review must be done weekly. The review can be done on-line using Command Code (CC) "RVIEW" to retrieve the necessary information for review.

  2. Another source that can be used in the random selection process is the Quality Review Index Listing (QRIL) (see Figure 3.30.28–4). The listing shows all IDRS transactions available for review. IDRS CC "QRIND" can be used to obtain a screen display of the QRIL.

  3. The QRIL should be reviewed periodically by the functional manager to ensure there are supporting documents for the entries on the listing.

    Figure 3.30.28-4

    QUALITY REVIEW INDEX FOR 08-14-93 GROUP 57 PAGE 003
                       
    INPUT     EMPLOYEE              MFT  PLN   TAX   AP/   REVIEWER    NAME  TRN/LTR
    DATE      NUMBER      TIN       CDE  NBR   PER   PN    NUMBER      CTRL  CDE/NUM
    08-13-93 10575012 32-6271777 10 000 9212 0001 999999999 COLL 290
        32-1081393 02   9301   999999999   320C
        32-1249440 02   9112   999999999   5418AU
        32-1613413 01 000 9109 0001 999999999 GALI 270
        32-1760845 01 000 9303 0001 999999999 DOUG 290
        32-1985657 02   9212   999999999   096C
        32-2216889 01 000 9209 0001 999999999 BROO 290
        32-0253765 01 000 9303 0001 999999999 REPU 290
        32-2262214 00 000 0000 0001 999999999 CLAS 013
    08-13-93 10575061 32-0389390 02 000 9112 0001 999999999 REUT 290
    08-13-93 10575125 32-0680779 03 000 9212 0001 999999999 MORR 290
        32-0682920 03 000 9303 0001 999999999 ALLI 290
        32-1020016 01 000 9212 0001 999999999 SOUT 971
        32-1020016 01 000 9212 0001 999999999 SOUT 612
        32-1020016 01 000 9212 0003 999999999 SOUT 291
        32-0755694 02 000 9212 0001 999999999 MANT 290

    Quality Review Index Listing (QRIL)

3.30.28.2.2.4  (03-01-2007)
Sampling Log

  1. As samples are selected for individual review, a sampling log must be maintained as a control for each employee’s review. The log is also maintained as an audit trail for research purposes.

  2. The sampling log must contain at least the following:

    1. The employee name and identification number.

    2. The QPC and function/program/grade.

    3. The recommended weekly sample.

    4. Daily volume for sample.

  3. The sampling statistics can be recorded using the Reviews This Week area of the Sampling Accomplishment/Request Report (SAR). See Figure 3.30.28–5. Other local forms, which contain the required elements, can be used for this purpose. IRM 3.30.28.3 has more information on the SAR.

  4. Retain sampling logs as prescribed in the records disposition IRM (IRM 1.15.29, Records Control Schedule for Tax Administration - Wage and Investment Records).
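For illustration only, the required sampling log elements listed above can be modeled as a simple record. The field names, the sample OFP/grade string, and the helper method below are illustrative assumptions, not prescribed by this IRM:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SamplingLogEntry:
    """Control record for one employee's review (illustrative field names)."""
    employee_name: str
    identification_number: str
    qpc: str                       # Quality Program Code
    function_program_grade: str    # e.g. "230/44110/05" (hypothetical OFP/grade)
    recommended_weekly_sample: int
    daily_volumes: List[int] = field(default_factory=list)

    def weekly_total(self) -> int:
        """Volume sampled so far this week."""
        return sum(self.daily_volumes)

entry = SamplingLogEntry("J. Doe", "10575012", "123", "230/44110/05", 10)
entry.daily_volumes.extend([2, 3, 2, 2, 1])     # one entry per review day
assert entry.weekly_total() == 10
```

A record of this shape also serves as the audit trail mentioned above, since each day's volume is retained per employee.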

    Figure 3.30.28-5


    Sampling Accomplishment/Request Report

3.30.28.3  (03-01-2007)
Sampling Accomplishment/Request Report (SAR)

  1. This section provides information about the frequency and distribution of the SAR and describes the data elements on the report.

3.30.28.3.1  (03-01-2007)
SAR Frequency and Distribution

  1. A Sampling Accomplishment/Request Report is generated weekly and is distributed to managers. It is shared with Quality managers by the Team managers.

3.30.28.3.2  (03-01-2007)
SAR Description

  1. The SAR (Figure 3.30.28–6) is organized by individual and shows, by OFP/grade, an employee’s weekly and cumulative data for volume, work sampled, or both. The report includes any adjustments made to an individual’s record.

  2. The SAR only contains data on employees who meet all of the following criteria:

    1. Employees were assigned to a Measured Performance Plan.

    2. Employees were in work status as of the period ending date of the report.

    3. Employees met learning curve hours on any OFPGs that have Quality Base Points.

    4. Employees reported volume.

  3. The SAR is sorted by:

    • Assigned organization

    • Employee status (permanent, then all other)

    • Alpha order

    Figure 3.30.28-6


    Sampling Accomplishment/Request Report

  4. The SAR is used by managers to determine whether the RAS for each employee is being met. Below is a description of the items on the SAR:

    1. Run date—the date the report is generated.

    2. Page—consecutive page numbers are generated as the report is printed.

    3. Period Ending—Saturday’s date unless the quarter ends on Monday, Tuesday, Wednesday, or Thursday. Two reports generate when this situation occurs.

    4. The first report contains data for the days in the week up to the quarter ending.

    5. The second report contains data for the remaining days of the week through the Saturday date.

    6. Org (Organization Code)—a five-digit numeric code that identifies the operation, section, unit/group. In Figure 3.30.28–6, 81 represents the operation, 1 the section and 20 the unit/group.

    7. SSN/Name—employee’s names in alphabetical order, and their social security numbers.

    8. Employee Class—indicates employee’s status.

      • Permanent
      • Seasonal
      • Intermittent career/career conditional
      • Intermittent non-career/career conditional
      • Term

    9. Grade/Step—two-digit numeric code that identifies the grade/step of the employee who performed the work. If an employee has more than one grade during the reporting period, the report reflects data for each grade/step.

    10. QPCs/Volume—only the QPCs and volume for which the employee has reported hours during the rating period.

    11. Period Titles

      • Last Week
      • This Quarter
      • Rating Period—annual rating period for the employee

    12. RSD/RAS—Recommended Sample to Date/Recommended Annual Sample size.

    13. Total Reviewed—Total of all reviews done for the QPC for an employee, broken down by week, quarter, and rating period.

    14. Remaining—RAS minus Total Reviewed in the rating period.

    15. Reviews Requested—area for managers to record the sample size for review.

    16. Errors Assigned—the number of defective documents for the week, quarter, and annual rating period rounded to one decimal place.

    17. Percent Accurate—a figure representing the quality effectiveness achieved by an employee. Total reviewed minus total errors, divided by the total reviewed, multiplied by 100.

    18. Reviews This Week—this area can be used to record the weekly sample data instead of maintaining a separate sample log.
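The Percent Accurate computation in item 17 can be sketched as follows. This is a minimal illustration of the stated formula, not part of PCA/TEPS; the function name is an assumption:

```python
def percent_accurate(total_reviewed: int, total_errors: float) -> float:
    """Item 17: (total reviewed - total errors) / total reviewed * 100."""
    if total_reviewed == 0:
        raise ValueError("no documents reviewed")
    return (total_reviewed - total_errors) / total_reviewed * 100

# 50 documents reviewed with 2 errors assigned -> 96.0 percent accurate
assert percent_accurate(50, 2) == 96.0
```

Note that errors assigned may carry one decimal place (item 16), which is why the formula accepts a fractional error count.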

3.30.28.4  (03-01-2007)
Individual Review Procedures

  1. The responsibilities of a quality reviewer and how to perform individual quality review are discussed in this section.

3.30.28.4.1  (03-01-2007)
How to do Individual Review

  1. Quality reviewers are responsible for the performance of individual review on an employee’s work. The quality reviewer completely reviews the designated sample of an employee’s work to determine how well the employee has followed established procedures and guidelines. Since this information is used as an evaluation tool, the review must be accurate, valid, and consistent. Reviewers must establish a climate of integrity, efficiency, and fairness.

  2. New quality reviewers, who are unfamiliar with how individual review is done, should do the following:

    1. Attend a training class for individual review.

    2. Read and review the training course book 2335-G-001, SCRIPS Quality Review.

    3. Sit with an experienced reviewer. Observe how the reviewer does the job, reviews employees’ work, identifies problems or defects, and prepares the necessary forms pertaining to the review process.

    4. Ask questions you may have about the work.

  3. Obtain the necessary information to perform adequate review on employees. See Recommended Annual Sample (RAS) in IRM 3.30.28.2.2.1. If resources are not available to review the recommended (sample) volume, the quality review function should reach agreements with the employee’s manager on the volume to be sampled.

  4. Verify that the correct volume was pulled for review by checking the Reviews Requested column on the SAR (see Figure 3.30.28–1). Then, verify that the correct OFPs for the specified QPCs on the Operations Quarterly Numerical Performance Standards for Employees Report (see Figure 3.30.28–2) were pulled for review.

  5. Ensure that all work identified for quality review is stamped. Use a quality review stamp that has the campus operation identification symbols. The exceptions for stamping work are:

    • Refunds

    • Notices

    • Other items issued to the taxpayer

  6. Ensure that a Sample Log is prepared to record the samples pulled daily for each employee by function, program, and grade.

  7. Review the employee’s work using the appropriate IRM(s) and established procedures for the work processed, and IRM 3.30.28 for quality review guidelines. The following forms and reports are necessary to perform and record individual review.

    1. Obtain and use the Defect Lists to identify the employee’s errors and problems. There are two types of defect lists: national and local. See Exhibit 3.30.28-17 for the national defect list. Use a local defect code only when a national defect code is unavailable. See IRM 3.30.28.4.2 for additional information on defects.

    2. Prepare Form 5963, Quality Assurance Review Notice (or the optional local form) to provide a specific description of the error condition identified. Prepare Form 5963 (see Figure 3.30.28–7) as the review is performed and only when errors are found. Form 5963 and the work sample go to the employee’s manager and subsequently to the employee. See IRM 3.30.28.4.3 for instructions on further use of Form 5963.

      Figure 3.30.28-7


      Quality Assurance Review Notice

    3. Input review results to Auto 3926 (see IRM 3.30.28.4.7) or prepare Form 3926, Quality Assurance Review Record, when Auto 3926 is not available or the situation in 3.30.28.4.4.3(2) exists. Auto 3926/Form 3926 lists the number of records sampled, the kind of error (defect code) and the number of defects or defective records. Instructions for the preparation of Form 3926 are in IRM 3.30.28.4.4.1.

    4. Send Form 3926 (or optional local form) to the designated operations for numbering. Then forward it to ISRP for transcription for entry to PCA/TEPS (see Figure 3.30.28–8). Numbering and ISRP transcription are not necessary when Auto Form 3926 is used.

      Figure 3.30.28-8


      Quality Assurance Review Record (Paper)

    5. Prepare Form 3926 to make corrections to the Cumulative Data File. See IRM 3.30.28.4.4.4 for instructions.

    6. Prepare Form 6489, Individual Performance Adjustments, to adjust individual review data. Refer to IRM 3.30.28.4.5 for additional information.

    7. Prepare Form 3927, Quality Assurance Defect Code List, to make adjustments to the Quality Assurance Defect Code file. The instructions for preparation of Form 3927 are in IRM 3.30.28.4.8. Adjustments are made for the following reasons: to delete data, to add (establish) data, or to replace data.

      Reminder:

      All changes to the Quality Assurance Defect Code List must be cleared through the Quality Assurance Operations or designated area.

    8. Resolve error conditions for Forms 3926 and 3927. IRM 3.30.28.4.9 contains instructions for processing Invalid Forms 3926/3927.

3.30.28.4.2  (03-01-2007)
Defect Classifications

  1. A defect is any portion of a document, case, or return in error. The defects used to perform quality review are classified specifically to identify errors made by employees. Defects are included on the Quality Assurance Defect Code List, or local forms. See Exhibit 3.30.28-17. Each defect is assigned a code followed by a short narrative explanation.

  2. The other categories are listed below:

    • Multiple Identical Defects

    • Identical Defects

    • Procedural/Systemic or Organizational Defects

    • Local Defects

      Note:

      Each campus may establish local defect codes only if there is no duplication of the Headquarters defect codes.

3.30.28.4.2.1  (03-01-2007)
Defect Codes

  1. Management uses the defect codes for tracking purposes and for identifying error trends within a program and in an employee’s work. All defects are input to PCA/TEPS when reviewing the work of an individual employee within "measured" work areas. Defects have the potential of resulting in the following:

    1. Unacceptable input to computer process or to the Master File computer programs.

    2. Incorrect Master File entries.

    3. Accountability imbalances.

    4. Unnecessary production delays, increased costs or increased workloads.

    5. An adverse impact on the taxpayer.

3.30.28.4.2.2  (03-01-2007)
Multiple Identical Defects

  1. If the quality reviewer detects the same defect on every document/case/return in the sample, it is defined as a multiple identical defect. Likewise, when the action taken on one document/case/return causes the balance of the sample to be incorrect, this is defined as a multiple identical defect. An example of this type of defect would be an error in the numbering of returns.

  2. When the quality reviewer establishes that the same error is committed on all of the documents in the same sample or every error in the sample is the same defect, the error on only one of the defective documents is charged to the employee.

  3. The remaining like errors are charged to the unit/group using the critical defect codes found on the document that is not considered a multiple identical defect.

  4. All defects must be charged to the employee if at least one document in the sample is worked correctly in relation to that defect. Each additional defect found on the document that is not considered a multiple identical defect is charged to the employee.

  5. Multiple identical defects should be discussed with the employee’s supervisor/manager. If the errors indicate a quality problem exists, the manager should determine if the errors were caused by faulty instructions, training, or other systemic deficiencies over which the employee has no control.
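The charging rule described in (2) through (4) above can be sketched as follows. This is an illustrative sketch; the function and variable names are assumptions, not from this IRM:

```python
def charge_defects(sample_size: int, defect_counts: dict):
    """Split defect charges between employee and unit/group.

    Illustrative rule: when one defect code appears on every document
    in the sample (a multiple identical defect), charge a single
    occurrence to the employee and the remaining identical occurrences
    to the unit/group; otherwise, charge all occurrences to the
    employee, since at least one document was worked correctly.
    """
    employee, unit = {}, {}
    for code, count in defect_counts.items():
        if count == sample_size:      # same defect on every document
            employee[code] = 1
            unit[code] = count - 1
        else:
            employee[code] = count
    return employee, unit

# Defect 234 found on all 5 documents; defect 509 found on only 2 of them.
emp, unit = charge_defects(5, {"234": 5, "509": 2})
assert emp == {"234": 1, "509": 2}
assert unit == {"234": 4}
```

Whether errors charged to the unit indicate faulty instructions or training remains a management determination, as paragraph (5) states.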

3.30.28.4.2.3  (03-01-2007)
Identical Defects

  1. If the quality reviewer detects two or more errors of the same type in the same sample for individual review, and one or more cases of the same type in the sample is worked correctly (illustrating that the employee is aware of the proper procedure), the errors are defined as identical defects.

  2. After the defects are identified as identical defects, the quality reviewer should:

    1. Report the defects reviewed to the employee’s manager.

    2. Record the original sample to the employee’s SSN on Form 3926.

    3. Charge the defects/defective documents to the employee (except multiple identical defects).

  3. High error samples should be discussed with the employee’s supervisor/manager.

3.30.28.4.2.4  (03-01-2007)
Procedural, Systemic or Organizational Defects

  1. Procedural, Systemic or Organizational Errors are errors caused by faulty instructions, training or other systemic deficiencies over which the employee has no control.

  2. When it is determined by the appropriate management official and the QAO that an error is to be recorded to the unit/group rather than an individual, the appropriate Headquarters 900 series defect code is recorded on Form 3926.

  3. Campus managers and the QAO should monitor the QA Unit/Group Report and the Periodic Defect Frequency OFP Summary for excessive use of these codes, and take action to correct the problem.

  4. Headquarters assigned the following defect codes to describe procedural, systemic or organizational errors.

    • 900—No training or inadequate training received

    • 901—No procedures available (N.O.)

    • 902—Improper or no desk instructions

    • 903—Verbal instructions given are inadequate or erroneous

    • 904—Procedures not given or given in error

    • 905—Insufficient time to complete case

    • 906—Case over-age (organizational responsibility)

    • 907—Disagreed errors removed at the request of the supervisor

    • 908—Errors removed from the individual and charged to the unit/group at the request of the supervisor

    • 909—Adjustment not input (Source Document shows correct amt; systemic)

      Note:

      Maintain supporting documents for Codes 907 and 908, if possible.

3.30.28.4.2.5  (03-01-2007)
Local Defects

  1. Local Defect codes are established when an error condition is present and the Headquarters established codes do not apply.

  2. The codes are established using Form 3927 (see IRM 3.30.28.4.8).

3.30.28.4.3  (03-01-2007)
Prepare F5963, Quality Assurance Review Notice

  1. Form 5963 is used by quality reviewers to document error conditions located during sampling of individual employees work. This form should not be used if no errors are found. The employee’s manager should determine how to give feedback to employees when no errors are found during review.

  2. When the defect description is too general (see Exhibit 3.30.28-17), document the defect code and a specific description of the error.

  3. The use of this form is optional where computer registers are used (e.g., ISRP or IDRS).

  4. An example of Form 5963 is shown in Figure 3.30.28–9. Its elements are as follows:

    Figure 3.30.28-9


    Quality Assurance Review Notice (Elements)

    1. Item 1 Enter the employee’s identification number.

    2. Item 2 Enter the eight-digit date of the last day of the week in MMDDYYYY format.

    3. Item 3 Enter the three-digit Function Code.

    4. Item 4 Enter the five-digit Program Code applicable to the work being reviewed.

    5. Item 5 Enter the volume applicable to the review performed.

    6. Item 6 Enter the number of defective documents applicable to the review performed.

    7. Item 7 Enter the quality reviewer’s identification number.

    8. Item 8 Enter the eight-digit date of the current day of the week in MMDDYYYY format.

    9. Item 9 Enter the three-digit defect code(s) applicable to the defect(s) found.

    10. Item 10 Briefly describe the defect(s) found during the review. Enter schedule and line number where applicable.

3.30.28.4.4  (03-01-2007)
Form 3926, Quality Assurance Review Record

  1. Form 3926 is used to establish/update the Cumulative Data File and to record the results of the individual reviews.

  2. It is the source document for the QAP reports. It is also the source document for individual employee quality data included in those portions of the Individual Performance Report (IPR) generated for specified "measured" work areas.

3.30.28.4.4.1  (03-01-2007)
Form 3926 Procedures

  1. The reviewer or clerk prepares Form 3926 for all samples reviewed.

  2. Record multiple samples for several employees on the same Form 3926, if the OFPs are the same.

  3. Establish local controls so that the work recorded on Form 3081 is reviewed and recorded on Form 3926 for the same week ending. To accomplish this, set a local cutoff date for quality review (Form 3926). This enables PCA/TEPS to generate accurate quality reports and evaluation data.

  4. Record samples reviewed as "Individual" on Form 3926 with the employee’s SSN.

  5. Record samples reviewed as "Individual" and charged as a unit/group error (at the request of the manager) on Form 3926 without an SSN.

    Caution:

    PCA/TEPS only accepts review data when the employee’s SSN is present. Therefore, if an employee’s SSN is not put on Form 3926 when recording "Individual" review, the data will only be input to the QAP.

  6. Place a pound sign (#) to the left of the SSN field, as an optional approach, for those transactions that do not have an SSN entered. This lets Numbering know that the SSN was deliberately omitted.

  7. Record all defects on Form 3926 for statistical purposes, since a document can have more than one defect.

  8. Enter a maximum of 10 separate defect codes for each sample recorded on Form 3926. Each sample is known as a "transaction." (As a courtesy, skip a line between each transaction on Form 3926.)

  9. Record the sample as two separate entries or transactions when more than 10 defect codes are entered for a sample.

  10. Enter a maximum of 200 defect codes on Form 3926 per OFP combination.
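The 10-defect-code transaction limit in items 8 and 9 can be illustrated with a short sketch. The function and record layout below are assumptions for illustration only; the authoritative layout is Form 3926 itself:

```python
def split_into_transactions(ssn: str, defect_codes: list, max_codes: int = 10):
    """Record a sample as one or more Form 3926 transactions,
    entering at most `max_codes` defect codes per transaction
    (illustrative sketch of the limit in items 8 and 9)."""
    return [
        {"ssn": ssn, "defects": defect_codes[i:i + max_codes]}
        for i in range(0, len(defect_codes), max_codes)
    ]

# 12 defect codes must be recorded as two separate transactions.
txns = split_into_transactions("999999999", ["%03d" % c for c in range(1, 13)])
assert len(txns) == 2
assert len(txns[0]["defects"]) == 10 and len(txns[1]["defects"]) == 2
```

The 200-code-per-OFP ceiling in item 10 would be a separate check applied across all transactions for one OFP combination.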

3.30.28.4.4.2  (03-01-2007)
Description of Form 3926

  1. The items entered on Form 3926 (See Figure 3.30.28–10) are explained in detail below:

    Figure 3.30.28-10


    Quality Assurance Review Record (Elements)

    1. Sheet ___ of ___—located in the upper right corner of Form 3926; enter the page number(s) as appropriate.

    2. Item 1 Document—optional. Form number or document reviewed.

    3. Item 2 Operation—reviewed functional organization and program title.

    4. Item 3 Reviewer—first initial and last name of reviewer.

    5. Item 4 Date—the eight-digit week ending date in MMDDYYYY format.

      Exception:

      Use the quarter ending date, if reports are prepared for the end of a quarter.

    6. Item 5 Defect List No.—the three-digit Defect List Number which will always be ‘000’.

    7. Item 6 Organization—the five-digit Organization Code of the unit for which the review was performed. This is the organization code used by the employees on Forms 3081.

    8. Item 7 Function—the three-digit Function Code applicable to the work being reviewed. This is the function code used by employees on their Form 3081.

    9. Item 8 Program—the five-digit Program Code applicable to the work being reviewed. This is the program number used by employees on their Forms 3081.

    10. Item 9 Serial Number—preparer will leave this area blank.

      Exception:

      For transactions that do not have an SSN entered, place a pound sign (#) to the left of the SSN field to alert Numbering. (Check local procedures.)

    11. Item 10 Employee’s Social Security Number—if individual review, enter the employee’s nine-digit SSN. If unit/group error, leave blank and place a (#) pound sign to the left of column 10 (in column 9). (Check local procedures.)

    12. Item 11 Number of Records Sampled—enter volumes of records sampled for individual review.

    13. Item 12 Defect Code—the applicable Defect Code from Form 3927, Quality Assurance Defect List, is entered when errors are found in the sample. This item is left blank when there are no errors in the sample. If two transactions are needed for individual review, the SSN needs to be reentered in field 11 along with counts for Sample Size, and Defective Documents for the second transaction (see Figure 3.30.28–11).

    14. Item 13 Defect Tally—this column is for the convenience of the reviewer to tally the defect count for each defect code.

    15. Item 14 Number of Defects—the cumulative total of each defect found in the sample. Leave this column blank when there are no defects. Do not enter a zero.

      Figure 3.30.28-11


      Quality Assurance Review Record (Elements Cont.)

    16. Item 15 Defective Records Tally—this column is for the convenience of the reviewer to tally the number of defective documents in the sample and is optional. Count only one document as defective whenever one or more defects are found in that document.

    17. Item 16 Number of Defective Records—the cumulative total number of defective documents found in the sample. Leave this column blank when there are no defective documents. Do not enter a zero.

    Note:

    When there is a defect code in column 12, there must be entries, in columns 14 and 16, on the same line as the pound (#) sign, or the employee’s SSN.

3.30.28.4.4.3  (03-01-2007)
Record QR Results

  1. The following is the procedure for recording sample size, defects, and defective documents on Form 3926.

    1. Assess the total error conditions.

      • Total number of defective documents due to systemic errors
      • Total number of defective documents due to employee and systemic errors
      • Total number of defective documents due to only an employee error

    2. Determine which defective documents are reported to the unit. Employees cannot be held accountable for these defective documents.

    3. Determine which defective documents are reported to the employee. Example:

       Total original employee sample size: 15
       Total defective documents in the sample: 4
       Total defective documents due to only a systemic error: 3
       Total defective documents due to employee or both employee and systemic errors: 1

  2. Input through TAPS (Totally Automated Personnel System), (see IRM 3.30.28.4.7), or record results on Form 3926 for input through ISRP (Integrated Submission and Remittance Processing System).

    Unit
      Total sample recorded for the unit: (blank)
      Total defective documents reported to the unit: 3

    Employee
      Total defective documents reported to the unit: 3
      Total sample size recorded for the employee: 12
      Total defective documents reported to the employee: 1

                             Sample Size   Defective Documents
    Recorded to Unit              3                 3
    Recorded to Employee         12                 1
    Total                        15                 4
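The unit/employee split in the example above can be expressed as a short sketch. This is illustrative only; the function name and record shape are assumptions, but the figures match the 15-document example:

```python
def split_review_results(sample_size, systemic_only, employee_related):
    """Split a reviewed sample into unit and employee records.

    Illustrative sketch of the example in this subsection: defective
    documents caused only by systemic errors are recorded to the unit;
    the remainder of the sample, and any defective documents involving
    an employee error, are recorded to the employee.
    """
    return {
        "unit": {"sample": systemic_only, "defective": systemic_only},
        "employee": {"sample": sample_size - systemic_only,
                     "defective": employee_related},
    }

# 15 documents sampled, 4 defective: 3 systemic-only, 1 employee-related
result = split_review_results(15, systemic_only=3, employee_related=1)
assert result["unit"] == {"sample": 3, "defective": 3}
assert result["employee"] == {"sample": 12, "defective": 1}
```

The totals recombine to the original sample (3 + 12 = 15 sampled; 3 + 1 = 4 defective), matching the table above.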

3.30.28.4.4.4  (03-01-2007)
Cumulative Data File

  1. Each week the Cumulative Data File is established, or updated, or both when Form 3926 is input by ISRP.

  2. The Cumulative Data file is a numeric listing of:

    • OFP’s.

    • Defect List Number (‘000’).

    • Defect Codes.

    • Sample Size.

    • Number of Defects.

3.30.28.4.4.5  (03-01-2007)
Correct the Cumulative Data File

  1. The term "Adjustment" means that a correction needs to be made to the Cumulative Data File. The correction is made via the input of Form 3926.

  2. Prepare Form 3926 with Adjustment noted in the upper left margin of the form in the locally designated color.

  3. Only one Adjustment Form 3926 can be input per week for each organization code, function code and program code combination. One or more of the following is corrected with one Form 3926.

    • Number of documents sampled

    • Number of defects

    • Number of defective documents in the sample

  4. When there are several records to be adjusted to a specific OFP, combine the sample size, number of defects per defect code, and the number of defective documents. Enter the cumulative totals on one Form 3926 in the applicable columns 11, 14, and 16. The fifth position of columns 11, 14 or 16 must contain one of the characters listed below:

    Increase Decrease
    A–1 J–1
    B–2 K–2
    C–3 L–3
    D–4 M–4
    E–5 N–5
    F–6 O–6
    G–7 P–7
    H–8 Q–8
    I–9 R–9
    {–0 }–0
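The sign-and-digit convention in the table can be sketched as follows. This is illustrative only; the authoritative mapping is the table above, and the function name is an assumption:

```python
# Illustrative encoding of the fifth-position sign character in
# columns 11, 14, and 16 of an Adjustment Form 3926.
# Increase: digits 1-9 -> A-I, 0 -> {.  Decrease: 1-9 -> J-R, 0 -> }.
INCREASE = dict(zip("1234567890", "ABCDEFGHI{"))
DECREASE = dict(zip("1234567890", "JKLMNOPQR}"))

def encode_adjustment(value: int) -> str:
    """Encode a signed adjustment so its last digit carries the sign."""
    table = DECREASE if value < 0 else INCREASE
    digits = str(abs(value))
    return digits[:-1] + table[digits[-1]]

assert encode_adjustment(43) == "4C"    # increase of 43
assert encode_adjustment(-12) == "1K"   # decrease of 12
```

Under this convention, the magnitude keeps its leading digits unchanged and only the final digit is replaced by the signed character.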

  5. Do not enter an SSN in column 10 of Form 3926 when an adjustment is input to the Cumulative Data File.

  6. The Adjustment Form 3926 (see Figure 3.30.28–12) shows how the Number of Records Sampled, Number of Defects and Number of Defective Records are input to adjust the Cumulative Data File.

    1. The total number of records sampled is increased by 43 (4C).

    2. Defect code 234 is increased by 3 (C).

    3. Defect code 509 is increased by 2 (B).

    4. Defect code 627 is increased by 10 (1{).

    5. Number of defective records is decreased by 12 (1K).

      Note:

      Changes to the Cumulative Data File are not related to changes made to the Individual review.

  7. Adjustments to individual review data are made with Form 6489, Individual Performance Report Adjustment. See IRM 3.30.28.4.5 for information on adjusting individual review data. Refer to IRM 3.43.401, Total Evaluation Performance System (TEPS) for Managers of Measured Employees, for information on how to make TEPS adjustments. Remember, if individual data needs to be adjusted, prepare these two forms:

    • Form 6489 to correct the IPR

    • Form 3926 to correct the Cumulative Data Files

    Figure 3.30.28-12


    Adjustment Form 3926

3.30.28.4.5  (03-01-2007)
IPR Adjustments (Form 6489)

  1. The IPR Adjustment form is used to adjust hours, volume, documents in error or documents reviewed on the PCA and TEPS cumulative files.

  2. Managers and quality assurance personnel determine what items are adjusted. A corresponding adjustment may have to be made to the Performance and Costs Reports as outlined in IRM 25.14, IMPIS Overview for Work Planning and Control, Production Control and Performance Reporting.

  3. Cumulative file adjustments are used to adjust specific data on performance file records. Adjustments may also be used to add or delete entire records.

  4. Weekly IPRs are monitored to ensure the accuracy of Forms 3081 and 3926 data. If data posts incorrectly, the data needs to be adjusted (in whole or in part) by adding or subtracting the difference between what did and did not post.

  5. The Weekly Data Not Posted to TEPS Report contains employees’ quality review records that did not find a matching volume record. These OFPs need to be researched and verified; if adjustments are needed, they are put on a Form 6489 and processed accordingly. (This is a "TEPS only" adjustment; the records have already posted to PCA.)

  6. IPR Adjustments are printed in the Adjustments Section of the Weekly IPR whose period covered includes the calendar date on which the adjustments were input into the system.

  7. In some instances, the IPR Adjustments are printed in the Adjustments Section of the Weekly IPR one week later than the adjustments are included in the Quarter Recap Section of the Weekly IPR. Example:

     Input           Printed in Adjustments     Included in Qtr Recap
                     Section of Wkly IPR        Section of Wkly IPR
     Monday 6/2/97   Period of 6/7/97           Period of 5/31/97

3.30.28.4.5.1  (03-01-2007)
Distribution of Form 6489

  1. The IPR Adjustment form (see Figure 3.30.28–13) is submitted to the Reports Unit and distributed to the designated areas as follows:

    1. First copy (original) to Data Conversion area responsible for input.

    2. Second copy (TEPS copy) to Reports Unit or the designated area responsible for input.

  2. The Reports Unit, or the designated area responsible for input, returns to managers any Forms 6489 that contain records that could not be processed as submitted. Managers are responsible for:

    1. Verifying, correcting and annotating records.

    2. Preparing a new Form 6489 for all records not processed.

    3. Discussing discrepancies with the designated area.

    Figure 3.30.28-13


    IPR Adjustments, Form 6489

3.30.28.4.5.2  (03-01-2007)
Key Elements of the IPR Adjustments (Form 6489)

  1. Preparer’s Name—The name of the preparer of this form.

  2. Preparer’s Title—The title of the preparer.

  3. Phone Number and Mail Stop—The telephone number and Mail Stop of the preparer.

  4. Date—Date this form was prepared.

  5. Organization—The five-digit organization code being adjusted.

  6. Function—The three-digit function code being adjusted.

  7. Program—The five-digit program code being adjusted.

  8. SSN—The nine-digit social security number of the employee whose IPR is being adjusted.

  9. Last Name—The employee’s last name.

  10. Hours (If applicable)—Hours and tenths. Write a minus sign (−) above the last digit to indicate a decrease. Absence of the minus sign indicates an increase; plus signs are not needed.

  11. Volume (If applicable)—Write a minus sign (−) above the last digit to indicate a decrease. Absence of the minus sign indicates an increase; plus signs are not needed.

  12. Documents Reviewed (If applicable)—Write a minus sign (−) above the last digit to indicate a decrease. Absence of the minus sign indicates an increase; plus signs are not needed.

  13. Errors (If applicable)—Write a minus sign (−) above the last digit to indicate a decrease. Absence of the minus sign indicates an increase; plus signs are not needed.

  14. A/C PCA (Adjustment Code)—A one-digit code that determines which quarter is being adjusted. See IRM 3.30.50, Performance Evaluation Reporting System, for additional information.

    1 Reduce to zero all current quarter hours, volume, documents reviewed, and documents in error fields on current grade cum job record or prior grade cum job record
    2 Establish a new current grade cum job record or prior grade cum job record for the current quarter
    3 Substitute values for hours, volume, documents reviewed and documents in error on current grade cum job record or prior grade cum job record for current quarter
    4 Add or subtract values for hours, volume, documents reviewed and documents in error to the current grade cum job record or prior grade cum job record for the current quarter
    9 Reduce to zero all first prior quarter hours, volumes, documents reviewed and documents in error fields on the current grade cum job record or prior grade cum job record
    A Reduce to zero all first prior quarter hours, volumes, documents reviewed and documents in error fields on the current grade cum job record or prior grade cum job record
    B Create a new current grade cum job record or prior grade cum job record for first quarter
    C Substitute values for hours, volumes, documents reviewed and documents in error on current grade cum job record or prior grade cum job record for first prior quarter
    D Add or subtract values of hours, volumes, documents reviewed and documents in error to the current grade cum job record or prior grade cum job record for first prior quarter

  15. M/C PCA (Measured Code)—A one digit code for measured or unmeasured OFPs. "1" is measured. "2" is unmeasured.

  16. M/C TEPS (Measured Code)—A one digit code that determines if an OFPG is measured or unmeasured.

    M/C TEPS (Measured Code)
    1 Programs with volume and fixed standards, where the employee is on a Measured Performance Plan
    2 Programs with hours only
    3 Programs with hours, volume, and no numerical performance standards; or programs with hours and volume for an employee on an unmeasured performance plan
    4 Programs with numerical performance standards, where the employee has been on a measured performance plan for less than 91 days and is not under a learning curve
    5 Adjustments to learning curve information

  17. Grade—Two digit grade at the time the work was performed.

  18. R/C PCA (Record Code)—One digit code that indicates if the employee’s grade is a current or prior grade. 1—adjusts the employee’s current grade; 2—adjusts the employee’s prior grade. See IRM 3.30.50, Performance Evaluation Reporting System for further information.

  19. Quarter Ending (YYYYMM)/Week Ending (MMDDYYYY)—The ending month of the quarter being adjusted (e.g., 200203), or the week ending date for the week being adjusted (e.g., 03012002).

  20. Step—Employee’s step at the time the work was performed.

  21. Time Code—One digit time code used on Form 3081, Employee Time Report. See IRM 3.30.50, Performance Evaluation Reporting System and Form 3081, Employee Time Report, for a list and description of Time Codes.

  22. Class Code—One digit class code for the employee at the time the work was performed.

    1 Full Time Perm
    2 Part Time Perm, or Seasonal, or both
    3 Intermittent
    4 Temp
    5 Term

3.30.28.4.6  (03-01-2007)
Quality Standards

  1. Form 3926–A, Quality Standard Input Record (see Figure 3.30.28–14), must be prepared by the TEPS Coordinator or Quality Assurance Staff to establish or change the quality standard. In any attempt to insert, update, or delete records on the Quality Standard File, a value must be present in the Quality Standard field of Form 3926–A.

  2. The method for establishing and revising Quality Standards has changed under TEPS. IRM 3.43.401, Total Evaluation Performance System, provides instructions for the establishment of numerical performance standards.

  3. After eight weeks, the Base Points/Numerical Performance Standards may change. Determine locally who is responsible for preparing Form 3926–A when a change, deletion, or addition to the Quality Standard is needed. The Quality Standard can be the same standard as that determined for the "3" standard rating level; the "3" standard is a fully successful level of performance.

  4. The Quality Standards that appear on the 3926 Posting Transactions, the QAO Summary Report, the Quality Assurance Weekly Unit/Group Report, and the Executive Exception Report are input using Form 3926–A, Quality Standard Input Record.

    1. Enter blanks as the Defect List Number to insert a new Quality Standard record or update an existing one.

    2. Enter the first two digits of the Organization Code and three zeros.

    3. Enter the Defect List Number as 999 on Form 3926–A to delete a record from the Quality Standard File.

    4. In attempting a delete action, if the OFP is not on the Quality Standard File, then the transaction is in error.

    5. Fill in all four positions of the Quality Standard field (Item 9), using zeros if needed.

    6. Enter the standard as a two digit number followed by two decimal places. The reports always print it as NN.NN, in the same way as other percentages. Examples:

      Enter as Print as
      1000     10.00
      0150     1.50
      0010     00.10
      9900     99.00
      0900     9.00

    Note:

    If a Quality Standard of less than one is entered (e.g. 0010, 0025, etc.), the program will carry it to the file and reports as 1.00 and print an "*" next to the 1.00 on the report.
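The entry-to-print conversion above can be sketched as follows. This is an illustrative Python sketch, not part of the PCA/TEPS software; the function name is ours, and values under one follow the rule in the Note rather than the 00.10 row of the example table.

```python
def print_quality_standard(raw: str) -> str:
    """Convert a four-digit Quality Standard entry (two implied decimal
    places) to the NN.NN form printed on the reports."""
    value = int(raw) / 100            # e.g. "1000" -> 10.00, "0150" -> 1.50
    if value < 1.00:
        # Per the Note: entries under one are carried as 1.00 and
        # flagged with an asterisk on the report.
        return "1.00 *"
    return f"{value:.2f}"
```

For example, print_quality_standard("9900") yields "99.00", while an under-one entry such as "0025" yields "1.00 *".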

  5. Changes to the Quality Standards must be coordinated with the employee’s manager, TEPS Coordinator, and Quality Assurance Operations.

    Figure 3.30.28-14


    Quality Standard Input Record

3.30.28.4.7  (03-01-2007)
Automated 3926 (Auto 3926) System

  1. The system is a part of the Totally Automated Personnel System (TAPS) and is being phased into all campuses.

  2. Manual preparation of Form 3926 is virtually eliminated. Form 3926 should be prepared only when changes are required after the cut-off time and date, and to charge local errors.

  3. Routing for ISRP input is virtually eliminated with Auto 3926. Data obtained from Quality Assurance error notices (Forms 5963/local forms) are entered directly, through the computer system, into Auto 3926. Once a week the data will automatically be written to tape and transmitted to the Mainframe system where the data will be input to the PCA runs.

  4. Monday is the cut-off date for inputting 3926 data for the prior week. The Systems Administrator/Information Systems Personnel will determine the cut-off time for inputting 3926 data. After the users are locked out, no 3926 data can be input into the system until Tuesday morning.

  5. The Auto 3926 System validates certain fields and, in some instances, prevents users from entering incorrect data. The system generates two error reports for users to review and correct on-line before the cut-off for processing to tape. As a result, 3926 errors are reduced or eliminated from the Error Register.

  6. The users have several options available for processing and monitoring the 3926 data. A menu option followed by "E" (e.g. 1E) provides a full explanation of the item. Refer to the 3926 User’s Guide (Option 12) for input instructions.

  7. Option 1: 3926 Data Entry Screen—Allows users to add, update, query, and remove records, and to input defect code(s) and volume(s). The entry screen uses two tables (files):

    1. The Master Table (q__review table) contains one record for each authorization number (SSN) per batch. It includes information such as: SSN, Name, Date, Org Code, Function, Program, Defect List, Volume, Errors, Region, Batch #, Reviewer, and Rev Date (the last and first names, org code fields, and date will generate automatically).

      Note:

      The Defect List automatically defaults to "000". The error/defective document field must be increased or decreased manually because the computer will not compute the total volume.

    2. The Detail Table (def__vol table)—Contains the Defect Code(s) and Defect Volume(s). Several Detail records may be entered for each employee for each Master record. The Defect Volume for each Defect Code must be manually increased/decreased because the computer will not total the volume.

    3. Samples reviewed as "Individual" but charged as a group/unit error are input with: zeros (000–00–0000) in the SSN field, applicable function/program, and error code. Refer to IRM 3.30.28.4.7(7)(e) below to input the defect volume.

    4. On multiple identical defects, charge one of the errors to the employee’s SSN and the others using zeros in the SSN field with the actual function/program and error codes. Refer to IRM 3.30.28.4.7(7)(e) below to input the defect volume.

    5. To charge secondary errors, input all fields for the primary error(s) first. Then input the secondary defect code(s) and defect volume(s), but do not include these defect(s) in the volume for errors/defective documents in the master table. Check the Quality Assurance Review Record (3926) and QA Weekly Unit Group Report to ensure the secondary errors posted correctly.

    Note:

    The system will not allow more than 10 defect codes per batch number. If more than 10 different defect codes are charged to one employee, add another master record with an additional batch number.

  8. Option 2: 3926 Report by Operations—Used to view/print a report by operation after all reviews for the week have been input.

  9. Option 3: Defect Code Input Screen—Used to add, update, and/or remove defect codes (Headquarters Defect Code List) and their descriptions.

  10. Option 4: Defect Code Report—Generates a Defect Code Listing. The user has the option to view/print the report/listing.

  11. Option 5: Employee ID Input Screen—Allows the user to query, add, update, and remove employee ID records. The employee’s SSN, name, employee ID, and operation code are contained in the emp__id table.

  12. Option 6: Employee ID List by Operations—Used to view/print a listing of all employee IDs by operations. The listing contains the employee’s name, SSN, and ID number.

  13. Option 7: 3926/3081 Error Report by Operations—Generates a 3926/3081 Volume Error Report (see Exhibit 3.30.28-7) for the user to review and correct errors to prevent fallout to the Error Register. This option should be used once a week, after all reviews have been input and Forms 3081 have been validated (usually Friday or Monday). It might be necessary to check Forms 5963/local forms, review sheets, etc., to correct the error report. Also refer to Option 9 when correcting the error report.

  14. Option 8: Defects/Errors Mismatch Report—Generates a Defects/Errors Mismatch Report (see Exhibit 3.30.28–8) for the user to correct the errors on a daily basis, after QR input has been completed.

  15. Option 9: Review 3081 Record—Allows the user to view the 3081 data after checking Option 7 for mismatch. This option may aid in correcting the problem. If the 3081 data is incorrect, the manager of the employee reviewed must be contacted. The manager/designated employee can correct and revalidate the 3081 data. If it is a quality error, option 1 can be used to correct the information.

  16. Option 10: 3081 Non-Validated Org Code List—Allows the user to view the list of non-validated Forms 3081 before using Option 7. If the operations have not validated their organization’s 3081, all of the 3926 data input for that Org code will appear on the 3926/3081 Volume Error Report.

  17. Option 11: Prior Week’s 3926 Data Screen—Allows the user to query records that were input in the previous week. Research can be performed by SSN, Org Code, etc.

  18. Option 12: User’s Guide—Allows the user to view and print the User’s Guide. This option will display explanations of the various options on the Print Menu.

3.30.28.4.8  (03-01-2007)
Form 3927, Quality Assurance Defect Code List

  1. The Quality Assurance Defect Code List contains defect codes and their descriptions, which are established by the use of Form 3927 or issued in the Defect Code File. See Exhibit 3.30.28-17.

3.30.28.4.8.1  (03-01-2007)
Form 3927 Guidelines

  1. The Defect Code File is a standardized list of defect codes developed to identify those defect codes in existence. This list is maintained by Quality Assurance Operations.

  2. Form 3927, Quality Assurance Defect List, is used to establish (add), change, or delete defect codes and descriptions on the Quality Assurance Defect Code File for direct data input. It must be cleared through appropriate personnel, per local procedures.

  3. After Form 3927 is input, the 3927 Posting Transaction Listing is generated when the update is made to the Defect Code File or when there is a need to view the file. The listing shows each Form 3927 transaction that is determined to be valid and updated to the Defect Code File. The defect codes used on Form 3926 are compared to the data on the Quality Assurance Defect Code File for validity.

  4. The Headquarters analyst in the Memphis Submission Processing Center (MSPC) will coordinate proposed changes with the Headquarters Program Operations and incorporate them into the Defect List. See Exhibit 3.30.28-17. Approved changes will be transmitted to the campuses for incorporation into the Defect Code File.

  5. Campuses with a program assigned to only one campus will prepare the defect codes and submit them to Headquarters for publishing in the next revision of the IRM.

  6. If a program is conducted as a test only, the defect codes will be reported as process review. If, after the test, the program continues at one or more campuses, Headquarters in cooperation with the campus(es) will determine which OFPs involved will be subject to review.

  7. Additional defect codes may be developed locally for those programs not included in the Headquarters list.

3.30.28.4.8.2  (03-01-2007)
Assignment of Defect Codes—Forms 3927

  1. The published Defect Code List (see Exhibit 3.30.28-17) may be used in more than one operations area.

  2. The defect codes published on the list have a separate serial number identification and are assigned Headquarters Defect Code Numbers in the 000–799 series.

  3. Campuses have the option of using 800 series defect codes (800 through 899) to identify problem areas for the appropriate area office.

  4. Headquarters has reserved defect codes 900 through 909 for their use.

  5. Campuses have defect codes 910 through 929 for expanding the identification of other problem areas.

  6. Locally initiated defects are assigned Defect Codes in the 930–999 series and are classified as "Local." The local defects are not to be used for individual review in specified "measured" work areas. Whenever possible, campus initiated defect codes should be added to the Headquarters Defect Code List. This action must be cleared through the QAO.

    Defect Codes: Area Assigned to Use:
    900–909 Headquarters
    910–929 Campus
    930–999 Campus—Local Use
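The series assignments above can be expressed as a simple lookup. This is an illustrative sketch only; the function name and return strings are ours, not part of any IRS system.

```python
def defect_code_series(code: int) -> str:
    """Map a three-digit defect code to the area assigned to use it,
    per the series assignments above."""
    if 0 <= code <= 799:
        return "Headquarters Defect Code List"
    if 800 <= code <= 899:
        return "Campus (area office problem areas)"
    if 900 <= code <= 909:
        return "Headquarters (reserved)"
    if 910 <= code <= 929:
        return "Campus"
    if 930 <= code <= 999:
        return "Campus - Local Use"
    raise ValueError("defect codes are three digits, 000-999")
```

For example, defect_code_series(945) falls in the locally initiated 930–999 series.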

3.30.28.4.8.3  (03-01-2007)
Prepare Form 3927

  1. To prepare Form 3927, Quality Assurance Defect List, use the standardized defect code number if appropriate; or just use the defect code and description.

  2. Form 3927 can have up to 30 defect codes and descriptions on one page. Each page is input as a separate document. Specific instructions for completing items 1 through 9 on Form 3927 are explained below: (see Figure 3.30.28–15).

    1. Item 1 List Number—the three numeric characters—will always be "000".

    2. Item 2 Organization—title of the organization—it is important that this be shown for the purpose of routing Form 3927 to the appropriate area after it has been transcribed.

    3. Item 3 Organization Code—the first two digits of the organization.

    4. Item 4 Function Code—the first two digits of the function.

    5. Item 5 Action Code—the following action codes are used to update the QA Defect Code File:

      Action Code Description
      A Establish a defect code & description on the Defect Code File
      C Change/update a defect code description on the Defect Code File
      D Delete a defect code from the Defect Code File

      Figure 3.30.28-15


      Quality Assurance Defect List

    6. Item 6 Programs—the 1st, 2nd, 3rd, and sometimes 4th digits of the program to which the defect code applies.

    7. Item 7 Operation(s)—the name of the operation being reviewed, not to exceed 56 positions.

    8. Item 8 Defect Code—the three numeric characters assigned with a defect description.

    9. Item 9 Description—the alpha, numeric, blank, or special characters that describe the identified error. May not exceed 100 positions, including characters and spaces. When counting the number of positions in the defect description, include commas, hyphens, slash marks, and spaces. Exclude apostrophes, parentheses, quotation marks, and periods.
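The position-counting rule in Item 9 can be sketched as a short routine (illustrative only; the function name is ours):

```python
def description_positions(description: str) -> int:
    """Count positions in a defect description per Item 9: commas,
    hyphens, slash marks, and spaces count; apostrophes, parentheses,
    quotation marks, and periods do not."""
    excluded = set("'()\".")  # characters that are not counted
    return sum(1 for ch in description if ch not in excluded)
```

For example, description_positions("Name, S.S.N.") counts 9 positions (the three periods are excluded), and a description is valid when the count does not exceed 100.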

3.30.28.4.8.4  (03-01-2007)
Form 3927 Input Action Codes

  1. The following action codes are input on Form 3927 to add, change, or delete entries on the QA Defect Code File:

    1. A—to establish (add) a defect code(s) and description(s) on the Defect Code File. Ensure that Defect List Number "000", Action Code "A", Operation Name, Defect Code(s), and Defect Description(s) are present (see Figure 3.30.28–16).

      Figure 3.30.28-16


      Quality Assurance Defect List (Action Code A)

    2. C—to change a defect code(s) description on the Defect Code File. Cross out the original description and edit the defect description(s) (see Figure 3.30.28–17).

      Figure 3.30.28-17


      Quality Assurance Defect List (Action Code C)

    3. D—to delete a defect code(s) from the Defect Code File (see Figure 3.30.28–18).

    Figure 3.30.28-18


    Quality Assurance Defect List (Action Code D)

3.30.28.4.9  (03-01-2007)
Processing Invalid Forms 3926/3927

  1. When data from Form 3926 is processed, it is matched against the OFP Consistency File and Defect Code File. If a match does not occur, or if the Form 3926 fails other built-in validity checks, the Form 3926 data must be corrected.

  2. Forms 3927 are also validated against the Defect Code File. Again, failure to pass validity checks means the incorrect Forms 3927 data must be corrected.

  3. Any error conditions (i.e. failures to match or pass validity checks) result in the printing of the following listings, also referred to as "Error Registers":

    • Invalid Review Data Listing—Form 3926

    • Invalid Form 3927 Listing

  4. This section describes the Invalid Listings, and the codes that describe the error conditions. It also provides instructions for resolving the error conditions. The following are sources of additional information which can help in the resolution of errors:

    • The Defect Code File Listing

    • 3926 Posting Transaction Listing

    • 3927 Posting Transaction Listing

    • Other QAP Reports

3.30.28.4.9.1  (03-01-2007)
Invalid Review Data Listing—Form 3926

  1. A variety of conditions can cause information on Form 3926 to be invalid. The following is a list of error codes and descriptions.

    Error Code Description
    01 No longer used.
    02 No longer used.
    03 Organization-Code is not numeric.
    04 Function-Code is not numeric.
    05 Program-Code is not numeric.
    06 OFP is not on WPC–OFP-Consistency-File.
    07 Quality-Standard is not numeric or blank.
    08 Organization-Code must end with 3 zeros when Quality-Standard is present.
    09 Sampling data present with change to Quality-Standard.
    10 Quality-Standard has been duplicated this period.
    11 Sample-Size not numeric or blank.
    12 SSN not blank with a blank Sample-Size.
    13 Defective-Documents not numeric or blank.
    14 Defect-Code is not on Defect-Code-File.
    15 Defect-Code is not numeric or blank.
    16 Defect-Code is blank, but Num-Of-Defects present.
    17 Num-Of-Defects is blank, but Defect-Code is present.
    18 Num-Of-Defects not numeric or not greater than zero.
    19 Duplicate Defect-Codes detected on Form 3926 for a particular OFP list.
    20 Defective-Documents is greater than total of num-of-defects.
    21 Defective-Documents is greater than sample-size.
    22 Sample-Size not blank or first four-digits not numeric.
    23 Defective-Documents not blank or first four-digits not numeric.
    24 Attempt to delete Quality-Standard and Defect-List is not 999.
    25 Num-Of-Defects not blank or first four-digits not numeric.
    26 OFP not on the Cum-Data-File.
    27 The adjustment would cause Sample-Size to be negative.
    28 The adjustment would cause Defective-Document-Cum to be negative.
    29 The adjustment would cause Total-Defects-Cum to be negative.
    30 The adjustment would cause Defective-Documents-Cum to be greater than sample-size-cum.
    31 The adjustment would cause Defective-Documents-Cum to be greater than Total-defects-Cum.
    32 The adjusted Defect-Code is not on the Cum-Data-File for this OFP.
    33 The adjustment would cause Num-Of-Defects-Cum to be negative.
    34 SSN present on an Adjustment Record.
    35 Defective Documents blank, but Num-Of-Defects greater than zero.
    36 Insufficient data to process request.

3.30.28.4.9.2  (03-01-2007)
Resolve Errors on Invalid Review Data Listing—F3926

  1. The field headings appearing on the Invalid Review Data Listing—Form 3926 (Figure 3.30.28–19), and the valid characters for each field, are as follows:

    • Computer Run Number—PCE–03–41

    • Week Ending Date—MMDDYYYY of the week the quality review was performed

    • Page Number(s)—the page number(s) of the report

    • Defect List—the three-digit numeric characters—will be assigned 000

    • Organization Code—the five-digit numeric characters that represent the organization code of the unit for which the review was performed

    • Function Code—the three-digit numeric characters that represent the function code applicable to the work being reviewed

    • Program Code—the five-digit numeric characters that represent the program code applicable to the work being reviewed

    • Quality Standard—four numeric characters or all blank

    • Social Security Number (SSN)—the nine-digit numeric characters that represent the employee’s social security number

    • Sample Size—the volume of the particular sample being reviewed. The first four positions will be numeric or blank and the fifth position will be numeric, alpha or blank

    • Defect Code—the three-digit numeric characters that represent the defect code. This area will be blank if there are no defects

    • Number of Defects—the number of defects shown in the following format. The first four positions will be numeric or blank and the fifth position will be numeric, alpha, or blank

    • Defective Documents—the number of defective documents, shown in the following format: the first four positions will be numeric or blank, and the fifth position will be numeric, alpha, or blank

    • Control Number—three-digit numeric or one alpha followed by two numeric characters that are assigned to identify the transactions. The first position of the control number represents the third digit of the alpha control assigned to the block. The last two positions of the number represent the serial number of the Form 3926 that is in error

    • Error Codes—the two-digit numeric characters used to describe the error condition

    Figure 3.30.28-19


    Invalid Review Data Listing-Form 3926

  2. Error conditions are listed for invalid fields on the error register for Forms 3926 which fail validity checks.

  3. Use the Control Number printed on the list to locate a Form 3926 that is in error.

  4. Use the Control Number to search the blocks of Forms 3926 for a source document which matches the record to be perfected on Organization Code, Function Code, Program Code, and Defect Code if present.

  5. Examine the Invalid Review Data Listing—Form 3926 and take the appropriate action specified for each Error Condition.

  6. Determine the cause of the condition and prepare Form 3926. If the data entry is invalid, correct it and reinput to the Cumulative Data File in the next cycle.

  7. If the Form 3926 attempting to post did not match the OFP Consistency File on Organization Code, Function Code or Program Code, withdraw the Form 3926 that matches the Error Register.

    1. If the data entry on the Form 3926 is valid, prepare the form for reinput to the Cumulative Data File in the next cycle after the OFP has been established on the OFP Consistency File.

    2. If the data entry on the Form 3926 is invalid or illegible, determine the cause of the condition and prepare the form for reinput without an SSN in item 10.

  8. If the Form 3926 attempting to post did not match the Defect Code File because the number is non-numeric, withdraw the Form 3926 that matches the Error Register, correct the information and reinput to the Cumulative Data File.

  9. If the Form 3926 attempting to post did not match the Defect Code File on Defect Code or the Defect Code Number is non-numeric, withdraw the Form 3926 that matches the Error Register, correct the information and reinput the form to the Cumulative Data File.

  10. If the Form 3926 is invalid because the defect code is not established on the defect list, prepare a Form 3927 with Action Code "A" to add the defect code and description to the Defect Code File.

  11. If the Form 3926 is not a QA Adjustment Record and the Documents Sampled, Number of Defects and/or Defective Documents fields contain non-numeric characters, prepare a new form with the correct information and reinput to the Cumulative Data File.

    1. Do not record a Zero in items 11, 14 or 16 of Form 3926 if the Number of Documents Sampled, Number of Defects or Number of Defective Documents is less than one.

    2. Prepare a new form leaving these fields blank and reinput to the Cumulative Data File.

  12. Do not enter an SSN in item 10 of Form 3926 when an adjustment or error correction is input to the Cumulative Data File.

  13. If a Form 3926 was prepared to adjust one or more of the cumulative totals for the fields: Documents Sampled, Number of Defects and/or Defective Documents and one or more of the fields after adjustment contains a negative balance, no action is necessary other than to determine if the right action was taken originally. However, if the action taken was incorrect, prepare and reinput a new Adjustment Form 3926 to the Cumulative Data File.

  14. If Defective Documents are higher than Documents Sampled or Number of Defects, withdraw the Form 3926 that matches the Error Register, prepare a new form and reinput to the Cumulative Data File.

  15. If more than one Adjustment Form 3926 is input to the same OFP with entries in Organization Code and Program Code, and one or more of the fields (Documents Sampled, Number of Defects, and/or Defective Documents) contains an alpha character, combine the amounts to be adjusted into one total entry for Documents Sampled, each Defect Code, and Number of Defective Documents. Reinput the Adjustment Form 3926 to the Cumulative Data File.

3.30.28.4.9.3  (03-01-2007)
Error Codes/Descriptions Invalid Listing F3927

  1. The numeric Error Codes and their associated descriptions which are generated on the Invalid Form 3927 Listing (PCE–01–43), are as follows:

    Error Code Description
    01 No longer used.
    02 Action-Code invalid.
    03 Defect-Code not numeric.
    04 Action-Code "A" used to add new Defect-Code into defect list.
    05 Action-Code "C" used to change a Defect-Code which is not currently shown; or Action-Code "D" used to delete Defect-Code data which does not now exist.
    06 Action-Code is not "A" and the Defect-Code is not on file.
    07 Defect-Description must be present for this action-code.
    09 Insufficient data to process request (Action-Codes "C", "A", "D").
    10 Duplicate request on this transaction.
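The 3927 checks can be sketched in the same illustrative way (the argument names and the set of codes on file are assumptions, and only a subset of the error codes is shown):

```python
def validate_3927(action: str, defect_code: str,
                  description: str, on_file: set) -> list:
    """Return the error codes from the table above that one Form 3927
    transaction would trip (subset shown; names are illustrative)."""
    errors = []
    if action not in ("A", "C", "D"):
        errors.append(2)   # 02 Action-Code invalid
    if not defect_code.isdigit():
        errors.append(3)   # 03 Defect-Code not numeric
    if action in ("C", "D") and defect_code not in on_file:
        errors.append(5)   # 05 change/delete a code not currently on file
    if action in ("A", "C") and not description:
        errors.append(7)   # 07 Defect-Description must be present
    return errors
```

For example, an Action Code "D" for a defect code not on the Defect Code File would return [5].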

3.30.28.4.9.4  (03-01-2007)
Resolve Errors on Invalid F3927 Listing

  1. The field headings appearing on this list and the valid characters for each field are as follows (see Figure 3.30.28–20):

    • Defect List—three numeric characters

    • Action Code—one alpha character: A, C, D

    • Defect Code—three numeric characters

    • Defect Description—alpha, numeric, blank, or special characters

    • Error Code—two numeric characters

  2. Refer to the Defect Code File to aid in correction of the invalid records. See IRM 3.30.28.6.3 for further information about the Defect Code File.

  3. These steps are applicable for all of the Error Conditions. To resolve errors on the Invalid Form 3927 Listing:

    1. Search Forms 1332, Block and Selection Records, with Format Code 223 in the DLN (Document Locator Number) box for each of the Error Conditions.

    2. Then search the blocks of Forms 3927 that match the Error Register.

    3. Compare the Action Code and date shown on Form 1332 to information on the Invalid 3927 Listing.

    4. Examine the Invalid Form 3927 Listing and take the appropriate action specified for each Error Condition Code.

    5. Prepare a new Form 3927 or reinput data to the Defect Code File.

    Figure 3.30.28-20


    Invalid Form 3927 Listing

