3.0.275  Business Results Measures for Submission Processing Functions

Manual Transmittal

September 11, 2014

Purpose

(1) This transmits the revised Internal Revenue Manual (IRM) 3.0.275 General, Business Results Measures for Submission Processing Functions.

Material Changes

(1) Editorial changes have been made throughout the IRM

(2) IRM 3.0.275.1.6(4) updated Business Results Measures Communication of Defective Case(s)

(3) IRM 3.0.275.4.2.1 updated Form 8009 to Form 8009-A

(4) IRM 3.0.275.5.5 updated reports for the Deposit Error Rate (DER)

(5) IRM 3.0.275.5.5.1 DER Raw Error Rate Quality Report updated

(6) IRM 3.0.275.5.5.2 changed to Top 5 What and Top 5 Who codes

(7) IRM 3.0.275.5.5.3 DER Summary reports updated

(8) IRM 3.0.275.5.5.4 deleted section, Top 5 What/Who codes, combined with IRM 3.0.275.5.5.2

(9) IRM 3.0.275.6.6.3(5) updated and additional instructions for Refund Error Rate

(10) IRM 3.0.275.7.1 updated NER Sample

(11) IRM 3.0.275.11.1(1) updated timeframe from 20 days to 10 days

(12) IRM 3.0.275.11.2 updated chart for Reject Correspondence Timeliness cut-off and due dates

(13) IRM 3.0.275.11.5.5(1) updated timeframe from 20 days to 10 days

(14) IRM 3.0.275.11.6(2) update criteria for sending feedback

(15) Exhibit 3.0.275-2 updated SMART Database cutoff and report due dates

(16) Exhibit 3.0.275-11 removed CP 259E from NER CP Chart

(17) Exhibit 3.0.275-12 updated NER Cycles for FY15 Monthly Volumes

(18) Exhibit 3.0.275-13 updated NER Monthly Cut-off Chart for FY15

(19) Exhibit 3.0.275-17 updated NER EXPOE Codes

(20) Exhibit 3.0.275-23 updated Refund Interest Processing cycles for FY15

Effect on Other Documents

IRM 3.0.275 dated September 19, 2013 (effective October 1, 2013) is superseded. The following IPUs are also incorporated into this IRM: IPU 14U0261 (issued 02/03/2014), IPU 14U0337 (issued 02/18/2014), and IPU 14U0993 (issued 06/11/2014).

Audience

Wage and Investment (W&I)

Effective Date

(10-01-2014)

Paul Mamo
Director, Submission Processing
Wage and Investment Division

3.0.275.1  (12-01-2003)
Overview of Submission Processing Business Results Measures

  1. This section is designed to provide procedures for program level reviews of:

    • Submission Processing correspondence

    • Submission Processing deposit activity

    • Submission Processing refund processing

    • Submission Processing notice generation

  2. This section also provides information about automated program level assessment of:

    • Submission Processing Center deposit timeliness

    • Submission Processing Center refund interest paid

    • Submission Processing Center productivity

3.0.275.1.1  (12-01-2003)
Purpose of Submission Processing Business Results Measures

  1. The purpose of the Submission Processing Business Results Measures program is to collect data that will provide a basis for measuring and improving our work products by:

    1. Identifying sources of error from processing systems, procedural instructions, center and taxpayer action or inaction.

    2. Identifying and analyzing defect trends.

    3. Recommending and submitting corrective action.

    4. Following up with reviews to ensure the corrective action was effective.

    5. Providing input to National Business Measure reports.

3.0.275.1.2  (12-01-2003)
Use of Submission Processing Business Results Measures Results

  1. Results of Submission Processing Business Results Measures reviews may not be used as the basis of evaluative recordation for bargaining team employees.

3.0.275.1.3  (10-01-2007)
Roles and Responsibilities in the Submission Processing Business Results Measures Program

  1. The success of the Submission Processing Business Results Measures Program depends on the participation of all of the following:

    • Wage and Investment Submission Processing Headquarters staff

    • Submission Processing Center Planning and Analysis Departments

    • Submission Processing Center Improvement Team Managers and Improvement Team Analysts

3.0.275.1.3.1  (10-01-2008)
Headquarters Roles

  1. Headquarters is responsible for issuing review guidelines and procedures for all Business Measure Improvement Analysis reviews.

  2. Headquarters will review the Business Results Measures Program as part of periodic reviews of Submission Processing programs.

    Note:

    When program reviews are conducted the Headquarters analyst will attempt to review cases that can be corrected within the sampled month or two-month period after the Monthly Report Run date. For Notice Error and Refund Timeliness and Error the Headquarters analyst will provide the Submission Processing Center (SPC) with a list of cases to be reviewed. The SPC will be required to request the returns and have available all documentation necessary to perform the review. In addition, there may be times when the Headquarters analyst will want to review cases that are past the cut off. This will generally involve cases worked during the peak filing season or those worked at the end of the fiscal year.

  3. Headquarters administers the Submission Processing Measures Analysis and Reporting Tool (SMART) database.

  4. Headquarters will maintain the Business Results Measures section of the Submission Processing Home page.

3.0.275.1.3.2  (10-01-2009)
Improvement Team Manager

  1. Improvement Team Managers must maintain the integrity and quality of the Business Results Measures program by monitoring and reviewing, each month, a sample of all work assigned to each Improvement Team analyst. This review must include cases on which errors have been identified and cases coded as perfect cases. Improvement Team Managers will work with local functional management to arrange for manual sampling and case retrieval when necessary. In addition, a quarterly managerial review will be conducted on the technique used in pulling the sample for each measure. The manager will keep a log of the reviews conducted for the measures.

  2. Improvement Team Managers will train their clerical staff to use valid sampling techniques and review monthly the daily sampling log used by the clerical staff to ensure valid sampling was performed. See IRM 3.0.275.1.5(5)d.

  3. Improvement Team Managers must inform Headquarters of any problems encountered in obtaining required sample sizes during a review period.

  4. Improvement Team Managers and their staff determine the causes that adversely affect quality and timeliness by identifying error trends. They recommend corrective action to functional areas and, if the recommended corrective action is implemented, perform a follow-up review.

  5. Improvement Team Managers will establish a designated Functional Contact for each impacted functional area for each of the Business Results Measures. Improvement Team Managers will ensure that sample cases that require rework are forwarded to the Functional Contact timely and rebuttals are returned timely.

  6. Improvement Team Managers will arrange for the analysts to attend functional training for areas that impact the Business Results Measures. For additional courses available to Improvement Team managers, visit the Enterprise Learning Management System (ELMS) at http://elms.web.irs.gov.

  7. Improvement Team Managers must ensure that IRM 10.2.14(2), Method of Providing Protection, the "Clean Desk Policy," is followed. All taxpayer information must be in a lockable container during non-duty hours.

3.0.275.1.3.3  (10-01-2007)
Improvement Team Analyst

  1. For all Business Results Measures Improvement Team reviews, Improvement Team Analysts will thoroughly review sample cases using the guidelines in this section.

  2. Improvement Team Analysts perform an unbiased, consistent, and accurate review of all Business Results Measures sample cases.

  3. Improvement Team Analysts should provide their manager with:

    • Any cases identified for rework

    • Regular analysis of error trends

  4. Improvement Team Analysts will:

    1. Review work against established IRM procedures (IRM 3.0.275.1.9).

    2. Apply consistent review and case analysis techniques.

    3. Attend functional training of the areas that impact the Business Results Measures.

    4. Have a good working knowledge of the functional areas and programs they review.

    5. Record complete review results using the SMART database and review the database for consistency in coding.

    6. Report all problems in sampling, case review, and source document availability to the Improvement Team Manager.

    7. Coordinate with their manager if additional training is necessary (e.g., Computer basics).

3.0.275.1.4  (10-01-2013)
Submission Processing Business Results Measures Research Tools

  1. The following list is not all inclusive, but it provides a foundation of the most frequently used research tools for Business Results Measures reviews.

    • The primary references are the chapters of IRM Part 3, Submission Processing. All functional procedures and guidelines related to Submission Processing functions are found in Part 3. (Notice Review procedures are found in IRM 3.14.1, IMF Notice Review and IRM 3.14.2, BMF Notice Review.)

    • Integrated Data Retrieval System (IDRS) and Corporate Files On-Line (CFOL) Command Code instructions are found in IRM 2.3, IDRS Terminal Response, and IRM 2.4, IDRS Terminal Input.

    • The Submission Processing Business Results Measures Program encompasses a wide range of functions. Each Submission Processing Improvement Team should establish a library or have access to all necessary IRMs. Improvement Teams should also have access to the Program Requirements Packages (PRPs) for Master File programming and Functional Specifications Packages (FSPs) for Submission Processing Center processing.

    • Any IRS publication can be cited as a reference source. Procedural instructions, particularly when looking for the source of taxpayer error trends, can be found in the publications.

    • The Submission Processing Home Page on the IRS Intranet contains links to a variety of sources of reference material. Access the home page at:
      http://win.web.irs.gov/sp/

    • "Statistical Abbreviations" is a website that contains definitions for acronyms and abbreviations. Access it at: http://rnet.web.irs.gov/Other/acronymdb.asp

    • The official method for communicating IRM Part 3 changes, clarifications, and corrections is through Servicewide Electronic Research Program (SERP) Alerts.

      Note:

      Do not charge defects related to IRM or procedural changes until 7 calendar days after the SERP Alert posting date of the IRM update or change. However, additional time may be granted if nationwide training is necessary to implement the change. Until 7 calendar days after the SERP Alert posting date of the IRM update or change, provide feedback to local functional areas instead of charging defects.

  2. The Submission Processing Home Page contains a section specifically for Business Results Measures information, which includes:

    Data Dictionaries - Includes in-depth workload/performance indicators for each of the Individual Master File (IMF) and Business Master File (BMF) Business Results Measures.

    General Issues - Contains the cut off and due dates of the Business Results Measures reports.

    Conference Notes - Minutes from the Business Results Measures meetings.

    Business Results Measures Templates - Maintains the results for IMF and BMF.

  3. When reviewing sample cases for Business Results Measures:

    • Do not use training material, locally created job aids that deviate from the IRM, local processing agreements, or commercial tax publications.

    • In conducting reviews, you may need access to other automated systems. Your manager is responsible for contacting the appropriate systems administrators to grant access rights.

  4. When NEW Letters or Notices are added to the Correspondence Error Rate (CER) Measure, a Courtesy Review is conducted before incorporating the new product into the measure. The length of the review will be determined by the Headquarters analyst assigned to the measure.

    1. The first step taken by the HQ analyst for this Courtesy Review is to contact IT to have the new product added to the sample (ensure it is added before the new month's review begins).

    2. Prior to conducting the Courtesy Review, contact Statistics of Income (SOI) and let them know your plans. During the Courtesy Review, SOI will not incorporate the volumes associated with the Courtesy Review into the skip interval calculation, because the results will not be included in CER.

    3. The normal skip interval can be used to select cases for both CER and the Courtesy Review, but the Courtesy Review cases and their associated volume (do not include in the Volume of Universe) must be dropped from the sample for CER.

    4. When the Courtesy Review is completed, contact SOI to inform them that you will be adding the new notice(s) or letter(s) to the CER measure. If notices are being added, SOI will start including the volumes for the new notice(s) in the skip interval process. If letters are being added, work with SOI to develop a methodology for estimating the volume for the new letters on a monthly basis for a 12 month period.

    5. Once the new notice(s) or letter(s) have been incorporated into the CER measure, they should not be removed.

    Note:

    This excludes new systems that do not provide new letters or notices.

  5. To assist in communication and sharing of Measures data, the Headquarters Measures analysts established a centralized Nationwide Enterprise Resource Domain (NERD) shared drive in 2005. This provides a place where emails, documents, and historical Measures data can be stored for easy access by all analysts and champions (Headquarters and SPC) working with Measures. The Headquarters analysts are not required to use this site; however, the Letter Error, Notice Error, Refund Interest, and Productivity measures use it extensively. Managers need to contact Joan D. Williams or Darlene Ammer via email to obtain the steps needed to access the Wage and Investment (W&I), Submission Processing Business Measures NERD Shared Folders.

3.0.275.1.5  (10-01-2013)
Submission Processing Business Results Measures General Sampling Guidelines

  1. Improvement Teams are responsible for ensuring that output from automated sampling runs for Business Results Measures programs are received on a timely basis. If you do not receive an expected sample run, open a Knowledge, Incident/Problem, Service Asset Management (KISAM) ticket following local procedures. Inform the Headquarters analyst assigned to the program if you continue to experience problems receiving sample run output.

  2. Improvement Team Managers are responsible for contacting the appropriate Headquarters analyst immediately when they identify problems with an automated sample run (for example, the sample size is much smaller or larger than expected).

  3. Improvement Teams are responsible for manually sampling cases for the Deposit Error Rate Measure, non-IDRS correspondence for the Letter Error Rate Measure and closed cases from the Notice Review area for the Notice Error Measure. The Headquarters analyst responsible for the measure will provide a monthly sampling plan. The Improvement Team manager is required to perform a bi-annual review of the Improvement Teams to ensure the procedures are being followed and to see if updates are needed.

  4. If a decision is made at your SPC to have a functional area (other than the Improvement Team) pull the manual sample (other than for the Deposit Error Measure; see IRM 3.0.275.5.2 for sampling guidelines), the Improvement Team must provide the Manual Sampling Procedures to the functional area. The Improvement Team is required to perform a bi-annual review of the functional area's sampling to ensure the procedures are being followed and to see if updates are needed.

  5. To ensure a reliable and valid sample, you must follow the instructions below when manually sampling cases for Business Results Measures.

    1. Every item that is subject to sampling must have an equal chance of being selected for review. For example, all non-IDRS correspondence subject to Letter Error Rate review must be available for sampling.

      Note:

      The manually pulled sample should be selected after Quality Review has pulled their sample. This ensures the functional area had the opportunity to correct the return before the Improvement Team samples it.

    2. On the first day of the sampling period, use the random start number to select the first sample case. Use the skip interval to select subsequent documents for review. Use the random start number only at the beginning of the sampling period, even if the skip interval changes during the period.

    3. Begin each day's count with the remaining count that followed the last document selected on the previous day. For example, your skip interval is eight. On Monday, when you selected the last case for review, there were five cases remaining. Begin Tuesday's count at six cases. In effect, you are choosing the third case from Tuesday's work. However, that case is the eighth case in the skip interval sequence.

    4. Keep a daily sampling log for each measure showing the total number of cases available for review, the skip interval you used, the number of cases sampled, and the number remaining after the last case was selected.
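The systematic sampling procedure above (a random start number, a fixed skip interval, and a count carried over from one day to the next) can be sketched in code. This is an illustrative example only, not official IRS tooling; the function name and inputs are hypothetical.

```python
def systematic_sample(daily_batches, skip_interval, random_start):
    """Return (day, position_in_day) pairs for each sampled case.

    daily_batches -- list of daily case counts, e.g. [13, 10]
    skip_interval -- select every Nth case after the first selection
    random_start  -- position of the first selected case; used only
                     at the beginning of the sampling period
    """
    selected = []
    countdown = random_start
    for day, volume in enumerate(daily_batches, start=1):
        for position in range(1, volume + 1):
            countdown -= 1
            if countdown == 0:
                selected.append((day, position))
                # Restart the count after a selection; leftover counts
                # at day's end carry into the next day automatically.
                countdown = skip_interval
    return selected
```

With a skip interval of eight, a random start of eight, and 13 cases on Monday, the last Monday selection leaves five cases remaining, so Tuesday's third case is the next selection, matching the example in (5)3 above: `systematic_sample([13, 10], 8, 8)` returns `[(1, 8), (2, 3)]`.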

3.0.275.1.6  (10-01-2014)
Business Results Measures Communication of Defective Case(s)

  1. This section applies only to sample cases that are included in the computation of the Business Results Measures.

  2. For all Business Results Measures, if a repeating defect is identified, all defects must be coded. An example: A tax examiner did not follow procedures. If additional cases are pulled that have been worked by the same employee and contain the same defect, all cases will be coded for that defect.

  3. If the Improvement Team analyst did not define the error correctly, or charged the error erroneously and a subsequent error was found upon further review, the functional area will be charged the new or different error.

  4. When a defect is identified, the functional area responsible for correcting the case will receive a Communication Record. No response is required if you agree with the defect. See Exhibit 3.0.275-1. The format of this Communication Record is left to the discretion of individual SPCs, but must contain the following information:

    1. Case Identification: case number, date the case was reviewed, response date (3 business days beginning the day after the review date), functional area responsible for defect, and copy of the defective case.

      Note:

      The correction of the case should be made within two weeks of receipt. The Improvement Team will monitor for correction and, if the case is not corrected, contact the functional area.

    2. Research Information: indicate IRM references (if applicable) for defect(s) identified and defect description (a concise description of the defect(s) identified).

    3. Functional Contact Representative: If the Functional Contact representative disagrees with the Communication Record, it must be returned by the response date on the Communication Record. If additional time is necessary, the Functional Contact must contact the Improvement Team analyst or Improvement Team manager to request an extension (this additional time can be no longer than 5 business days after initial response date). The Functional Contact will include the date of review, their signature, phone number, comments, and backup to support the rebuttal.

      Note:

      Depending on local procedures, the Improvement Team analyst may share the case with the functional area prior to inputting case data into the SMART database. If a defect is identified at the end of the cut-off time for SMART input, you must contact the Functional Contact to request a quicker turnaround time.

  5. If the Functional Contact agrees with the defect, it is their responsibility to ensure these defects are shared with the individual employee and, if a trend is identified, with the functional area. For all measures, arrangements must be made for an area other than the Improvement Team to complete the correction to Master File when Deposit, Tax Period, Master File Tax, Name Lines, Received Date, Taxpayer Identification Number/Social Security Number, Tax, Credits, Address, etc., are affected. For all measures, each SPC must develop a Defect Monitoring log to track all defects, ensuring errors needing correction have been corrected within a two-week period. Document the action taken and the date closed. This log must be available upon request from Headquarters. If the correction has not been made within two weeks, the Improvement Team analyst must follow up. Any corrections not made within three weeks should be referred to management.

  6. If the Functional Contact does not agree with the defect(s) the Improvement Team analyst identified, and the Improvement Team analyst agrees with the rebuttal, edit the database to reflect the change. Communicate your agreement to the Functional Contact.

  7. If the Functional Contact does not agree with the Improvement Team defect(s), and the Improvement Team manager does not agree with the rebuttal from the Functional Contact, the Improvement Team manager will forward the case(s) to the Headquarters (HQ) analyst responsible for the measure using the Improvement Team Review Defect Rebuttal Procedures. See IRM 3.0.275.1.7. The case(s) submitted to the HQ analyst must include, if applicable, all IRM references, etc. used by the Functional Contact and Improvement Team.

    Note:

    The Improvement Team manager will inform the Functional Contact that the rebutted case(s) will be forwarded to HQ for resolution. Any disputed defect removed without following the Defect Rebuttal Procedures should be faxed or mailed to the Measures Headquarter Analyst. Remember, disclosure procedures should be followed when faxing or sending information. See IRM 3.0.275.1.10 for additional information that is required on the monthly narratives sent to HQ.

  8. Provide Taxpayer error information to your Local Communication Office (e.g., Certified Public Accountant (CPA) trends, Taxpayer trends, Tax Preparer trends).

3.0.275.1.7  (10-01-2013)
Improvement Team Review Defect Rebuttal Procedures

  1. Occasionally, a defect rebuttal case may not be resolvable at the SPC level (i.e., among all who have a stake in resolving the case). If this happens, the Improvement Team Manager will refer the disputed case to the Headquarters Measures Analyst responsible for the case's product line.

  2. The referral must contain the Document Collection Instrument (DCI) number of the disputed case and an analysis of the defect from the Improvement Team analyst and the functional area. You must provide any research material (e.g., IRM references) you relied on when originally reviewing the case.

  3. All disputed cases should be resolved prior to the monthly cutoff date. See Exhibit 3.0.275-2. for cut-off dates and report due dates for all Business Results Measures. SPCs should make every effort to ensure cases sent to Headquarters Measures Analyst are received no later than the Monday prior to the monthly cutoff date. If the cutoff date has passed, the cases will still be sent to the Headquarters Analyst for resolution. These cases will then be captured in the cumulative results.

  4. The Headquarters Measures Analyst will respond to the SPC within five business days. The Headquarters Measures Analyst will consider both areas' statements and will make the final determination of whether or not a defect has been appropriately identified.

3.0.275.1.8  (10-01-2013)
Problem Reporting Instructions for Notice and Letter Error Rate

  1. If a systemic problem has been identified during the review process for Notice or Letter Error, complete the Problem Reporting Template (PRT) using the following steps. To print a copy of the PRT, go to the Submission Processing Home Page on the IRS Intranet, find the header "Programs and Information", click on "Notices and Letters", and select "Correspondence Problem Reporting Template." Access the home page at http://win.web.irs.gov/sp/.

    1. Enter correspondence type. This will most commonly be the Computer Paragraph (CP) or Correspondex (CRX) letter number. Leave this box blank when reporting an equipment problem.

    2. Enter the name and telephone number of the person who identified the problem. Headquarters may use this information if additional data about the problem is required.

    3. Enter the date and time that the problem was identified. The problem report tracking system uses the date and time to trigger various reporting and escalation actions.

    4. Provide the cycle(s) during which the problem is present, if applicable.

    5. For numbers 6a through 8, circle the appropriate answer.

    6. Provide a description of the problem being reported. Include information about the status of the problem and any actions taken to reduce the impact of the problem on taxpayers or other functions/operations. Provide an IRM reference when appropriate. In addition, provide the Improvement Team coding of the error. Before faxing, ensure you include TXMOD prints, IMFOLT/BMFOLT, ENMOD, and other supporting documentation. Do not sanitize the documents unless instructed; the pertinent information on the notice is needed for research.

  2. When the PRT has been completed for IMF or BMF Notices/Letters, email the PRT to the CER Analyst, Tom O'Hare (816–291–9719). Also, scan or fax the notice/letter and send it to the CER Analyst (fax 816–292–6261). Sanitize the notice before sending. The CER Analyst will submit the notice/letter to the Office of Taxpayer Correspondence (OTC) for changes/corrections.

  3. The Headquarters Measures Analyst assigned to Letter or Notice Error will review the PRT and coding. If the HQ analyst agrees with the PRT then the HQ analyst will share with all SPCs and post information from the PRT on the "Submission Processing Notice or Letter Systemic Error Chart." In addition, the HQ analyst will post the PRT to the "Shared Drive" for Notice or Letter Error.

  4. The HQ WEB Page owner will post the "Submission Processing Notice or Letter Systemic Error Chart" to the Business Results Measures section of the Submission Processing web site (http://win.web.irs.gov/sp/).

  5. The HQ Notice and Letter liaisons will be responsible for coordinating, monitoring, providing status updates, and tracking of the Notice/Letter PRTs to resolution.

  6. After the systemic error has been corrected then it will be removed from the "Submission Processing Notice or Letter Error Systemic Chart."

3.0.275.1.9  (10-01-2010)
Business Results Measures Review Guidelines

  1. When reviewing a sample case for a Submission Processing Business Results Measure, you may take into account local procedures that do not conflict with or deviate from the appropriate IRM instructions. Share all local procedures with the appropriate HQ IRM analyst. Requests for deviation from the IRM must be posted via the "SERP Alert" system and cannot be implemented until the IRM owner grants approval.

    Note:

    All approved "SERP Alert" procedures will be incorporated into the next revision of the appropriate IRM. If the procedure has not been incorporated within one year of the approval, then the IRM procedure is obsolete. See IRM 1.11.1, IMD Program and Responsibilities, and see IRM 1.11.2, Internal Revenue Manual (IRM) Process for additional information.

  2. The unavailability of any program (e.g., QRADD, Computer Assistance Review of Error Resolution System (CARE), 100 percent reviews, managerial reviews) that could improve the quality of work will not remove a sample case from the Business Results Measures reviews. If non-systemic errors are identified, they must be charged to the appropriate functional area, and not as Systemic, unless it can be proven that the quality system was directly responsible for the error. These reviews are enhancements to the initial processing of the return and are only tools to improve the results.

  3. When a defect is identified, the analyst must determine what type of defect to charge. The following definitions should assist in ensuring consistency of coding:

    • Accuracy (Non-Systemic) - Case was handled incorrectly by the functional area.

    • Accuracy (Systemic) - Defect occurred as a result of a Submission Processing programming problem (excludes any programs to improve the quality of the product), or incorrect/outdated IRM procedures. (Notice Error, Letter Error, Business Return Error, Refund Timeliness and Error only).

    • Professionalism (Non-Systemic) - Minor defects that do not affect the Accuracy of the information being sent to the taxpayer. These defects include incorrect punctuation, capitalization, and spacing defects made by the functional area. (Notice Error, Letter Error, and Refund Timeliness and Error only).

    • Professionalism (Systemic) - Defect occurred as a result of a programming problem, incorrect/outdated IRM procedures, or when a properly working system results in a less than ideal product. (Refund Timeliness and Error only).

      Note:

      The Notice and Letter Error measures will no longer code these types of errors. However, a Problem Reporting Template (PRT) will be required for every systemic professionalism error. Check the Systemic Chart to see if the error has been reported previously. If not, complete the PRT and submit it to HQ. The PRT's will be monitored by the HQ Measure owner until corrected. See IRM 3.0.275.1.8, Problem Reporting Instructions for Notice and Letter Error Rate, on completing a PRT.

  4. When coding any sample case for a Submission Processing Business Results Measure, DO NOT include sensitive data in the SMART database. This includes taxpayer names, phone numbers, addresses, and taxpayer identification numbers. This type of information is considered privileged, and including it would be unauthorized disclosure.

3.0.275.1.10  (10-01-2013)
SMART Database Cut-Off Dates and Report Information for All Business Results Measures

  1. For cases to be included in the monthly (period) report, the cut-off date for inputting these cases is the close of business on the 22nd day of the month following the end of the sample month. Therefore, the period rates shown may represent only a subset of the entire monthly sample and can only be considered preliminary in nature. To ensure that the monthly report represents the actual error rate for your center, enter as many cases as possible before the Monthly Report Run date. It is recommended to input the review information into the SMART database within one business day after coding.

    Note:

    In reviewing the Service Center Reports on SMART remember these reports are "real time" reports and continuously reflect changes made by the SPCs after the National reports are run on the 23rd. These reports change every time a case is loaded and only reflect results at the specific time they are viewed. See Exhibit 3.0.275-2.

  2. Cases input after the cut-off date will be included in the following month's cumulative rate (not the period rate) (e.g., June cases coded after the 22nd day of July will be reflected in the cumulative rate of the month they were input). Therefore, the only figure that includes all coded cases to date is the current month's cumulative.

    Note:

    In reviewing the National Reports on SMART remember these reports are generated on the 23rd of each month, providing a data "snapshot" at that moment. Adding data after the "snapshot" will not change the National reports for that month. Any case loaded after the National Report running will roll into the cumulative for the current month.

  3. For Notice Error, Letter Error, Deposit Error, and Refund Timeliness and Error, any updates/edits to the SMART database after the data cut-off date may be input for two months after the Monthly Report Run date. Additionally, for Deposit Error, Notice Error, and Letter Error, any additional cases from the original sample month may be added for two months after the Monthly Report Run date. However, no additional cases can be added for the Refund Timeliness and Error measures. When edits or additions are made after the Monthly Report Run date, they will be reflected in the cumulative for the month the updates/edits/additional cases were input. At the end of the two-month period, any cases not coded will be deleted from the database.

  4. Reports for the fiscal year must be finalized by the end of October. Therefore, all August and September updates/edits/additional cases must be input by October 22. The Fiscal Year Narrative report will be due four working days after the September Monthly Report narratives are due to Headquarters. The final report should also include a comparison of last year's results to this year's results: what issues, if any, occurred this year that impacted the measures' results; what actions the operations took to improve the measures (e.g., developed job aids, special task teams, action plans); and what actions the Improvement Team took to improve the measures.

  5. National Reports will be generated on the 23rd of the month following the end of the sample month data cut-off date and will be posted to the Submission Processing Web Page on or about the last business day of the month. See Exhibit 3.0.275-2.

    Note:

    The following are the times on the 23rd of each month at which the measures reports are automatically generated. All times are Central time (Austin time).
    4:00 a.m. - Notice Error Update
    4:00 a.m. (period data) and 4:30 a.m. (cumulative data) - Deposit Error Update
    4:00 a.m. - Business Return Error Rate
    5:30 a.m. - Refund Timeliness and Error Update
    4:00 a.m. - Deposit Timeliness Update
    6:00 a.m. - Letter Error Update
    6:15 a.m. - Correspondence Error Rate

  6. Narratives for IMF and BMF are due to Headquarters for all measures by Close of Business (COB) on the 1st business day after the Monthly Report run date. Send narratives to the Headquarters Monitoring Section Manager with a carbon copy to the Headquarters Measures Analyst assigned to the measure. The Business Results Measures Monthly Narrative Report will be used and must include/address the following:

    • SPC Goal

    • Period Rate

    • Cumulative Rate

    • Number of Errors

    • Number of Documents/Cases Reviewed

    • Number of cases still pending (Deposit Error, Letter Error, Notice Error, Business Return Error, and Refund Error) for the current month, and any other pending cases identified by the month the case(s) were pulled (e.g., current month April - 2 cases pending; prior month March - 3 cases pending).

      Note:

      The next month's analysis will address the results of the pending cases from the previous month(s) and any other cases still pending. Headquarters is concerned these cases may be more prone to error than the timely reviewed cases.

    • Number of cases closed due to insufficient data (Deposit Error, Letter, Notice, Business Return Error and Refund)

  7. Comments must include:

    1. For all measures, state whether the SPC goal was met or not met. If there is a change of seven percent (plus or minus) from the Corporate Monthly Cumulative Goal, provide the causes for the change (e.g., was it due to attrition, legislative changes, or procedural changes). Use the SMART database to assist in identifying trends for both systemic and non-systemic errors (both period and cumulative). If a trend is identified, explain its cause (e.g., was it procedural, training). Is the trend different from last year? If yes, explain the change. Contact your Operations to determine whether something happened that caused the SPC goal to be met or not met (e.g., a procedural change may have caused a higher error rate, or a new process may have improved the SPC rate). For the Productivity Measure, provide a description of the current month's in-depth analysis and the programs/areas identified/targeted for improvement. Describe the mechanics of the analysis performed since last month. What was identified? Are rates too low for a particular program? Which programs have the highest negative impact on the measure? Which programs have a positive impact on the measure? Which areas are targeted for improvement? If the Productivity goal was met, provide an explanation of the factors that assisted the SPC in reaching the goal. For the Deposit Timeliness Measure, provide a breakdown of cases reviewed (how many R-RPS, Transport, PPU, Field, Lockbox, or Non Transport).

      Note:

      It is important to analyze the data provided by SMART and capture any initiatives that your SPC has initiated.

    2. Provide information from the Operations on what they are doing to improve the measures' results. Provide the specific initiatives implemented to improve the work product. For example: Action Plan initiatives, Just in Time Training, Special Procedures, 100 percent reviews, Champions, task teams.

      Note:

      If the information provided by the Operations does not change from month to month, there is no need to repeat your narratives; just state "no change." However, address error trends or extraordinarily great performance, any new initiatives, and any previously submitted initiatives or actions that have been discontinued. This also applies to the Productivity Measure.

    3. Provide via an attachment any guides or other information that has been distributed by your SPC. For example: Quality Alerts, Weekly Reports. This also applies to the Productivity Measure.

    4. If cases are being closed due to insufficient data, no source document, or lack of back-up material, please elaborate on why the data is not available, and what is being done to address the issue. This item does not apply to the Productivity Measure.

    5. Address open cases from the previous month by supplying the results of these cases (e.g., 10 open cases, 8 correct, 2 errors - Errors included). This item does not apply to the Productivity Measure.

    6. All rebutted/disagreed cases must be captured in the comment section of the Monthly Narrative Report. Include the number of cases rebutted/disagreed, the number of defects charged/removed, and the reason why the defect was charged/removed. In addition, a file must be established that will include all cases that cannot be agreed upon by all parties (e.g., Improvement Team manager, Improvement Team analyst, Operations, Planning and Analysis (P&A)). Monthly, a copy of these cases must be either faxed or mailed to HQ for review. This will ensure that the determinations made are consistent with the intent of the measure. This item does not apply to the Productivity Measure.

    7. Provide any additional information that will assist HQ in reporting to upper management. For the Productivity Measure, also include any reporting issues that impact the results. Provide a list of reporting issues or errors that impact Productivity data. Since the source for productivity is the PCC-6240 Work Planning and Control (WP&C) Program Analysis Report, any WP&C reporting issues should be evaluated for their impact on the Productivity Measure. An example for Productivity volume is Unit Production Card (UPC) reporting problems, especially if UPC production volumes did not get input. An example for hours is when the Form 3081 time for new hires does not get into the WP&C in the appropriate week due to timing with HRConnect, and the Reports Staff has not had the opportunity to input the adjustments into the correct month.

      Note:

      To print a copy of the Business Results Measures Monthly Narrative Report, go to the Submission Processing Home Page at http://win.web.irs.gov/sp/ on the IRS Intranet, go to Programs, select Measures, then select "Business Results Measures Reports," then select Balance Measures Template Monthly Report.

  8. Five full years of data will be kept in the database. After five years, the data will be stored on an Austin storage server.
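The cut-off rule in (1) and (2) above can be sketched as a small helper. This is an illustrative sketch only; the function name and date handling are assumptions, not part of the SMART database.

```python
from datetime import date

def report_bucket(sample_month: date, input_date: date) -> str:
    """Determine which SMART report a coded case lands in.

    A case from a given sample month makes that month's period report
    only if it is input by COB on the 22nd of the following month;
    otherwise it rolls into the cumulative for the month it was input.
    """
    # The cut-off is the 22nd of the month after the sample month.
    year = sample_month.year + (sample_month.month // 12)
    month = sample_month.month % 12 + 1
    cutoff = date(year, month, 22)
    if input_date <= cutoff:
        return f"period report for {sample_month:%B %Y}"
    return f"cumulative for {input_date:%B %Y}"
```

For example, a June case coded on July 30 would fall into the July cumulative rather than the June period report, matching the rule in (2).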

3.0.275.1.11  (10-01-2013)
Submission Processing Business Results Measures Records Retention

  1. The IMF and BMF Improvement Teams must retain, on site, six months of case files for each Business Results Measure. After six months, the case files can be sent to the Record Retention area. See IRM 1.15.29, "Records Control Schedule for Submission Process Center Records," Exhibit 1, Number 20, Quality Review Forms, Reports and Records (1)(a), as a guideline/reference for retention of the Business Results Measures data. On the paperwork to be submitted, under "sample control," indicate DCI forms as the type of record. DO NOT send case files to the Federal Record Center. The Record Retention area can destroy these case files after one year. If an SPC is ramping down, contact Headquarters for direction on destroying cases prior to the SPC shutdown.

  2. Required Documentation for Business Results Measures Case Files:

    • Letter Error - each case file should contain a copy of the DCI, a copy of the return or source document, and a copy of all associated research.

    • Deposit Error - each case file should contain a copy of the source document(s) and posting document. A copy of the DCI is optional.

    • Refund Timeliness and Error - case coded as perfect - case file should contain a copy of the DCI, IMFOL prints, and any additional documentation to support your coding decision. Case coded as not perfect - each case file should contain a copy of the DCI, a copy of the return, IMFOL prints, and any additional documentation to support your coding decision.

    • Notice Error - case coded as perfect - case file should contain a copy of the DCI, a copy of the CP notice, and any other documentation to support your coding decision. Case coded as not perfect - case file should contain a copy of the DCI, a copy of the CP notice, a copy of the return, and any other documentation to support your coding decision.

    • Refund Interest - each case file should contain a copy of the DCI, a copy of the return or source document, and a copy of all associated research.

    Note:

    Supporting documentation does not necessarily mean that the entire return must be photocopied. Copies of all appropriate pages should be attached.

3.0.275.1.12  (10-01-2012)
Submission Processing Business Results Measures Time Reporting

  1. See IRM 25.8.1, OFP Codes Overview, for appropriate Work Planning and Control (WP&C), Organization, Function, and Program (OFP) time reporting codes for Business Results Measures work performed in the Submission Processing Improvement Teams.

  2. The OFP 880-08120 (Program Business Results Measures Review) with a 5th digit will identify the Business Results Measures reviewed:

    • Refund Timeliness and Error Rate - 880-08121

    • Deposit Timeliness and Error Rate - 880-08122

    • Letter Error Rate - 880-08123

    • Notice Error Rate - 880-08124

    • Refund Interest - 880-08125

    • Reject Timeliness Performance Indicator - 880-08127

  3. The OFP 880–08190 (Improvement Team-Clerical) with a 5th digit will identify the clerical time spent on each Business Results Measures:

    • Refund Timeliness and Error Rate - 880-08191

    • Deposit Timeliness and Error Rate - 880-08192

    • Letter Error Rate - 880-08193

    • Notice Error Rate - 880-08194

  4. All centers must use the OFP 990-59130 (Analyst-Management and Program analyst) with a 5th digit to identify time spent analyzing Productivity since this program does not require samples to be extracted for review. The 5th digit OFP:

    • Productivity - 990-59135

  5. All centers, if Functional areas are performing additional (or increased) reviews to improve Business Results Measures use OFP 880-08040.

3.0.275.2  (10-01-2013)
Entity Review for Letter Error, Refund Error, and Notice Error

  1. For Refund Error and Notice Error, when reviewing either the Entity or the Address, a defect that results from the Entity Index File response is charged as a systemic defect.

    Note:

    See IRM 3.0.275.6.6.2. Refund Error Rate Review, for information on the Entity Index File processing.

  2. Code an incorrect middle initial as a Professionalism defect.

    Note:

    For these measures, an omitted middle initial is not a defect.

  3. Code an incorrect designation of the name as an Accuracy defect (e.g., Jr., Sr., etc.).

    Note:

    Omission of "Jr." , "Sr." , "III" , etc., when it appears on the return is a Professionalism defect.

  4. Code any misspelled name as a Professionalism defect. Exception: If the first four characters of the taxpayer's last name (primary or secondary) are incorrect or transposed, code as an Accuracy defect (e.g., SMIT input as SITM, or the notice shows the name line as "To H" when the taxpayer's name is Tony).

  5. If the Name Line exceeds the maximum number of characters (35) and the IRM instructions were followed, no defect will be charged. See IRM 3.12.3, Error Resolution - Individual Income Tax Returns, and IRM 3.13.5, Campus Document Services - Individual Master File (IMF) Account Numbers, for instructions covering changing/correcting the first name line.

    Note:

    Do not enter the full name of a taxpayer when the name line was incorrect. In the DCI comment field, state "name line incorrect."

  6. If reviewing a Second Name Line, use the following table to determine the appropriate coding:

    Review of Second Name Line
    If... | And... | Then...
    The source document/return does not show a 2nd name line | (n/a) | Review the 2nd name line on the notice/letter/return for obvious misspellings only.
    The source document/return shows a 2nd name line | The 2nd name line on the notice/letter does not match the source document/return, but contains the same information | Do not code as a defect.
    The source document/return shows a 2nd name line | The 2nd name line on the notice/letter does not match the source document/return, and contains completely different information | Code as an Accuracy defect. For Notice Error and Refund Error, route a copy of the case to Entity for resolution.
    The source document/return shows a first and 2nd name line | The name lines are reversed but contain the same information | Do not code as a defect.
    The source document/return shows a 2nd name line | There is no 2nd name line present on the notice/letter | Code as an Accuracy defect. For Notice Error and Refund Error, route a copy of the case to Entity for resolution.
  7. Omission of "MINOR" when it is required by processing, or failure to transcribe it, will be considered a Professionalism defect.

  8. If Deceased (DECD) or "Estate of" is noted on the return, the IRM directive is to include these entries, and they were omitted from the name line, code as an Accuracy error. For Letter Error, if the filing status was married filing joint (MF 2) and one spouse is deceased, code an Accuracy error if the letter is not addressed to the surviving spouse.

3.0.275.3  (10-01-2008)
Address Review for Letter Error, Refund Error, and Notice Error

  1. Common abbreviations for "Street" , "Avenue" , "Road" , etc. are acceptable. Do not code as a defect.

  2. Abbreviation of the literal "Apartment" as "Apt." is acceptable. Do not code as a defect.

  3. If you find a discrepancy in designations of "St." , "Rd." , "Ave." , (e.g., return shows "123 7th St." , notice shows "123 7th Rd." ) perform research to determine whether both addresses exist. For the example above, if research indicates that both 7th St. and 7th Rd. exist in the taxpayer's city, code an Accuracy defect. Otherwise, code a Professionalism defect. See Exhibit 3.0.275-19. If you have access to the Internet, you can access a Post Office ZIP Code look-up screen at https://tools.usps.com/go/ZipLookupAction!input.action. If you do not have access to the Internet, you can use Post Office ZIP Code books to conduct research.


  4. Incorrect street number is always an Accuracy defect.

  5. Incorrect street name (other than minor spelling errors) is an Accuracy defect. Minor spelling errors are Professionalism defects.

  6. Incorrect or omitted apartment or suite number is an Accuracy defect (a defect that results from accessing the Entity Index File is charged as a systemic accuracy defect).

  7. If the street address's compass direction is incorrect or omitted, code as an Accuracy defect.

  8. Incorrect city (other than minor spelling errors) is an Accuracy defect. Minor spelling errors are Professionalism defects.

  9. Incorrect state is an Accuracy defect.

  10. Incorrect ZIP code is an Accuracy defect.

3.0.275.3.1  (10-01-2013)
FINALIST - United States Postal Service (USPS) Standardization Software

  1. The FINALIST program is standardization software used by the USPS to ensure addresses are valid and correct. FINALIST can also determine whether building numbers are valid.

  2. FINALIST is used to ensure the IRS’s outgoing mail, notices, tax packages, etc. comply with the USPS address standards.

  3. An address that is entered into IDRS and does not meet the USPS standards WILL NOT update and/or post with the input address.

  4. The USPS allows thirteen characters (including spaces) for city names. FINALIST will abbreviate city names if needed to reduce the number of characters to thirteen.

  5. FINALIST will also abbreviate street names. However, a list of abbreviations used by FINALIST is not available.

  6. If you see an address that appears to be an abbreviation of the street name or city name, you can research the USPS Zip Code look up screen at https://tools.usps.com/go/ZipLookupAction!input.action to see if the abbreviation is accepted by the USPS.
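As a minimal illustration of the thirteen-character limit described in (4), a reviewer-side check might look like the following. The function name is an assumption, and FINALIST's actual abbreviation rules are not published, so this sketch can only flag candidates for manual USPS ZIP Code look-up research.

```python
CITY_NAME_LIMIT = 13  # USPS limit for city names, including spaces

def city_may_be_abbreviated(city: str) -> bool:
    """Return True if FINALIST would need to shorten this city name.

    FINALIST abbreviates city names longer than thirteen characters
    (including spaces); since its abbreviation list is not available,
    longer names can only be flagged for research, not auto-corrected.
    """
    return len(city) > CITY_NAME_LIMIT
```

"SAN FRANCISCO" is exactly thirteen characters and would pass unchanged, while "COLORADO SPRINGS" (sixteen characters) would be flagged.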

3.0.275.4  (12-01-2004)
Letter Error Rate Business Results Measure

  1. The purpose of the Letter Error Rate Business Results Measure is to determine the percentage of incorrect letters issued to taxpayers by Submission Processing employees.

3.0.275.4.1  (10-01-2009)
Definition of Letter Error Rate

  1. Letter Error Rate is defined as the percentage of incorrect correspondence issued to taxpayers by Submission Processing employees. The number of letters, including Return and Income Verification Services (RAIVS) envelopes used to respond to Form 13873I and Form 13873B requests, that are inaccurate, unprofessional, or unclear is divided by the sample size to determine the percentage of inaccurate correspondence. Beginning in FY10, the return envelope used to send the taxpayer a response on RAIVS requests (13873I and 13873B) is included in the measure review, and errors will be charged appropriately if it is incorrect. The RAIVS envelope was added to the measure to ensure no unauthorized disclosures result from an incorrectly addressed envelope.

    Note:

    RAIVS 13873I and 13873B letters are the only correspondence product that includes a review of the envelope.

    Note:

    Professionalism defects are not included in the reports shared with the Upper Management Commissioner. For Fiscal Year (FY) 2002, IMF and BMF included systemic defects in the rates reported on the Commissioner's Monthly Report (CMR). For FY 2003, IMF continued to include systemic defects in the rates reported on the CMR. For BMF, systemic defects were not included in the rates reported on the CMR. Starting FY 2006, IMF and BMF will include systemic defects in the rates reported on the CMR. Letters are reviewed to identify any errors caused by the IRS including systemic errors. The completeness of letters (does the letter address all issues related to the source document) as well as the professionalism and clarity of the language is analyzed. Starting in FY 2007 a new Correspondence Error Rate Measure combining the results from the Letter and Notice Error Measures was introduced.
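The rate computation described in (1) reduces to a simple division; the sketch below is illustrative only (the function and parameter names are assumptions, not SMART field names).

```python
def letter_error_rate(defective: int, sample_size: int) -> float:
    """Percentage of sampled letters (and RAIVS envelopes) that are
    inaccurate, unprofessional, or unclear."""
    if sample_size <= 0:
        raise ValueError("sample size must be positive")
    return 100.0 * defective / sample_size
```

For example, 3 defective letters in a sample of 60 yields a 5.0 percent error rate.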

3.0.275.4.2  (10-01-2012)
Sample Instructions for Letter Error Rate

  1. The Letter Error Rate program uses a daily sampling of letters to determine a monthly error rate. The sample includes all IDRS letters and some non-IDRS letters issued by Submission Processing employees.

  2. The Letter Error Rate sample is automated by the CRX program. Access to Control-D reports is necessary to obtain the CRX and CRXR Sample Reports. Contact your functional coordinator if you do not have access to Control-D.

    Note:

    The CRXR reports, which contain cases for the sample, are restricted. If you need access, see your manager.

  3. The following reports are posted to Control-D daily and provide the Sample Reports and a Summary Report:

    • 01S45 (IMF Letter Detail Report)

    • 01S46 (BMF Letter Detail Report)

    • 01S44 (IMF and BMF Submission Processing Letter Controls)

      Note:

      The reports shown may be titled somewhat differently or have additional indicators in front of the designator for each report.

  4. Paper copies of the selected letters are printed and provided to the Improvement Team daily as part of the CRX run. CRX01S14 produces copies of the IMF letters, and CRX01S15 produces copies of the BMF letters. Coordinate the delivery of these letters to the Improvement Team with your local Enterprise Help Desk (IT) function.

  5. After obtaining the copies of the selected letters, an Improvement Team employee must visit the appropriate Submission Processing Center teams to find the associated return or case file. See IRM 3.0.275.4.3.1 if you cannot locate the document in the originating area.

  6. Photocopy the return or case file for use in the review process.

    Note:

    The original return or case file must always be maintained in the appropriate area.

3.0.275.4.2.1  (10-01-2014)
Letter Error Rate Sampling of Non-IDRS Letters

  1. The CRX sample program will only select IDRS letters. While Submission Processing functions also send other types of non-IDRS correspondence, the Improvement Team will sample only the following non-IDRS correspondence:

    • Form/Correspondence that is used to send a return back to the taxpayer (e.g., Form 3531, Request for Missing Information or Papers to Complete Return). These are usually referred to as "Greenies" .

    • Responses to Requesters for Tax Returns (e.g., Auto-Transcript letters/Word documents/other local letters). These are used by Return And Income Verification Services (RAIVS).

    • Tax return/return information that is sent back to the taxpayer by Receipt and Control (R&C).

  2. Greenies, RAIVS, Machine Services (Cincinnati Service Center (CSPC) only), IRS Individual Taxpayer Identification (ITIN) (Austin Service Center (AUSPC) only), and R&C cases must be manually sampled each day, and the functional area must make all Greenies, RAIVS, Machine Services (CSPC only), ITIN (AUSPC only), and R&C cases available for sampling. See IRM 3.0.275.1.5. for sampling guidelines.

  3. Improvement Team employees will manually sample the Greenies, RAIVS, Machine Services (CSPC only), ITIN (AUSPC only), and R&C using the skip interval provided by Headquarters. The following IRMs will assist you in determining the Greenies to review. In addition, see the "Business Results Measures for Submission Processing Functions," Job Aid 6804-701, for Letter Error, for a list of Non-IDRS letters.
    For IMF Greenies, see the following IRMs:

    • IRM 3.11.3, Individual Income Tax Return

    • IRM 3.11.6, Data Processing (DP) Tax Adjustments (Form 8009-A)


    For BMF Greenies, see the following IRMs:

    • IRM 3.11.13, Employment Tax Returns (Form 6800-SP and Notice 695)

    • IRM 3.11.14, Income Tax Returns for Estates and Trusts (Form 1041, Form 1041-QFT, Form 1041-N and Form 6800)

    • IRM 3.11.15, Return for Partnership Income

    • IRM 3.11.16, Corporate Income Tax Return (Form 6800)

    • IRM 3.11.17, Processing Form 1120X and Form 8842 (Form 6800)

    • IRM 3.11.23, Excise Tax Return (Form 6800)

    • IRM 3.11.106, Estate and Gift Tax Returns (Form 6800)

    • IRM 3.11.154, Unemployment Tax Returns, (Form 6800-SP)

    • IRM 3.11.212, Applications for Extension of Time to File, (Form 6401 and Form 6513)

    • IRM 3.11.213, Form 1066, U.S. REMIC Tax Return, (Notice 695)

    • IRM 3.11.249, Processing Form 8752, (Form 6800)


    For review of Receipt and Control (R&C) see:

    • IRM 3.8.44, Campus Deposit Activity

  4. Photocopy all Greenies, RAIVS, Machine Services (CSPC only), ITIN (AUSPC only), and R&C supporting documents. The original correspondence form and return must remain in the functional area for mailing.
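Skip-interval sampling as described in (3) can be sketched as follows. The interval value itself comes from Headquarters; the helper shown here is an assumption for illustration, not part of any IRS system.

```python
import random

def skip_interval_sample(cases, interval, random_start=True):
    """Select every Nth case from the day's workload.

    A random start within the first interval keeps the daily sample
    from always beginning with the first case received.
    """
    start = random.randrange(interval) if random_start else 0
    return cases[start::interval]
```

With a skip interval of 5 and a fixed start, a 20-case workload yields cases 1, 6, 11, and 16 (indexes 0, 5, 10, 15).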

3.0.275.4.3  (10-01-2010)
Case Review Instructions for Letter Error Rate

  1. Secure all research material before beginning case review. Due to the use of the CRX program for the automated sample, Command Codes QRIND and RVIEW can be used for Letter Error Rate review if the review is performed within two days from input. Use Control-D to determine if the letter was deleted or a duplicate by selecting from the Project List either CRX (two reports "Daily Letter Error List" or "Daily Delete Request List" ) or EOD (two reports "Duplication Transaction" or "Quality Review Index Transaction" ). In addition, use CC LLIST to see if a letter was created earlier in the day. Use Command Codes INOLE, ENMOD, TXMOD, IMFOL, BMFOL, BRTVU, and RTVUE when reviewing the taxpayer's letter and account, if needed.

  2. Review each sample letter against the source document and research that you have obtained. See IRM 3.0.275.2 and IRM 3.0.275.3 for Entity and Address Reviews. Determine the IRM requirements and procedures for the case you are reviewing using the following IRMs:

    • IRM 1.2.4, Use of Pseudonyms by IRS Employees

    • IRM 1.10.1, Office of the Internal Revenue, The IRS Correspondence Manual

    • IRM 1.10.2, Reference Material for Preparing Correspondence

    • IRM 2.3, IDRS Terminal Responses

    • IRM 2.11.1, IDRS Correspondence System

    • IRM 3.0.273, Administrative Reference Guide

    • IRM 3.8.44, Campus Deposit Activity

    • IRM 3.8.45, Manual Deposit Process

    • IRM 3.11 Series, Returns and Documents Analysis

    • IRM 3.12 Series, Error Resolution

    • IRM 3.13.2, BMF Account Numbers

    • IRM 3.13.5, IMF Account Numbers

    • IRM 3.17 Series, Accounting and Data Control

    • IRM 13.1 Series, Taxpayer Advocate Case Procedures

    • IRM 13.1.3, Definition of Terms/Use of Abbreviations

    • IRM 3.5.20, Processing Requests for Tax Return/Return Information (RAIVS)

    • IRM 21.3.3, Incoming and Outgoing Correspondence/Letters

    • IRM Servicewide Electronic Research Program (SERP)

  3. Use other IRMs and documents in conjunction with the IRMs shown above, if necessary. For example, Command Code LETER instructions can be found in IRM 2.4, IDRS Terminal Input.

  4. Refer to the IDRS Letter Correspondex to determine if the appropriate letter and selected paragraphs were used.

  5. Review each RAIVS return envelope against the source document (Form 4506, "Request for Copy of Tax Return," or Form 4506-T, "Request for Transcript of Tax Return"). Compare the address on the envelope against lines 1a, 2a, and 3 of Form 4506/4506-T for correctness. Taxpayers may also use line 5 to have the information mailed to a third party.

3.0.275.4.3.1  (10-01-2005)
Source Documents for Letter Error Rate

  1. In some instances the source document or return associated with the sample letter will not be available when you search for it in the functional area. If this is the case, you must attempt to obtain it for review.

  2. If there is no Document Locator Number (DLN) or if the document is in Suspense Team, search the suspense file or appropriate team for the source document. Continue to search periodically for 25 days. If the return is unavailable after 25 days, review the letter following the procedures in IRM 3.0.275.4.4.3 and enter a check mark in the Case File Unavailable field of the Letter Error Rate Data Collection Instrument (DCI).

  3. For source documents sent to Files, prepare a document request following local procedures. If the charge-out indicates the document is charged out to another area, contact that area to get a copy of the document. If the charge-out indicates the document is not in the file, review IDRS to see if the DLN was re-numbered. If, after thorough research, you cannot locate the document and at least 10 days have passed, input a second request using CC ESTABD "V." If the document is not received within 15 days, review the letter following the procedures in IRM 3.0.275.4.4.3 and enter a check mark in the Case File Unavailable field of the Letter Error Rate Data Collection Instrument (DCI).

  4. For letters associated with an E-File return, request the Electronic Filing System (ELF) print. If the ELF print is not received within 25 days, review the letter following the procedures in IRM 3.0.275.4.4.3 and enter a check mark in the Case File Unavailable field of the Letter Error Rate Data Collection Instrument (DCI).

  5. To ensure a thorough review of all sample cases, secure source documents as often as possible. In cases where you are sure that the source document cannot be secured (e.g., case backup information has been destroyed), it is not necessary to follow the request procedures shown above.

  6. If an erroneous letter (Accuracy error only) needs to be deleted, input Command Code (CC) RVIEW (response in bottom left hand corner is CC QRACN). This CC will allow you to input a number that will delete the letter. See IRM 2.4.5, IDRS Terminal Input Command Codes QRADD, QRADDO, QRNCH, QRNCHG, RVIEW, QRACN, and QRIND for the Quality Review System.

3.0.275.4.4  (01-01-2002)
Data Input Instructions for Letter Error Rate

  1. Input the results of your case review into the SMART database.

3.0.275.4.4.1  (10-01-2008)
Letter Error Rate Data Input General Instructions

  1. Be as specific as possible when identifying an error. See Exhibit 3.0.275-3. for a check list for Entity, Letters, Greenies, RAIVS and Receipt and Control (R&C). Only three errors can be coded per letter; notate information on additional errors in the comments field.

  2. For Systemic Errors, "Greenies," RAIVS, or R&C review, use the following Who Code definers:

    • 000 - Systemic

    • 001 - Greenies

    • 002 - RAIVS

    • 003 - R&C

    • 004 - ITIN (AUSPC only)

    • 005 - MSU (Machine Services) (CSPC only)

      Note:

      The appropriate code 001, 002, 003, 004 or 005 must also be entered in the Team Number field.

  3. The CRX system will select letters generated by IDRS numbers 100 through 299. The ranges for each operation are:

    • Accounting Operation - IDRS Number Range 110–129 (plus 106 through 109 at CSPC only)

    • Director, Planning and Analysis, and Site Coordinator - 104 (Ogden Service Center (OSPC) only)

    • Receipt and Control Operation - IDRS Number Range 160–199

    • Document Perfection Operation - IDRS Number Range 220–249

    • Input Correction Operation - IDRS Number Range 250–279

    • Statistics of Income Operation - CSPC and OSPC only - IDRS Number Range 280–289

    • ITIN Operation - AUSPC only - IDRS Number Range 280–289

  4. If the letter you are reviewing was input by a Clerical Team but initiated by another team, see IRM 3.0.275.4.4.2. for RPT (Responsible Prior Team). This field will identify the team responsible for any error other than a clerical input error (for example, an incorrect or incomplete entry on the correspondence action sheet).

    Note:

    Headquarters does not require an RPT to be coded if there was no error on the letter. However, a decision could be made locally to code the RPT at all times for analytical purposes.

  5. If an incorrect IRS Received Date or Request Date is referenced in the Correspondence Received Date field, and the correct and incorrect dates are not greater than five calendar days apart, a non-systemic professionalism error will be charged.

  6. If a Systemic Professionalism error is identified during your review, do not code it. However, a Problem Reporting Template (PRT) is required for every systemic professionalism error. Check the Systemic Chart to see if the error has been reported previously. If not, complete the PRT and submit it to Headquarters. The PRTs will be monitored by the Headquarters measure owner until correction. See IRM 3.0.275.1.8, Problem Reporting Instructions for Notice and Letter Error Rate, on completing a PRT.

  7. Code Letter errors in the following priority:

    1. Team Accuracy

    2. RPT Accuracy

    3. Systemic Accuracy

    4. Team Professionalism

    5. RPT Professionalism

    Note:

      If a letter is not necessary, use the following coding: WHAT Code 107, WHERE Code 209, and TYPE A. Do not code additional defects.

  8. If, during your review, you identify that another functional area should have corresponded and did not, do not code for a defect but initiate feedback to the functional area.
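The five-calendar-day tolerance in (5) above lends itself to a simple date comparison. A minimal sketch; the function name and example dates are illustrative only and are not part of any IRS system:

```python
from datetime import date

def professionalism_error(correct_date: date, referenced_date: date) -> bool:
    """Return True when an incorrect date reference is charged as a
    non-systemic professionalism error: the dates differ, but by no
    more than five calendar days. Larger discrepancies are outside
    the scope of this rule and return False here."""
    days_apart = abs((referenced_date - correct_date).days)
    return 0 < days_apart <= 5

# Three days off draws a professionalism error; an exact match does not.
print(professionalism_error(date(2014, 4, 10), date(2014, 4, 13)))  # True
print(professionalism_error(date(2014, 4, 10), date(2014, 4, 10)))  # False
```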

3.0.275.4.4.2  (10-01-2007)
Letter Error Rate Data Collection Instrument (DCI) Input Instructions

  1. Open the Business Results Measures Home Page:
    http://balmeas.enterprise.irs.gov.

  2. Click on Letter Error Rate Tab.

  3. Letter Error Rate Home Page is shown.
    http://balmeas.enterprise.irs.gov/LtrErr/LtrErrHome.asp

  4. Click on the DCI Input Form key and follow the "Business Results Measures for Submission Processing Functions, Job Aid" (Catalog Number 37697B, Training 6804-701), page 2-25, step 4. To obtain a copy of the Job Aid, go to: http://coursebooks.enterprise.irs.gov. On the left-hand side, under Operation, place the cursor over Cross Functional; when the selections appear, select Business Measures to bring up Training 6804-701, then click on the training number.

  5. See Exhibit 3.0.275-3 and Exhibit 3.0.275-4. These exhibits contain a checklist and the What and Where Codes.

3.0.275.4.4.3  (10-01-2008)
Letter Error Rate Input Instructions when Source Document is not Available

  1. After following the instructions in IRM 3.0.275.4.3.1 for requesting and searching for source documents, use the following instructions when you cannot obtain the source document.

    1. Use the appropriate Command Code to review the entity portion of the letter for misspelling.

    2. Enter a check mark in the Case File Unavailable field.

    3. Review the letter content for the following factors.

      Note:

      There may be technical issues for which the source document is not necessary (e.g., DLN and Master File Transaction (MFT)).

      Letter Error Rate Review Items

      Note:

      Use only if the source document cannot be located.

      Contact Point and Telephone Number Provided
      Timeframes
      Attachments/Enclosures
      Authorized Disclosure
      Clear and Appropriate Language
      Capitalization
      Grammar
      Punctuation
      Salutation
      Spacing
      Spelling/Typographical
      Closing

      Note:

      Code and Edit is required to verify the Entity information on the return is complete and legible.

3.0.275.4.5  (10-01-2006)
Reports for Letter Error Rate

  1. Two sets of reports are available for Letter Error Rate (National and Service Center).

  2. The Service Center Reports are:

    1. List of Reviews by Review Date

    2. List of Reviews by Review Date with Monthly Subtotals by Sample Date

    3. List of Reviews by Sample Date with Summary of Accuracy and Professionalism Errors

    4. List of Reviews by Sample Date

    5. List of Errors by Sample Date

    6. Letter Error Rate Report by Operations (All Errors)

    7. Letter Error Rate Report by SPC (All Errors)

    8. Letter Error Rate National Report

    9. Letter Error Rate Report by Operations (Non-Systemic Accuracy Errors)

    10. Letter Error Rate Report by SPC (Non-Systemic Accuracy Errors)

    11. Top 5 Error Report

    12. Top 5 Error Report by Operation

    13. Top 5 Error Report by WHAT and WHERE

    14. Letter Error Adhoc Query by Sample Date

    15. Letter Error Adhoc Query by Review Date

    16. List of Open Cases

    17. Letter Summary of Who, What, Where Report by Operations

    18. Letter Summary of Who, What, Where Report by SPC

    19. Letter Summary of Who, What, Where Report by RPT

    20. Letter Summary of Who, What, Where Report by Team

    21. Letter Summary of Accuracy and Professionalism Errors by Letter Type

    22. Letter Summary of IDRS Letter vs. Non-IDRS Letters

    23. Summary of Accuracy and Professionalism Errors by Operation

    24. Fiscal Year Weighted Error Rates by Campus

  3. The National Weighted Reports are:

    1. Letter Error Report — All Errors

    2. Letter Error Report — All Errors Except Systemics

    3. Letter Error Report — Accuracy Errors

    4. Letter Error Report — Non-Systemic Accuracy Errors Only

    5. Letter Error Report — Professionalism Errors

    6. Letter Error Report — Non-Systemic Professionalism Errors Only

    7. Letter Error Rate — Weighted Roll up

    8. Summary of Open Cases

    9. Fiscal Year Weighted Rates by Service Center

    10. Fiscal Year Weighted Rates — National, W&I and SB/SE

3.0.275.5  (12-01-2004)
Deposit Error Rate Business Results Measure

  1. The purpose of the Deposit Error Rate Business Results Measure is to determine the number of inaccuracies created by the Submission Processing Center during remittance processing.

3.0.275.5.1  (10-01-2008)
Definition of Deposit Error Rate

  1. Deposit Error Rate is defined as the percentage of errors made by the Submission Processing Center during remittance processing. These errors result in the inaccurate processing of deposits and may have a negative impact on the taxpayer.

3.0.275.5.2  (10-01-2011)
Sample Instructions for Deposit Error Rate

  1. Improvement Team employee(s) will pull the sample from the deposit function. Generally, this is after Extracting and prior to leaving Pre-Batch (before the payment is perfected and input). Samples from the Field Offices and Lockbox can be pulled earlier in the process. It is important that the sample reflect the entire universe of remittances processed.

    Note:

    This may require that Improvement Team employee(s) be available for multiple shifts. The Improvement Team must go to Receipt and Control as often as necessary to get samples from all mail drops. During peak periods, we request that the Improvement Team make every effort to get samples from all mail deliveries and all shifts, and to include the weekends, if possible. If the weekends are included, the daily sample should be reduced accordingly so as not to substantially exceed the total monthly sample size.

  2. The sampling mix is determined by an individual SPC's actual history of receipts by month. The W&I Headquarters analyst will supply annually, for each SPC, a monthly mix of eight daily samples (monthly sample size of 168 divided by 21 work days) from the following areas: Remittance Processing System (R-RPS), Transport, Payment Perfection, Lockbox, Field Office, and Non-transport.

  3. Improvement Team employees will use the following procedures after the sample has been identified:

    1. When a case has been identified for the sample, mark the place from where the case was pulled. Ensure the case is returned to the original location.

    2. Photocopy all information needed to determine the application of the payments. Photocopy the remittance (check, money order, draft), all payment related source documents (voucher, bill, return, correspondence), and envelope, if applicable. Ensure that the copies are legible.

    3. Perfect any illegible data on the photocopies.

    4. If you discover the sample check is one of two or more remittances to be applied to the same source document (multiple remittance), photocopy all remittances received. Indicate the check to be reviewed.

    5. Ensure the IRS received date is present. If not, annotate the received date on the photocopy.

    6. Ensure the case was reassembled correctly and the live document is returned to its original location as marked.

    7. If your sample includes using a sample plan, ensure work already counted and sampled is not recounted by placing an "Improvement Team Counted" placard on it or by another method of identification that does not alter the live work.

    8. Optional: attach a paper Data Collection Instrument (DCI) to each case and enter the deposit amount, sample date, and IRS received date on the paper DCI.

      Note:

      If an SPC chooses not to prepare a paper DCI and an area wants a copy, the SPC will print the DCI from the SMART database.

  4. Do not include the following in your Accuracy sample:

    • State forms with checks made out to IRS or United States Treasury

    • Unacceptable payments (e.g., Savings Bonds, Credit Cards)

    • Non-negotiable checks

    • Voided checks

    • Completely blank checks

  5. Each Submission Processing Center Deposit Accuracy Coordinator is responsible for setting up any necessary local procedures for obtaining sample documents. Improvement Team employee(s) must review security procedures before accessing restricted areas in Receipt and Control. The Coordinator must review the availability of photocopy machines in the sampling area and arrange for additional photocopy facilities if they are needed. Improvement Team employee(s) may also require a workstation within Receipt and Control to complete their work.
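The allocation in (2) above is simple arithmetic: the fixed monthly sample of 168 spread over 21 work days yields 8 cases per day, and adding weekend sampling days reduces the daily pull, as the note to (1) directs. A sketch with an assumed helper name:

```python
# Fixed monthly sample size from the IRM text: 168 cases per month.
MONTHLY_SAMPLE_SIZE = 168

def daily_sample_size(sampling_days: int = 21) -> int:
    """Spread the fixed monthly sample evenly across sampling days.
    21 standard work days yields 8 cases per day; adding weekend
    sampling days reduces the daily pull so the monthly total is
    not substantially exceeded."""
    return MONTHLY_SAMPLE_SIZE // sampling_days

print(daily_sample_size())    # 8
print(daily_sample_size(26))  # 6  (weekends included)
```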

3.0.275.5.2.1  (12-01-2003)
Deposit Error Rate Inventory Control

  1. Within two days of the sample pull, enter the deposit amount, IRS received date (earliest date the payment was received at any IRS location), and sample date from the DCI into the SMART Database. When the case record is saved, a sequence number will be assigned. Annotate the sequence number at the top of the document or paper DCI. The case record will be held in suspense status (Case Status code "S" in the SMART database) pending final case review and input.

  2. File sample cases by sample date and DCI number until the payment transaction posts and the case file can be reviewed.

3.0.275.5.3  (10-01-2010)
Case Review Instructions for Deposit Error Rate

  1. You may choose to begin your initial review (including a print of IDRS) when creating the DCI. One to two weeks later, review IDRS to determine if the payment was applied and posted correctly using CFOL command codes. Remittance Transaction Research (RTR) may also be utilized. If the payment is found on the correct account with no errors, close the case as perfect.

    Note:

    Beginning with cases sampled October 1, 2005, the following two changes have been approved for coding deposit error rate cases:
    IRS Received Date
    If a payment posted with an incorrect IRS Received Date, and the correct and incorrect dates are both timely and not greater than five calendar days apart, do not code a defect. If, during a case review, it is obvious that the resulting incorrect date will cause a negative taxpayer impact, code a defect even if it fits the above criteria. Written feedback must be supplied to the Operation involved, and a feedback comment entered on the Data Collection Instrument (DCI). Please note: when considering timeliness in the preceding statement, grace periods of seven days from the return due date of an original return or extension should be considered, as should the grace period from the 23C date of a notice if the account status is 19, 20, 21, 56, or 58. See IRM 21.5.1.4.2.5, Account Resolution - General Adjustments, Received Date - Grace Periods, to determine the grace period if the balance due is less than or more than $100,000.00.

    Unpostables
    At the time the case is being coded (approximately two weeks from the sample date), if a remittance has gone Unpostable, wait for the resolution. If the payment posts to Master File correctly, and there is no negative taxpayer impact, do not code a defect. Written feedback must be supplied to the Operation involved, and a feedback comment entered on the DCI. If the payment posts to Master File incorrectly, code a defect, capturing both the original Operation and Unpostables. If, after communicating with the Unpostables Unit, the case has not been resolved in time to enter the data into SMART, code a defect to the original Operation if an internal error has been made.

    Transaction Code (TC) 570
    If, during your review, the tax module shows the payment and adjustment posting in the same cycle and no TC 570 is shown, due to an even balance or balance due, do not charge an error. In these types of cases the computer drops the TC 570 from posting.

  2. IRM 3.8.45, Manual Deposit Process, and IRM 3.8.44, Campus Deposit Activity, are the major sources of processing procedures for remittances. In addition, check with the functional areas in the Receipt and Control Operation for any local procedures and additional processing guidelines. However, review the Accuracy of the sample case based solely on IRM guidelines (IRM 3.0.275.1.9).

  3. The Extraction Function in the Receipt and Control Operation performs the initial sort into "perfect" and "imperfect" remittance categories. You will determine the payment processing data for coding purposes based on the information contained in IRM 3.10.72, Extracting, Sorting, and Numbering.

  4. You may need to research the General Fund Remittance Recap Report (RRPS) by deposit date to verify that the payment was applied to the correct Treasury Account Symbol (TAS). See IRM 3.17.63, Interim Revenue Accounting Control System, for a complete listing of Treasury Account Symbols.

  5. Thoroughly examine the copy of the remittance and all associated posting documents to determine the Accuracy of processing each sample case. Base your identification of processing errors on the taxpayer's intent and IRM procedural instructions.

    Note:

    When evaluating the transaction code....
    If your sample does not include a copy of a return or an indication that there was a return, AND it is not obvious there would be a negative impact on the taxpayer, consider either a TC 610 or a TC 670 as being correct.

  6. You may also identify and record taxpayer errors that contribute to misapplied payments. However, these errors will not be used to calculate your SPC's error rate. If a sample case contains both IRS and taxpayer errors, always code IRS errors first. Code the DCI for all errors you identified during case review.

  7. Deposit Error Rate Site Coordinators must develop local procedures for Improvement Team access to research records that are maintained in the Accounting functions.

  8. See Exhibit 3.0.275-5 for a series of "Frequently Asked Questions" about the Deposit Error Rate review process.

3.0.275.5.3.1  (10-01-2011)
Deposit Error Rate Review of Revenue Receipts

  1. Remittances received and deposited into the Revenue Receipts fund consist of the following items:

    • Internal Revenue taxes, penalties, interest and costs, assessed or assessable against taxpayers.

    • Payments on accepted Offers-in-Compromise.

    • Payment of court fines, court costs, forfeitures, and penalties incident to or imposed for violation of Internal Revenue laws, and receipts from the redemption of property acquired by the government.

    • Receipts from consummated sales of acquired property.

  2. Access IDRS and use the following Command Codes to determine the posting of each sample case.

    Research Command Codes for TIN
    If... Then...
    No Taxpayer Identification Number (TIN) is present, Use Command Codes NAMEI, NAMEB, NAMEE, or NAMES to obtain the TIN
    TIN is found, Use any of the following CCs to locate the payments: IMFOLI, IMFOLP, BMFOLI, BMFOLP and TXMODA. If payment found, go to "Payment is found" in the "IF" column.

    Note:

    IMFOLP and BMFOLP display payment transactions within a specified date range. The input format is IMFOLP000-00-0000 YYYYMM YYYYMM or BMFOLP00-0000000 YYYYMM YYYYMM

    TIN is not found, Verify if the case was forwarded to the Unidentified Function.
    TIN is present and no intended tax period can be determined, Use Command Codes BMFOLI, IMFOLI, TXMOD and SUMRY to locate debit modules. Research all modules to locate the payment.
    Payment applied to earliest Collection Statute Expiration Date (CSED), Go to, "Payment is Found" in the "IF" column.
    Payment not applied to earliest CSED, Go to, "Payment is found" in the "IF" column.
    TIN and intended tax period can be determined by remittance and/or posting document, Use Command Codes BMFOLT, IMFOLT, SUMRY, and TXMODA to verify the Accuracy of the payment posting information. Go to, "Payment is found" in the "IF" column.
    Payment is found, Determine taxpayer intent and verify Accuracy of payment posting information (correct account, MFT, tax period, transaction code, etc.).
    Payment posting information is correct, Code the DCI and file the case.
    Payment posting information is incorrect, Code the DCI, prepare the Communication Record and route the case to the appropriate area.
    Payment is not found, (i.e., payment has not posted to Master File), Research RTR Subsystem and use Command Codes UPTIN, URINQ, and SCFTR to search for the payment.
    Payment found on RTR Subsystem, Go to "Payment is found" in the "IF" column.
    Payment found on CC UPTIN, Review IRM 3.12.32, General Unpostables to determine the cause of the unpostable condition and go to, "Payment is found" in the "IF" column.
    Payment found on URINQ, Unidentified (URF) payment, Review to determine if IRM 3.17.10, Dishonored Check File (DCF) and Unidentified Remittance File (URF) and IRM 3.8.44, Campus Deposit Activity were followed for accurate placement in the URF account and go to, "Payment is found" in the "IF" column.
    Payment found on SCFTR, Review the appropriate IRMs to determine the cause and go to, "Payment is found" in the "IF" column.
    Payment not found on RTR Subsystem, UPTIN, URINQ or SCFTR, Give the case to the Deposit Coordinator, or the Receipt and Control manager, to locate the payment. If the payment is not found, and all research has been exhausted, delete the case from the sample.

3.0.275.5.3.2  (10-01-2008)
Deposit Error Rate Review of Automated Non-Master File (ANMF) Payments

  1. Access the ANMF system to determine the posting of the NMF payment. NMF accounts generally have Social Security Numbers (SSNs) or Employer Identification Numbers (EINs) ending with the letter "N" .

  2. To review NMF payments you must access the ANMF system. See Exhibit 3.0.275-6. for types of NMF accounts.

    Payment Posting
    If... Then...
    Payment is found, Determine the taxpayer's intent and verify the Accuracy of the payment posting information, such as, correct: Taxpayer's account, MFT Code, Tax Period, and Transaction Code.
    Payment posting information correct, Code the DCI and file the case.
    Payment posting information incorrect, Code the DCI, prepare the Communication Record and route the case to the appropriate area.
    Payment is not found, Request research of the ANMF Unpostable listing. Access IDRS for possible misapplication of the payment to a Master File account.
    Payment not found on ANMF unpostable listings, Route the case to Hard Core Payment Tracer Function and delete the case from the sample.

3.0.275.5.3.3  (10-01-2008)
Deposit Error Rate Review of General Fund Payments

  1. Examples of General Fund Payments are user fees, photocopy fees, conscience fund payments, and payments to the public debt.

  2. Access RTR to verify payment processing data and to determine the payment deposit date. Research the RRPS General Fund Remittance Recap Report by deposit date to verify that the payment was applied to the correct Treasury Account Symbol (TAS). See IRM 3.17.63.2.6, Accounting and Data Control, Redesign Revenue Accounting Control System, for a complete listing of Treasury Account Symbols.

    Note:

    Employee Plans/Exempt Organization (EP/EO) User Fee payments can be verified solely by the payment DLN. Access RTR to verify the payment data. A User Fee DLN can be recognized as Document Code 57 with Blocking Series 50X. Installment Agreement User Fees are also General Fund, but can be verified on the Master File under MFT 55 for IMF and MFT 13 for BMF.

3.0.275.5.3.4  (12-01-2004)
Deposit Error Rate Review of Deposit Fund Payments

  1. Examples of Deposit Fund payments are Offer in Compromise and Sale of Seized Property.

  2. Research the RRPS Deposit Fund Remittance Recap Report following the procedures in IRM 3.0.275.5.3.3.

  3. Follow the procedures in Deposit Error Rate Review of Revenue Receipts. See IRM 3.0.275.5.3.1.

3.0.275.5.3.5  (12-01-2003)
Deposit Error Rate Review of Refund Repayments

  1. Remittances received and deposited into the Repayments to Refund Appropriations funds are:

    • Recoveries of erroneous, duplicate, or overpaid refunds

    • Fraudulently negotiated checks

  2. Access IDRS to verify proper application of the payment. Refund repayments post with Transaction Code 720 and Document Code 45.

  3. Follow the procedures for Revenue Receipts payments. See IRM 3.0.275.5.3.1.

3.0.275.5.4  (10-01-2012)
Deposit Error Rate Measure Data Input Instructions

  1. To open the Home Page for Deposits, go to: http://balmeas.enterprise.irs.gov/DepError/DepErrorHome.asp

    Note:

    Enter the information based upon the taxpayer's intent.

  2. Select the "DCI input form" action from the main Deposit Error Rate menu and enter "search record" and the DCI number. Enter the information from either the paper DCI or case in the corresponding fields of the database screen. Edit any of the three original input fields as needed.

  3. Enter the following case identification fields for every sample case:

    1. DCI number: Generated by the database from the Inventory Control input.

    2. Campus: Generated

    3. Site: This field will generate from your user login.

    4. Case Type: Generated by the database from the Inventory Control input.

    5. Review Date: Generated by the database from the Inventory Control input.

    6. Reviewer ID (IDRS Profile #): Initial entry of IDRS Employee Number is required. The field will generate on second and subsequent case records.

    7. Sample Date: "Verify" the date the sample was pulled. If incorrect, correct.

    8. Deposit Amount: "Verify" the amount of the remittance. If incorrect, correct.

    9. DLN: Enter the 14-digit payment DLN shown on the source document, without dashes.

    10. Deposit Date: Generated from the Julian Date in the DLN.

    11. IRS Received Date: Enter the earliest date the payment was received at any IRS location. Refer to IRM 3.8.44, Campus Deposit Activity. After review, see the note in IRM 3.0.275.5.3(1) before finalizing.

    12. Negative Taxpayer Impact: All accurate cases must be coded "N" for no. This is the default if no entry is made. On all inaccurate cases, determine if the error, had it not been pulled as part of the sample, would have caused a negative impact to the taxpayer. If so, code "Y" for yes. This would include, but is not limited to, an unnecessary notice or letter, incorrect penalty and/or interest assessed, an erroneous or delayed refund, or anything else that would give the taxpayer incorrect information or cause them to contact the IRS when they normally would not have. Correction of the error during or after the coding process should not change this determination.

    13. Master File: From the drop down menu, select the Master File to which the payment was posted. Refer to Document 6209, IRS Processing Codes and Information, for further explanation.

      Deposit Error Rate Master File Entries
      IMF
      BMF
      IRAF (Individual Retirement Account File)
      NMF
      URF
      Other
    14. MFT: From the drop down menu, select the MFT code used to post the payment. Refer to Document 6209, IRS Processing Codes and Information, for further explanation.

    15. Tax Period: Enter using YYYYMM format. Enter "000000" for URF, General, and Deposit Fund cases.

    16. Deposit Transaction Code: From the drop down menu, select the Transaction Code used to post the payment. Enter "000" for URF, General, and Deposit Fund cases.

      Note:

      When evaluating the transaction code...
      If your sample does not include a copy of a return or an indication that there was a return, AND it is not obvious there would be a negative impact on the taxpayer, consider either a TC 610 or a TC 670 as being correct.

    17. Remittance Posting Cycle: Enter the cycle the payment posted using YYYYCC format.

      Remittance Posting Cycle Entries
      Unpostable: Enter the cycle the case unposted
      Master File: Enter the payment posting cycle
      Deposit/General Fund and URF: Enter the deposit date cycle
    18. Time (HH:MM): Reserved

    19. Type of Posting Document: Select one of the following from the drop down menu.

      Deposit Error Rate Type of Posting Document Entries
      Return
      Voucher
      Bill
      Envelope
      IRS Letter
      Extension
      Check Only
      Taxpayer Letter
      Form 8109
      Other
    20. Postmark Date: Enter the postmark date on the envelope (when available) using MM/DD/YY format.

    21. Payment Processing Data: Select "Perfect" or "Imperfect" from the drop down menu. Base your entry on IRM 3.10.72, Receiving, Extracting, and Sorting.

    22. Deposit Method: Refer to IRM 3.8.44, Campus Deposit Activity and IRM 3.8.45, Manual Deposit Process, for identification of exception processing. Enter one of the following from the drop down menu.

      Deposit Error Rate Deposit Method Entries
      Command Code PAYMT
      Exception Processing
      RRPS
    23. Origin of Payment: Enter one of the following from the drop down menu: D.O., Lockbox, or Site.

    24. Payment Category: Refer to IRM 3.17.63, Redesign Revenue Accounting Control System. Enter one of the following from the drop down menu.

      Payment Category Entries
      Revenue Receipts
      Deposit Fund
      General Fund
      Refund Repayment
      Split Payment
      Multiple Payment
    25. Accurate: Enter "Y" if you found no errors during case review. Enter "N" if you found errors.

      Note:

      If the Accurate field is coded "Y," stop here. If Accurate is coded "N," continue.

    26. Internal Error: Enter "Y" if you found IRS errors during case review. Enter "N" if you found a taxpayer or field office error. "N" coded records will not count against the center error rate.

      Note:

      Reviewed: Not a data entry field. Upon initial case record entry, this field will display "N." It will automatically change to a "Y" two calendar months from the review date. A "Y" in this field disables the ability to edit the case record.

      Note:

      Case Status: Not a data entry field. Upon initial case record entry, this field will display "S." It will automatically change to a "C" after the edit function has been completed.

  4. Enter the following error identification fields only if you have identified an error on the sample case:

    1. Who Code: Identify where the error occurred by selecting the appropriate Who Code from the drop down menu. For a list of valid Who Codes, see Exhibit 3.0.275-7.

      Note:

      Who Code 7 (Data Conversion or Receipt and Control) is to be used when, after research and managerial approval, the area responsible cannot be determined. This Who Code is not restricted to received date defects.

    2. What Code: Enter a defect description by selecting the appropriate What Code from the drop down menu. For a list of valid What Codes, see Exhibit 3.0.275-7.

      Note:

      There are three Who/What combinations available for each case record.

    3. Comments: Use this field to record any additional information or comments relating to the review. An entry in this field is required when Who Codes 1 or 5 are used.

      Note:

      Do not enter sensitive taxpayer data in the Comments field. Never enter a TIN, and do not enter a taxpayer's full name or address.

      Note:

      Beginning immediately, if your sample is either an Automated Collection System (ACS) or Offer in Compromise (OIC) deposit, identify it as such in the Comments field of the DCI. If you put either ACS or OIC as the first three characters in Comments, it will be easier to identify in a query.

  5. When all of the case information is entered and verified, select the "Update" button.

  6. When the "Update" button is pressed, the DCI record will undergo validity checks. If the record fails any validity check, you will see an error message and your cursor will be placed in the invalid field. Correct your entry and select the "Update" button again. This process will continue until all validity checks are passed.

  7. When a valid DCI is submitted to the database, the status code of the case record will change to "Closed" (C).

3.0.275.5.5  (10-01-2014)
Reports for Deposit Error Rate

  1. Two sets of reports are available for Deposit Error Rate: National and Service Center.

  2. The National Reports are:

    1. Non-Lockbox:

      Combined - Period and Cumulative Combined

      Deposit Error Roll-up Report

      By Service Center - Period and Cumulative Data

      Grouped by IMF/BMF - Period and Cumulative Data

  3. The Service Center Reports are:

    1. Raw Error Rate Quality Rpt by Review Dt

    2. Raw Error Rate Quality Rpt by Sample Dt

    3. Top 5 What Codes Rpt by Sample Date

    4. Top 5 What Codes Rpt by Review Date

    5. Top 5 Who Codes Rpt by Sample Date

    6. Top 5 Who Codes Rpt by Review Date

    7. Who/What Rpt by Sample Date

    8. Who/What Rpt by Sample Date (with cases)

    9. Who/What Rpt by Review Date (with cases)

    10. Who/What Rpt with Comments

    11. List Cases by Sample Date

    12. Adhoc Summary Report (consolidated summaries)

3.0.275.5.5.1  (10-01-2014)
Deposit Error Rate Raw Error Rate Quality Report

  1. These reports provide the total errors and accuracy rate by review date or sample date and for a selected range of dates. The Master File categories are IMF, BMF and URF. The report displays the Master File, number of cases reviewed, total number in error, internal error count, error rate, and accuracy rate. The volume of errors determines the quality rates.

    Note:

    A remittance that is corrected by Unpostables, and creates no negative taxpayer impact, will be coded as an accurate case. Provide detailed and timely feedback to the remittance processing functional area that created the unpostable condition. This situation must be documented in the comments field on the SMART database.
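The error and accuracy rates on this report follow directly from the counts it displays. A minimal sketch, assuming the rate is the internal error count divided by the number of cases reviewed:

```python
def raw_rates(cases_reviewed: int, internal_errors: int) -> tuple:
    """Compute the error rate and accuracy rate as percentages,
    assuming the rate is the internal error count over the number
    of cases reviewed (an assumption of this sketch)."""
    if cases_reviewed == 0:
        return (0.0, 100.0)
    error_rate = 100.0 * internal_errors / cases_reviewed
    return (error_rate, 100.0 - error_rate)

# A month of 168 reviewed cases with 7 internal errors.
error_rate, accuracy_rate = raw_rates(168, 7)
print(round(error_rate, 2), round(accuracy_rate, 2))  # 4.17 95.83
```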

3.0.275.5.5.2  (10-01-2014)
Deposit Error Rate Top 5 What and Top 5 Who Codes Report

  1. These reports provide a list of the Top 5 What Codes or the Top 5 Who Codes, the number of errors, and the percentage of errors for a selected range of dates. The data appears in a table format.

3.0.275.5.5.3  (10-01-2014)
Deposit Error Rate Summary Reports

  1. These reports enable the user to query information by entering the desired selection into a variety of parameters. The information may be selected by month, fiscal year, or a specific date range. A date type of either the sample date, review date, IRS received date, deposit date, or postmark date is selected. The user determines which data is to be retrieved by the query. The parameters include, but are not limited to, Case Status, Accurate, Internal Error, and Site.

3.0.275.6  (12-01-2003)
Refund Timeliness and Error Rate Measures

  1. The Refund Timeliness and Error Rate (RT&E) Business Results Measure program provides us with the ability to assess the timeliness and Accuracy of refunds that are generated from current year paper IMF returns.

    Note:

    E-file returns are not included in this measure.

3.0.275.6.1  (03-09-2012)
Definition of Refund Timeliness

  1. Refund Timeliness is defined as the percentage of refunds from current year Form 1040 family paper returns issued in 40 days or less.

  2. Timeliness is measured using the number of days from two days prior to the day the taxpayer's return is received through the day the taxpayer receives their refund. Since timeliness is measured from the taxpayer's perspective, we subtract two days from the IRS received date to approximate the date the return was mailed. For paper refund checks, we assume the taxpayer receives their refund the day after the Refund Pay Date shown on the taxpayer's Master File account; for direct deposit refunds, the taxpayer receives their refund on the Refund Pay Date. For paper checks, the Refund Pay Date is 6 business days after the return posts to Master File; for direct deposit, it is 4 business days after the return posts to Master File.
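The timeliness computation described above can be sketched as follows. The business-day arithmetic here ignores federal holidays, so treat this as an approximation of the measure, not the official calculation:

```python
from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    """Advance a date by the given number of business days (Mon-Fri).
    Federal holidays are ignored in this sketch."""
    current = start
    while days > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 .. Friday=4
            days -= 1
    return current

def refund_is_timely(irs_received: date, posting_date: date,
                     direct_deposit: bool) -> bool:
    """Apply the 40-day refund timeliness test. The 4- and 6-business-day
    offsets come from the IRM text; the holiday handling is simplified."""
    mailed = irs_received - timedelta(days=2)  # taxpayer's perspective
    if direct_deposit:
        # Refund received on the Refund Pay Date.
        received_refund = add_business_days(posting_date, 4)
    else:
        # Paper check: received the day after the Refund Pay Date.
        received_refund = add_business_days(posting_date, 6) + timedelta(days=1)
    return (received_refund - mailed).days <= 40

# Return received April 1, 2014, posting April 28, direct deposit:
print(refund_is_timely(date(2014, 4, 1), date(2014, 4, 28), True))  # True
```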

