13.5.1  TAS Balanced Measure System  (10-01-2001)

  1. As part of the IRS' modernized organization, TAS has incorporated balanced measures into its procedures.

The Balanced Measurement System  (10-01-2001)

  1. In 1998, IRS developed a plan for Modernization that consisted of five "levers of change" to modernize the IRS and support the new mission. One "lever" involves implementing a system of balanced measures to assist in measuring and improving organization performance.

  2. The IRS developed its balanced measurement system in an effort to modernize the IRS and to reflect the IRS' priorities as articulated in the IRS mission statement. This system does the following:

    1. supports the organizational mission and goals;

    2. communicates priorities;

    3. meets external reporting requirements;

    4. provides a balanced approach to measuring performance at each level of the organization; and

    5. expands performance measurement to include more than operational financial results.

  3. To ensure balance, this system measures performance in three areas:

    1. customer satisfaction;

    2. employee satisfaction; and

    3. business results.

  4. Each of these measurement areas is carefully considered when setting organizational objectives, establishing goals, assessing progress and results, and evaluating individual performance.

  5. TAS uses balanced measures to assess program effectiveness and service delivery.

Customer Satisfaction Measures  (10-01-2001)

  1. The goal of customer satisfaction, one part of the balanced measurement system, is to continually improve the IRS' ability to provide accurate and professional service to internal and external customers in a courteous and timely manner.

  2. IRS determines customer satisfaction goals and accomplishments on the basis of customer feedback collected via questionnaires, surveys, and other types of information gathering mechanisms.

  3. The balanced measurement system will help answer key questions regarding customer satisfaction, including:

    1. the general level of customer satisfaction in specific IRS transactions;

    2. the frequency and reason for customer dissatisfaction; and

    3. taxpayers' perceptions and expectations of the IRS.

Employee Satisfaction Measures  (10-01-2001)

  1. The goal of employee satisfaction, the second part of the balanced measurement system, is to provide a work environment that enables employees to achieve organizational goals.

  2. Measures of employee satisfaction assess progress toward the "quality work environment" goal.

  3. The balanced measurement system will help answer key questions regarding IRS employee satisfaction including:

    1. the general level of employee satisfaction;

    2. employees' perceptions of the effectiveness of various levels of management; and

    3. employees' perception of whether the IRS provides an appropriate work environment.

Business Results Measures — Quantity and Quality  (10-01-2001)

  1. The goal of business results is to generate a productive quantity of work in a quality manner and to provide meaningful outreach to customers. This includes delivering a volume of work within mixed categories and dedicating appropriate resources to provide taxpayer education, outreach, and advocacy.

  2. The business results measures consist of numerical results determined under the "quantity" and "quality" elements.

  3. The goal involves more than achieving a target or predetermined quality and quantity results. It addresses key questions regarding business results including:

    1. Did we assess/collect the proper tax?

    2. Did we provide quality customer service?

    3. Did we use our resources in accordance with the plan?

    4. Did we provide education and outreach/advocacy to our customers?

  4. Each balanced measure element represents an important aspect of the organization's goals and is of equal importance in carrying out the IRS' programs and functions.

  5. The frequency of measures data availability may vary across the three elements because the data from each element changes at different intervals and/or requires more time for data collection. However, differences in the frequency of data availability do not reflect differences in priority of the balanced measurement elements.

Using Balanced Measures  (10-01-2001)

  1. The balanced measurement system changed the way the IRS uses measures. In the past IRS used measures in ways that produced unintended consequences, such as ranking business units or functions on an index of measures.

  2. The emphasis of the new system is on understanding why measured data have changed and what actions IRS took or should have taken to influence results.

  3. We are not managing programs to achieve numbers. We are managing processes and people to achieve the IRS mission.

  4. Quality tools such as flow charts, Pareto charts, and root cause analysis prove valuable in the fact-finding process.

  5. You can find information about the appropriate use of measures, including the definition of records of tax enforcement results (ROTERs), goal setting, use of measures in evaluations, and sharing data, in IRM 105.4, Managing Statistics in a Balanced Measurement System.

  6. All managers must ensure strict adherence to this guidance.

Data Analysis  (10-01-2001)

  1. Businesses often refer to the process of analyzing business results or understanding the changes in measures as "getting behind the numbers." There is no prescribed method for this, although the following guides may be of assistance:

    1. The President's Quality Award (PQA) Criteria for Excellence is an excellent way of evaluating organization success and developing opportunities for improvement based on comparison with "best in class" activity.

    2. The Management Analysis Resource Guide, based on the Management Model, used in conjunction with diagnostic tools, helps managers understand activities they should be doing on a regular basis in order to "get behind the numbers."

  2. Typically, we initiate the analytical process by looking at:

    1. the balanced measures data and the diagnostic tools data;

    2. those more closely involved in the process; and

    3. the actions that impacted the results.


    Example: A change in the TAS case cycle time measure may be due to different methods of case progression or to equipment problems.

  3. It may not be easy to discern the root cause of the change by looking at data alone. An analysis of the facts and a brief conversation with individuals close to the process may quickly bring this to light.

  4. You may find detailed information on the balanced measures system in IRM 105.4, Managing Statistics in a Balanced Measurement System, Chapter 2.

Diagnostic Tools  (10-01-2001)

  1. The IRS collects a great deal of additional information about programs and services, some of which served as performance measures in the past.

  2. Under the balanced measurement system, a business unit or organization may only use the approved set of balanced measures — both strategic and operational — to measure its performance.

  3. IRS refers to indicators that are not designated as balanced measures as "diagnostic tools." Use diagnostic tools to analyze the factors that affect changes in the balanced measures' performance and to "get behind the numbers."

  4. Using diagnostic tools provides you with a mechanism to analyze factors that influence performance and encourages dialogue about specific actions that managers may take to improve customer satisfaction, employee satisfaction, and business results.

  5. Do not set goals or targets for diagnostic tools and do not use them in individual performance evaluations.

  6. Diagnostic tools include any type of data that is helpful in understanding what influences and impacts balanced measures. It is permissible to use ROTERs as diagnostic tools.

Reference Guide for Diagnostic Tools  (10-01-2001)

  1. The Reference Guide for Diagnostic Tools serves as a managerial resource to help you change the way you review statistics and manage data. Although originally written for previous IRS functions, the overall framework for diagnostic tools translates to the new IRS organization.

  2. The guide will:

    1. Encourage dialogue by providing a mechanism to help managers analyze the data elements/diagnostic tools that are "behind the numbers" and to focus on management actions based on that context;

    2. Aid management by helping diagnose outcomes to get "behind the numbers" and to analyze indicators/data elements and actions taken to address the problems;

    3. Assist management of organizations or business units to focus on and understand the underlying factors that affect the key set of balanced measures (e.g., to help separate controllable and external factors);

    4. Provide information to allow revision of future plans; and

    5. Define topic parameters for operations/business review by setting expectations, eliminating surprises, and ensuring the ability to prepare.

  3. Some TAS examples of diagnostic tools are listed in Exhibit 13.5.1-1, TAS Diagnostic Tools.

Proper Use of Diagnostic Tools  (10-01-2001)

  1. Use diagnostic tools to understand underlying factors that cause changes in balanced measures.

  2. Do not use diagnostic tools as organization performance measures.

  3. Do not set goals or targets for diagnostic tools.

  4. Do not use these tools as an individual performance measure (i.e., in evaluating an individual employee's performance).

  5. Do not use diagnostic tools as a "gotcha" in performance reviews.

  6. Do not compare the results of diagnostic tools when evaluating different offices or units.

Business Performance Review System  (10-01-2001)

  1. The Business Performance Review System (BPRS) represents a fundamental change in the way IRS reviews and manages its operations.

  2. The system enables a periodic review of strategic and operations issues and business unit performance and facilitates the assessment of the IRS' progress in achieving its mission and strategic goals.

  3. The BPRS establishes a framework for measuring, reporting, and reviewing a business unit's performance against plans established within the Strategic Planning and Budget process.

  4. During this process each business unit identifies, defines, and tracks the essential elements of its performance.

  5. IRS uses BPRS meetings to assess all business units' progress toward organizational goals and to identify crosscutting issues in a timely manner.

  6. The BPRS provides the opportunity to identify areas of the plan where revisions are necessary, which then can be incorporated into the next strategic planning and budget cycle.

Taxpayer Advocate Service Balanced Measures  (10-01-2001)

  1. The National Taxpayer Advocate (NTA) approved ten measures. The text below describes each measure in detail.

  2. The measures are broken down into the categories mentioned above — customer satisfaction, employee satisfaction, and business results (quantity and quality).

  3. The measures within each category relate to either casework or advocacy. See Exhibit 13.5.1-2, TAS Balanced Measures, for a chart of TAS' ten measures and this relationship.

Casework Quality Index  (10-01-2001)

  1. The casework quality index is one of the quality components of TAS' business results measures.

  2. This is a numerical indicator of the extent to which TAS casework meets the standards prescribed.

  3. These results are indicators of quality and are used to identify and correct problems that require changes in areas such as procedures and training as well as to identify best practices or systemic problems.

  4. The results of these reviews are for use by the National Office as well as Area Taxpayer Advocates (ATAs), Local Taxpayer Advocates (LTAs), and quality analysts only. Do not share specific case review results (i.e., quality review checksheets and comments sheets) with TAS group managers or Associate Advocates (AAs)/Senior Associate Advocates (SAAs). See Exhibit 13.5.1-3, Sharing Quality Review Results, for more information about sharing quality review statistics and case results.

  5. The information in the following text describes TAS' responsibilities for this measure, the procedures for calculating the index, and other features of this balanced measure.

Responsibilities for TAS' Quality Review Program  (10-01-2001)

  1. The Director, Program Planning and Review, in the NTA's office, the ATA, and LTA are responsible for the quality review program in their respective offices.

  2. The information in the sections below describes the responsibilities in detail.

National Responsibilities  (10-01-2001)

  1. The Director, Program Planning and Review at the National Office is responsible for the centralized quality review program for TAS.

  2. That person is responsible for:

    1. ensuring that the review sites review and document the results of the monthly quality samples sent from the field TAS offices and transmit the quality review database records to the appropriate field sites as well as National Office;

    2. maintaining and revising, as appropriate, the TAS quality review database (QRDB);

    3. generating the national monthly and cumulative quality reports and the data sent to the Executive Management Support System (EMSS);

    4. responding to DIALOGUE situations (see IRM, DIALOGUE Process) which the field or review sites elevate to the National Office and disseminating the results to the area TAS offices and review sites;

    5. publishing a quarterly quality newsletter through the appropriate TAS communication channels; and

    6. providing an annual report on quality consisting of information from area offices.

Area Responsibilities  (10-01-2001)

  1. Each ATA is responsible for the centralized quality review program for TAS in his/her area.

  2. The ATA is responsible for:

    1. ensuring that the monthly quality samples are sent from the area's field TAS offices to the appropriate review site;

    2. maintaining the area's TAS quality review database (QRDB);

    3. generating the area's monthly and cumulative quality reports;

    4. being the first level of review for DIALOGUE situations from the field offices in that area (see IRM, DIALOGUE Process), elevating to the National Office any DIALOGUE situations for which there is a disagreement;

    5. disseminating the results of nationally elevated DIALOGUE results to the field TAS offices in that area;

    6. analyzing quality review data for the area and offices within that area to identify trends, procedures needing improvement, training needs, systemic problems, and best practices;

    7. using the analytical results to improve quality in the area's local offices/campuses (e.g., share best practices, set up area training classes, work with offices on specific problem areas, etc.);

    8. sharing systemic analysis results with the Operating Division Taxpayer Advocate (ODTA) for the appropriate business unit;

    9. providing input to the quarterly quality newsletter published by the National Office; and

    10. providing input (i.e., quality analysis) to the National Office for an annual report on quality.

Local Office/Campus Responsibilities  (10-01-2001)

  1. The LTA at each local office/campus is responsible for the centralized quality review program for TAS in his/her office.

  2. The LTA is responsible for:

    1. ensuring that the monthly quality samples are sent from the TAS offices to the appropriate review site;

    2. maintaining the office's TAS quality review database (QRDB);

    3. generating the office's monthly and cumulative quality reports;

    4. submitting questionable situations to the appropriate review site via the DIALOGUE process, keeping the area quality analyst informed, and elevating any discrepancies to the National Office (see IRM, DIALOGUE Process);

    5. disseminating clarifications in TAS procedures to the field TAS managers and caseworkers in the office;

    6. analyzing quality review data for the office to identify trends, procedures needing improvement, training needs, systemic problems, and best practices;

    7. using the analytical results to improve quality in the local office/campus (e.g., share best practices, set up training classes, work with managers and AAs/SAAs on specific problem areas, etc.);

    8. ensuring cases that should be reopened are correctly resolved;

    9. sharing systemic analysis results with the area and ODTA for the appropriate business unit;

    10. providing input to the area for the quarterly quality newsletter published by the National Office; and

    11. providing input (i.e., quality analysis) to the area for the national annual report on quality.

Monthly Quality Random Samples  (10-01-2001)

  1. At the end of each month the local offices and campuses must select a random sample of their regular and reopen criteria cases. See Exhibit 13.5.1-4, Quality Sample Sizes for Centralized Quality Review, for sample sizes by office.

  2. Until TAMIS has a new quality sampling report, use a Vision Query (a/k/a Intelligent Query (VQ)) procedure to list the cases eligible for sampling.

  3. See Exhibit 13.5.1-5, Selecting Cases for the Centralized Quality Review Sample, for procedures on selecting the sample.
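The selection step described above amounts to a simple random draw from the VQ listing of eligible cases. The sketch below is a hypothetical illustration in Python; the function name and case numbers are invented, and the IRM prescribes the sample sizes (Exhibit 13.5.1-4), not any particular software:

```python
import random

def select_quality_sample(eligible_case_ids, sample_size, seed=None):
    """Draw a simple random sample of cases for centralized quality review.

    eligible_case_ids: case numbers listed via the Vision Query procedure.
    sample_size: the office's size from Exhibit 13.5.1-4 (hypothetical here).
    """
    rng = random.Random(seed)  # seed is only for a reproducible illustration
    if sample_size >= len(eligible_case_ids):
        # Fewer eligible cases than the sample size: send them all.
        return list(eligible_case_ids)
    return rng.sample(eligible_case_ids, sample_size)

# Hypothetical month-end run for an office with a sample size of 3.
cases = ["1234567", "2345678", "3456789", "4567890", "5678901"]
sample = select_quality_sample(cases, 3, seed=42)
print(sample)
```

Any mechanism that gives every eligible case an equal chance of selection satisfies the intent of a random sample.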

  4. Send the samples by the scheduled due date to the Oakland review site for local offices or to the Brookhaven review site for campuses. The National Office publishes a quality review schedule one month prior to the beginning of each fiscal year.

  5. See Exhibit 13.5.1-6, Documentation Required for TAS Cases, for documentation that must be in each case file.

  6. The addresses for the review sites are:

    1. Oakland — Internal Revenue Service, Quality Assurance Division/TAS Review, Attn: Anita Kitson, 1301 Clay Street, Suite 800S, Oakland, CA 94612.

    2. Brookhaven — Internal Revenue Service, PAS — Stop 110, 1040 Waverly Avenue, Holtsville, NY 11742.

TAS Quality Standards  (10-01-2001)

  1. TAS derives its quality index from the monthly quality reviews of local offices' and campuses' monthly randomly sampled cases.

  2. The reviewers at the two sites review the TAS cases following the guidelines in the TAS Quality Standards and Review Guide. Basically, the quality standards for the quality index reflect the TAS casework procedures required in this IRM.

  3. There are eight quality standards. The casework quality index is the result of the review of these standards. Since this business measure is for overall TAS casework, reviewers look at the case in its entirety and do not differentiate between offices when the case involves transfers from another office.

  4. Each standard carries a particular number of points (shown in parentheses below). The maximum total of applicable points for a case is 100. Not all standards are applicable in every case. You may find a more in-depth description of the standards and their applicability in Exhibit 13.5.1-7, TAS Casework Quality Index (CQI) Standards.

  5. The standards are:

    1. Did TAS make timely initial contact with the taxpayer? (5 points)

    2. Did TAS take initial action/request information within the specified time frames? (10 points)

    3. Did TAS take all subsequent actions timely from the time action could have been taken? (10 points)

    4. Did TAS resolve all taxpayer issues? (25 points)

    5. Did TAS address all related issues? (10 points)

    6. Were all adjustments that impact the taxpayer technically/procedurally correct? (15 points)

    7. Did TAS give the taxpayer a clear, complete, correct explanation at closing? (20 points)

    8. Did TAS educate the taxpayer regarding any of his/her actions that contributed to the problem? (5 points)  (10-01-2001)
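To illustrate how weighted standards could roll up into a numerical index, the sketch below scores hypothetical reviewed cases as earned points over applicable points. This is an assumption-laden illustration only; the TAS Quality Standards and Review Guide governs the actual scoring:

```python
# Point weights for the eight standards (from the list above).
WEIGHTS = {1: 5, 2: 10, 3: 10, 4: 25, 5: 10, 6: 15, 7: 20, 8: 5}

def case_score(results):
    """Score one reviewed case.

    results maps standard number -> True (met) / False (not met);
    standards not applicable to the case are simply omitted.
    Returns (earned points, applicable points) for the case.
    """
    applicable = sum(WEIGHTS[s] for s in results)
    earned = sum(WEIGHTS[s] for s, met in results.items() if met)
    return earned, applicable

def quality_index(reviewed_cases):
    """Aggregate reviewed cases into a percentage index (one plausible method)."""
    earned = applicable = 0
    for results in reviewed_cases:
        e, a = case_score(results)
        earned += e
        applicable += a
    return 100.0 * earned / applicable if applicable else 0.0

# Two hypothetical cases: the first met every applicable standard; the
# second failed standard 6 (adjustments) and had standard 8 not apply.
cases = [
    {1: True, 2: True, 3: True, 4: True, 5: True, 6: True, 7: True, 8: True},
    {1: True, 2: True, 3: True, 4: True, 5: True, 6: False, 7: True},
]
print(round(quality_index(cases), 1))  # → 92.3
```

Note how omitting an inapplicable standard shrinks the denominator rather than penalizing the case, which matches the "not all standards are applicable" rule above.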
TAS Quality Review Database

  1. The TAS Quality Review Database (QRDB) is a personal computer-based database created using Microsoft Access software.

  2. The Taxpayer Advocate office in each local office/campus, area office, operating division, and National Office has a copy of the QRDB that contains the review results (e.g., checksheet, comments, and closure/sample size information) for the particular office.


    Example: A local office's database such as Greensboro's has only the results of a review of Greensboro's monthly random samples. The QRDB for Area 2 — Richmond, of which the Greensboro office is a part, contains the review results for all the local offices within Area 2. The National Office's and the ODTA office's QRDBs contain results from all the offices within the nation.

  3. The National Office transmits an empty QRDB to the area offices and the ODTA office, which in turn send it to the local offices/campuses. A designated administrator in each office (usually the quality analyst) is responsible for loading the monthly results received from its review site.

  4. You, as an analyst, ATA, LTA, or ODTA, may use the QRDB to generate monthly and fiscal year cumulative reports as well as specific queries for data analysis.

DIALOGUE Process  (10-01-2001)
  1. The TAS DIALOGUE is a period of time following a monthly review when local offices and campuses have an opportunity to discuss errors charged on the previous month's reviews.

  2. This time allows an exchange of information to clarify errors, identify best practices, identify training needs for reviewers or TAS employees, identify systemic problems, and ensure procedures are being followed correctly.

  3. Results of the discussion have no impact on the local office or campus's score.

  4. You may elevate any situations that cannot be solved at the review site or area level to National Office. Once decided, National Office disseminates the situation, the TAS office's view, the review site's view, and the national decision to the area and local offices to ensure all TAS employees are following the same case processing guidelines.

  5. See Exhibit 13.5.1-8, DIALOGUE Process, for the steps of the process.

Closed Cases  (10-01-2001)

  1. The text below describes TAS closed cases, one of TAS' business results (quantity) measures.

Definition and Purpose  (10-01-2001)

  1. The volume of regular criteria closures (i.e., first-time TAS contacts meeting one of TAS' criteria 1-7) represents one of TAS' business results (quantity) measures. Its purpose is to measure TAS productivity and effectiveness in identifying, working, and resolving all taxpayer issues and hardships that meet TAS case processing criteria, as expanded by the enactment of the IRS Restructuring and Reform Act of 1998.

  2. In conjunction with other measures (e.g., customer satisfaction, CQI, and outreach), TAS closed cases will afford TAS the opportunity to gauge its performance effectively in ensuring that all taxpayers whose issues meet TAS criteria receive the appropriate TAS assistance, intervention, and case resolution.

Reporting Procedures and Responsibility  (10-01-2001)

  1. Until TAS redesigns its TAMIS reports to capture and track TAS' new performance measures, the Director, Program Planning and Quality, has the responsibility to record TAS closed case performance for all offices and to convey performance results for the Commissioner's monthly report and to EMSS.

  2. Vision Query (a/k/a Intelligent Query) is the vehicle to capture closed case performance results.

  3. TAS will share with each area office the national composite results, specific area composite performance, and performance results for those offices within the specific area.

  4. Each area office will convey to its local offices the national composite performance, the area's composite performance, and the individual performance specific to the local offices/campuses under the area's jurisdiction.

  5. The implementation of the TAMIS reports re-design will provide for the systemic capture and recordation of closed case performance results and the electronic transmission to EMSS.

  6. Each Advocate office will also have TAMIS access to the closed case performance results it is entitled to view, within the parameters and guidelines of the IRS' policy of managing statistics in a balanced measures environment (see IRM 105.4, Managing Statistics in a Balanced Measurement System).

Access Restrictions to Closed Case Statistical Data  (10-01-2001)

  1. IRS restricts access to closed case statistical data, as well as access to other statistical performance results, in accordance with its policy of managing statistics in a balanced measures environment (see IRM 105.4, Managing Statistics in a Balanced Measurement System).

  2. No Advocate office may access and view closed statistical data of a peer office (i.e., "horizontal" access restrictions). An Advocate office may access its own closed case data and composite closed case performance results of those offices directly above or below it (i.e., "vertical" access permissions).

  3. The National Office and the ODTA offices have access to all composite and individual closed case performance data.

  4. An area office may access national composite data, its own composite data, and individual data for the local offices within its jurisdiction.

  5. A local office/campus may access national composite data, composite data for the area to which it belongs, and its own individual data.

  6. A group within a local office/campus may view its own group data and the composite data for the local office, area office, and nation.
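The "vertical" and "horizontal" access rules above can be sketched as a simple check over an office hierarchy. The path representation, office names, and function below are illustrative assumptions, not actual TAMIS or Vision Query logic:

```python
# An office is identified by its path from the nation down to itself,
# e.g. ("national",), ("national", "area2"), ("national", "area2", "greensboro").
# All names are hypothetical.

def may_view(viewer, target, viewer_is_national_or_odta=False):
    """Apply the vertical/horizontal access rule described in the text.

    viewer and target are hierarchy paths (tuples). Returns True when the
    viewer may see the target office's closed case data.
    """
    if viewer_is_national_or_odta:
        return True                  # National Office and ODTAs see everything
    if target == viewer:
        return True                  # an office always sees its own data
    if len(target) < len(viewer) and target == viewer[:len(target)]:
        return True                  # composite data of offices directly above
    if len(target) > len(viewer) and target[:len(viewer)] == viewer:
        return True                  # data of offices below (e.g., area -> its locals)
    return False                     # peer ("horizontal") access is blocked

area2 = ("national", "area2")
greensboro = ("national", "area2", "greensboro")
richmond_peer = ("national", "area2", "richmond")
print(may_view(area2, greensboro))          # area sees its own local office
print(may_view(greensboro, richmond_peer))  # peer office: blocked
```

The prefix comparison is what makes the rule "vertical": access flows only along the chain of offices above or below the viewer, never sideways to a peer.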

  7. Plans include reprogramming both TAMIS and Vision Query to ensure systemic adherence to the above-mentioned access permissions and restrictions.

Closed Case Cycle Time  (10-01-2001)

  1. The text below describes TAS closed case cycle time, one of TAS' business results (quality) measures.

Prior and Current Definitions and Purposes  (10-01-2001)

  1. Closed case cycle time is one of the quality components of TAS' business results measures.

  2. TAS uses it to measure timeliness in resolving effectively all taxpayer issues and concerns that meet TAS case processing criteria.

  3. Previously, TAS defined this measure as the total days to resolve/close regular criteria cases divided by the count of regular criteria closures.

  4. The definition in (3), above, did not capture fully the total time expended to resolve taxpayer issues because it was silent as to reopened case cycle time. A reopened case is an extension of a previously closed case that did not address all taxpayer issues or did not respond to the issues in a manner the taxpayer deemed satisfactory.

  5. TAS has re-defined the closed case cycle time measure as the total days expended to close both regular and reopen criteria cases divided by the count of regular criteria closures.
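The re-defined measure in (5) is straightforward arithmetic; the sketch below illustrates it with hypothetical figures (the function name and day counts are invented for illustration):

```python
def closed_case_cycle_time(regular_case_days, reopen_case_days):
    """Re-defined cycle time: total days for regular AND reopened closures,
    divided by the count of regular criteria closures only."""
    regular_closures = len(regular_case_days)
    if regular_closures == 0:
        return 0.0
    total_days = sum(regular_case_days) + sum(reopen_case_days)
    return total_days / regular_closures

# Hypothetical month: four regular closures and one reopened case.
regular = [30, 45, 20, 25]   # days to close each regular criteria case
reopened = [10]              # additional days spent on the reopened case
print(closed_case_cycle_time(regular, reopened))  # (120 + 10) / 4 = 32.5
```

Because the denominator counts only regular closures, any days spent on reopened cases raise the average, which is how the new definition captures time the prior definition missed.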

  6. By adding reopened case cycle time to the original closed case cycle time, this new measure accounts more effectively for, and tracks, TAS timeliness in addressing and resolving taxpayer issues and concerns.

Reporting Procedures and Responsibility  (10-01-2001)

  1. The reporting and recording procedures for closed case cycle time performance results are the same as those for TAS closed cases, indicated above in IRM, Reporting Procedures and Responsibility.

Access Restrictions to Closed Case Cycle Time Data  (10-01-2001)

  1. The restrictions and permissions for viewing and accessing closed case cycle time performance results are the same as those for TAS closed cases, indicated above in IRM, Access Restrictions to Closed Case Statistical Data.

External Customer Satisfaction  (10-01-2001)

  1. TAS conducts an annual sample survey of external customers (taxpayers and their powers of attorney) with recently closed cases.

  2. The survey gauges customers' reactions to key service variables such as:

    1. the quality and timeliness of service;

    2. the responsiveness, courtesy and professionalism of the TAS employees with whom they dealt;

    3. the ease of communications; and

    4. the end results (perception of fairness of resolution).

  1. TAS collects customer satisfaction data as part of the balanced performance system of organizational evaluation.

  2. The collected data are for the sole purpose of enabling TAS to assess its customers' perspectives of TAS' delivery of services.

  3. The customer satisfaction survey is part of an iterative process to drive both local and national planning in each successive fiscal year to better meet customer expectations.

Ownership and Responsibilities  (10-01-2001)

  1. The Director, Program Planning and Quality in the National Office owns the process of managing the customer satisfaction survey.

  2. The area offices and their respective subordinate offices own the data and reports produced by the survey for their offices as well as the nation as a whole.

  3. ATAs and LTAs are collectively responsible for analyzing the data using process management techniques and for engaging employees through their representative organizations in identifying local initiatives to improve customer satisfaction.

  4. Use analysis of customer satisfaction survey data integrated with the other balanced measures to plan and recommend specific actions (i.e., changes to procedures, training, equipment, etc.) that you need to take to improve satisfaction scores in subsequent surveys.

  5. Incorporate improvement objectives flowing from survey analysis into the performance expectations of local and area managers and consider this as an integral part of the strategic planning and budgeting process for each office.
