13.5.1 TAS Balanced Measure System

Introduction

  1. As part of the new IRS organization, TAS has incorporated balanced measures into its procedures.

The Balanced Measurement System

  1. In 1998, IRS developed a plan for Modernization that consisted of five "levers of change" to modernize the IRS and support the new mission. One "lever" involves implementing a system of balanced measures to assist in measuring and improving organizational performance.

  2. The IRS developed its balanced measurement system in an effort to modernize the IRS and to reflect the IRS' priorities as articulated in the IRS mission statement. This system does the following:

    1. supports the organizational mission and goals;

    2. communicates priorities;

    3. meets external reporting requirements;

    4. provides a balanced approach to measuring performance at each level of the organization; and

    5. expands performance measurement to include more than operational financial results.

  3. To ensure balance, this system measures performance in three areas:

    1. customer satisfaction;

    2. employee satisfaction; and

    3. business results.

  4. Each of these measurement areas is carefully considered when setting organizational objectives, establishing goals, assessing progress and results, and evaluating individual performance.

  5. TAS uses balanced measures to assess program effectiveness and service delivery.

Customer Satisfaction Measures

  1. The goal of customer satisfaction, one part of the balanced measurement system, is to continually improve the IRS' ability to provide accurate and professional service to internal and external customers in a courteous and timely manner.

  2. IRS determines customer satisfaction goals and accomplishments on the basis of customer feedback collected via questionnaires, surveys, and other information-gathering mechanisms.

  3. The balanced measures system will help answer key questions regarding customer satisfaction including:

    1. the general level of customer satisfaction in specific IRS transactions;

    2. the frequency and reason for customer dissatisfaction; and

    3. taxpayers' perceptions and expectations of the IRS.

Employee Satisfaction Measures

  1. The goal of employee satisfaction, the second part of the balanced measurement system, is to provide a work environment that enables employees to achieve organizational goals.

  2. Measures of employee satisfaction assess progress toward the "quality work environment" goal.

  3. The balanced measurement system will help answer key questions regarding IRS employee satisfaction including:

    1. the general level of employee satisfaction;

    2. employees' perceptions of the effectiveness of various levels of management; and

    3. employees' perceptions of whether the IRS provides an appropriate work environment.

Business Results Measures — Quantity and Quality

  1. The goal of business results is to generate a productive quantity of work in a quality manner and to provide meaningful outreach to customers. This includes delivering a volume of work within mixed categories and dedicating appropriate resources to provide taxpayer education, outreach, and advocacy.

  2. The business results measures consist of numerical results determined under the "quantity" and "quality" elements.

  3. The goal involves more than achieving a target or predetermined quality and quantity results. It addresses key questions regarding business results including:

    1. Did we assess/collect the proper tax?

    2. Did we provide quality customer service?

    3. Did we use our resources in accordance with the plan?

    4. Did we provide education and outreach/advocacy to our customers?

  4. Each balanced measure element represents an important aspect of the organization's goals and is of equal importance in carrying out the IRS' programs and functions.

  5. The frequency of measures data availability may vary across the three elements because the data from each element changes at different intervals and/or requires more time for data collection. However, differences in the frequency of data availability do not reflect differences in priority of the balanced measurement elements.

Using Balanced Measures

  1. The balanced measurement system changed the way the IRS uses measures. In the past IRS used measures in ways that produced unintended consequences, such as ranking business units or functions on an index of measures.

  2. The emphasis of the new system is on understanding why measured data have changed and what actions IRS took or should have taken to influence results.

  3. We are not managing programs to achieve numbers. We are managing processes and people to achieve the IRS mission.

  4. The use of quality tools such as flow charts, Pareto charts, and root cause analysis proves valuable in the fact-finding process.

  5. You can find information about the appropriate use of measures, including the definition of records of tax enforcement results (ROTERs), goal setting, use of measures in evaluations, sharing data, etc., in IRM 105.4, Managing Statistics in a Balanced Measurement System.

  6. All managers must ensure strict adherence to this guidance.

Data Analysis

  1. Businesses often refer to the process of analyzing business results or understanding the changes in measures as "getting behind the numbers." There is no prescribed method for this although the following guides may be of assistance:

    1. The President's Quality Award (PQA) Criteria for Excellence is an excellent way of evaluating organizational success and developing opportunities for improvement based on comparison with "best in class" activity.

    2. The Management Analysis Resource Guide, based on the Management Model, used in conjunction with diagnostic tools, helps managers understand activities they should be doing on a regular basis in order to "get behind the numbers."

  2. Typically, we initiate the analytical process by looking at:

    1. the balanced measures data and the diagnostic tools data;

    2. those most closely involved in the process; and

    3. the actions that impacted the results.

    Example:

    A change in the TAS case cycle time measure may be due to different methods of case progression or to equipment problems.

  3. It may not be easy to discern the root cause of the change by looking at data alone. An analysis of the facts and a brief conversation with individuals close to the process may quickly bring this to light.

  4. You may find detailed information on the balanced measures system in IRM 105.4, Managing Statistics in a Balanced Measurement System, Chapter 2.

Diagnostic Tools

  1. The IRS collects a great deal of additional information about programs and services, some of which served as performance measures in the past.

  2. Under the balanced measurement system, a business unit or organization may only use the approved set of balanced measures — both strategic and operational — to measure its performance.

  3. IRS refers to indicators that are not designated as balanced measures as "diagnostic tools." Use diagnostic tools to analyze the factors that affect changes in the balanced measures' performance and to "get behind the numbers."

  4. Using diagnostic tools provides you with a mechanism to analyze factors that influence performance and encourages dialogue about specific actions that managers may take to improve customer satisfaction, employee satisfaction, and business results.

  5. Do not set goals or targets for diagnostic tools and do not use them in individual performance evaluations.

  6. Diagnostic tools include any type of data that is helpful in understanding what influences and impacts balanced measures. It is permissible to use ROTERs as diagnostic tools.

Reference Guide for Diagnostic Tools
  1. The Reference Guide for Diagnostic Tools serves as a managerial resource to help you change the way you review statistics and manage data. Although originally written for previous IRS functions, the overall framework for diagnostic tools translates to the new IRS organization.

  2. The guide will:

    1. Encourage dialogue by providing a mechanism to help managers analyze the data elements/diagnostic tools that are "behind the numbers" and to focus on management actions based on that context;

    2. Aid management by helping diagnose outcomes to get "behind the numbers" and to analyze indicators/data elements and actions taken to address the problems;

    3. Assist management of organizations or business units to focus on and understand the underlying factors that affect the key set of balanced measures (e.g., to help separate controllable and external factors);

    4. Provide information to allow revision of future plans; and

    5. Define topic parameters for operations/business review by setting expectations, eliminating surprises, and ensuring the ability to prepare.

  3. Some TAS examples of diagnostic tools are listed in Exhibit 13.5.1-1, TAS Diagnostic Tools.

Proper Use of Diagnostic Tools
  1. Use diagnostic tools to understand underlying factors that cause changes in balanced measures.

  2. Do not use diagnostic tools as organization performance measures.

  3. Do not set goals or targets for diagnostic tools.

  4. Do not use these tools as an individual performance measure (i.e., in evaluating an individual employee's performance).

  5. Do not use diagnostic tools as a "gotcha" in performance reviews.

  6. Do not compare the results of diagnostic tools when evaluating different offices or units.

Business Performance Review System

  1. The Business Performance Review System (BPRS) represents a fundamental change in the way IRS reviews and manages its operations.

  2. The system enables a periodic review of strategic and operational issues and business unit performance and facilitates the assessment of the IRS' progress in achieving its mission and strategic goals.

  3. The BPRS establishes a framework for measuring, reporting, and reviewing a business unit's performance against plans established within the Strategic Planning and Budget process.

  4. During this process each business unit identifies, defines, and tracks the essential elements of its performance.

  5. IRS uses BPRS meetings to assess all business units' progress toward organizational goals and to identify crosscutting issues in a timely manner.

  6. The BPRS provides the opportunity to identify areas of the plan where revisions are necessary, which then can be incorporated into the next strategic planning and budget cycle.

Taxpayer Advocate Service Balanced Measures

  1. The National Taxpayer Advocate (NTA) approved ten measures. The text below describes each measure in detail.

  2. The measures are broken down into the categories mentioned above — customer satisfaction, employee satisfaction, and business results (quantity and quality).

  3. The measures within each category relate to either casework or advocacy. See Exhibit 13.5.1-2, TAS Balanced Measures, for a chart of TAS' ten measures and this relationship.

Casework Quality Index

  1. The casework quality index is one of TAS' business results (quality) measures.

  2. The index is a numerical indicator of the extent to which TAS casework meets the prescribed standards.

  3. These results are indicators of quality and are used to identify and correct problems that require changes in areas such as procedures and training as well as to identify best practices or systemic problems.

  4. The results of these reviews are for use by the National Office as well as Area Taxpayer Advocates (ATAs), Local Taxpayer Advocates (LTAs), and quality analysts only. Do not share specific case review results (i.e., quality review checksheets and comments sheets) with TAS group managers or Associate Advocates (AAs)/Senior Associate Advocates (SAAs). See Exhibit 13.5.1-3, Sharing Quality Review Results, for more information about sharing quality review statistics and case results.

  5. The information in the following text describes TAS' responsibilities for this measure, the procedures for calculating the index, and other features of this balanced measure.

Responsibilities for TAS' Quality Review Program
  1. The Director, Program Planning and Review, in the NTA's office, the ATAs, and the LTAs are responsible for the quality review program in their respective offices.

  2. The information in the sections below describes the responsibilities in detail.

National Responsibilities
  1. The Director, Program Planning and Review at the National Office is responsible for the centralized quality review program for TAS.

  2. That person is responsible for:

    1. ensuring that the review sites review and document the results of the monthly quality samples sent from the field TAS offices and transmit the quality review database records to the appropriate field sites as well as National Office;

    2. maintaining and revising, as appropriate, the TAS quality review database (QRDB);

    3. generating the national monthly and cumulative quality reports and the data sent to the Executive Management Support System (EMSS);

    4. responding to DIALOGUE situations (see IRM 13.5.1.5.1.5, DIALOGUE Process) which the field or review sites elevate to the National Office and disseminating the results to the area TAS offices and review sites;

    5. publishing a quarterly quality newsletter through the appropriate TAS communication channels; and

    6. providing an annual report on quality consisting of information from area offices.

Area Responsibilities
  1. Each ATA is responsible for the centralized quality review program for TAS in his/her area.

  2. The ATA is responsible for:

    1. ensuring that the monthly quality samples are sent from the area's field TAS offices to the appropriate review site;

    2. maintaining the area's TAS quality review database (QRDB);

    3. generating the area's monthly and cumulative quality reports;

    4. being the first level of review for DIALOGUE situations from the field offices in that area (see IRM 13.5.1.5.1.5, DIALOGUE Process), elevating to the National Office any DIALOGUE situations for which there is a disagreement;

    5. disseminating the results of nationally elevated DIALOGUE results to the field TAS offices in that area;

    6. analyzing quality review data for the area and offices within that area to identify trends, procedures needing improvement, training needs, systemic problems, and best practices;

    7. using the analytical results to improve quality in the area's local offices/campuses (e.g., share best practices, set up area training classes, work with offices on specific problem areas, etc.);

    8. sharing systemic analysis results with the Operating Division Taxpayer Advocate (ODTA) for the appropriate business unit;

    9. providing input to the quarterly quality newsletter published by the National Office; and

    10. providing input (i.e., quality analysis) to the National Office for an annual report on quality.

Local Office/Campus Responsibilities
  1. The LTA at each local office/campus is responsible for the centralized quality review program for TAS in his/her office.

  2. The LTA is responsible for:

    1. ensuring that the monthly quality samples are sent from the TAS offices to the appropriate review site;

    2. maintaining the office's TAS quality review database (QRDB);

    3. generating the office's monthly and cumulative quality reports;

    4. submitting DIALOGUE situations for questionable situations to the appropriate review site, keeping the area quality analyst informed of the situations, and elevating any discrepancies to the National Office (see IRM 13.5.1.5.1.5, DIALOGUE Process);

    5. disseminating clarifications in TAS procedures to the field TAS managers and caseworkers in the office;

    6. analyzing quality review data for the office to identify trends, procedures needing improvement, training needs, systemic problems, and best practices;

    7. using the analytical results to improve quality in the local office/campus (e.g., share best practices, set up training classes, work with managers and AAs/SAAs on specific problem areas, etc.);

    8. ensuring cases that should be reopened are correctly resolved;

    9. sharing systemic analysis results with the area and ODTA for the appropriate business unit;

    10. providing input to the area for the quarterly quality newsletter published by the National Office; and

    11. providing input (i.e., quality analysis) to the area for the national annual report on quality.

Monthly Quality Random Samples
  1. At the end of each month the local offices and campuses must select a random sample of their regular and reopen criteria cases. See Exhibit 13.5.1-4, Quality Sample Sizes for Centralized Quality Review, for sample sizes by office.

  2. Until TAMIS has a new quality sampling report, use a Vision Query (VQ) (a/k/a Intelligent Query) procedure to list the cases eligible for sampling.

  3. See Exhibit 13.5.1-5, Selecting Cases for the Centralized Quality Review Sample, for procedures on selecting the sample.
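
  The binding selection procedures are in Exhibit 13.5.1-5 and the required sample sizes are in Exhibit 13.5.1-4; purely as an illustration of the underlying idea, a simple random draw might look like the following sketch. The function name and record format are assumptions, not TAS procedure.

```python
import random

# Illustrative sketch only; the binding procedures are in Exhibit 13.5.1-5
# and the required sample sizes per office are in Exhibit 13.5.1-4.

def draw_quality_sample(eligible_cases, sample_size, seed=None):
    """Draw a simple random sample from the month's eligible closed cases.

    eligible_cases: list of case identifiers (e.g., TAMIS case numbers)
    sample_size: the office's sample size from Exhibit 13.5.1-4
    """
    rng = random.Random(seed)  # a fixed seed makes the draw reproducible
    return rng.sample(eligible_cases, min(sample_size, len(eligible_cases)))
```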

  4. Send the samples by the scheduled due date to the Oakland review site for local offices or to the Brookhaven review site for campuses. The National Office publishes a quality review schedule one month prior to the beginning of each fiscal year.

  5. See Exhibit 13.5.1-6, Documentation Required for TAS Cases, for documentation that must be in each case file.

  6. The addresses for the review sites are:

    1. Oakland — Internal Revenue Service, Quality Assurance Division/TAS Review, Attn: Anita Kitson, 1301 Clay Street, Suite 800S, Oakland, CA 94612.

    2. Brookhaven — Internal Revenue Service, PAS — Stop 110, 1040 Waverly Avenue, Holtsville, NY 11742.

TAS Quality Standards
  1. TAS derives its quality index from the monthly quality reviews of local offices' and campuses' monthly randomly sampled cases.

  2. The reviewers at the two sites review the TAS cases following the guidelines in the TAS Quality Standards and Review Guide. Basically, the quality standards for the quality index reflect the TAS casework procedures required in this IRM.

  3. There are eight quality standards. The casework quality index is the result of the review of these standards. Since this business measure is for overall TAS casework, reviewers look at the case in its entirety and do not differentiate between offices when the case involves transfers from another office.

  4. Each standard carries a specific number of points (shown in parentheses below). The maximum total of applicable points for a case is 100. Not all standards are applicable in every case. You may find a more in-depth description of the standards and their applicability in Exhibit 13.5.1-7, TAS Casework Quality Index (CQI) Standards. An illustrative scoring sketch follows the list of standards below.

  5. The standards are:

    1. Did TAS make timely initial contact with the taxpayer? (5 points)

    2. Did TAS take initial action/request information within the specified time frames? (10 points)

    3. Did TAS take all subsequent actions timely from the time action could have been taken? (10 points)

    4. Did TAS resolve all taxpayer issues? (25 points)

    5. Did TAS address all related issues? (10 points)

    6. Were all adjustments that impact the taxpayer technically/procedurally correct? (15 points)

    7. Did TAS give the taxpayer a clear, complete, correct explanation at closing? (20 points)

    8. Did TAS educate the taxpayer regarding any of his/her actions that contributed to the problem? (5 points)
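
  This IRM does not spell out the index arithmetic; Exhibit 13.5.1-7 governs. As a hedged illustration only, assuming a case's score is its earned points divided by its applicable points and that the index averages those scores across the reviewed sample, a sketch might look like this:

```python
# Hedged sketch: the authoritative standards are in Exhibit 13.5.1-7.
# Assumes score = earned points / applicable points per case, averaged
# across the month's sample. Standard names abbreviate the questions above.

STANDARD_POINTS = {
    "timely_initial_contact": 5,
    "timely_initial_action": 10,
    "timely_subsequent_actions": 10,
    "resolved_all_issues": 25,
    "addressed_related_issues": 10,
    "correct_adjustments": 15,
    "clear_closing_explanation": 20,
    "educated_taxpayer": 5,
}

def case_score(results):
    """results maps a standard name to True (met), False (not met),
    or None (standard not applicable to this case)."""
    applicable = {s: met for s, met in results.items() if met is not None}
    possible = sum(STANDARD_POINTS[s] for s in applicable)
    earned = sum(STANDARD_POINTS[s] for s, met in applicable.items() if met)
    return earned / possible if possible else 0.0

def casework_quality_index(reviewed_cases):
    """Average case score across the reviewed sample, as a percentage."""
    if not reviewed_cases:
        return 0.0
    return 100.0 * sum(case_score(c) for c in reviewed_cases) / len(reviewed_cases)
```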

TAS Quality Review Database
  1. The TAS Quality Review Database (QRDB) is a personal computer-based database created using Microsoft Access software.

  2. The Taxpayer Advocate office in each local office/campus, area office, operating division, and National Office has a copy of the QRDB that contains the review results (e.g., checksheet, comments, and closure/sample size information) for the particular office.

    Example:

    A local office's database such as Greensboro's has only the results of a review of Greensboro's monthly random samples. The QRDB for Area 2 — Richmond, of which Greensboro office is a part, contains the review results for all the local offices within Area 2. The National Office's and the ODTA office's QRDBs contain results from all the offices within the nation.

  3. The National Office transmits an empty QRDB to the area offices and the ODTA office, which in turn send it to the local offices/campuses. A designated administrator in each office (usually the quality analyst) is responsible for loading the monthly results received from its review site.

  4. You, as an analyst, ATA, LTA, or ODTA may use the QRDB to generate monthly and fiscal year cumulative reports as well as specific queries for data analysis.

DIALOGUE Process
  1. The TAS DIALOGUE is a period of time following a monthly review when local offices and campuses have an opportunity to discuss errors charged on the previous month's reviews.

  2. This time allows an exchange of information to clarify errors, identify best practices, identify training needs for reviewers or TAS employees, identify systemic problems, and ensure procedures are being followed correctly.

  3. Results of the discussion have no impact on the local office's or campus's score.

  4. You may elevate any situations that cannot be solved at the review site or area level to National Office. Once decided, National Office disseminates the situation, the TAS office's view, the review site's view, and the national decision to the area and local offices to ensure all TAS employees are following the same case processing guidelines.

  5. See Exhibit 13.5.1-8, DIALOGUE Process, for the steps of the process.

Closed Cases

  1. The text below describes TAS closed cases, one of TAS' business results (quantity) measures.

Definition and Purpose
  1. The volume of regular criteria closures (i.e., first-time TAS contacts meeting one of TAS criteria 1-7) represents one of TAS' business results (quantity) measures. Its purpose is to measure TAS productivity and effectiveness in identifying, working, and resolving all taxpayer issues and hardships that meet TAS case processing criteria, which were expanded by the enactment of the Internal Revenue Service Restructuring and Reform Act of 1998.

  2. In conjunction with the other measures (e.g., customer satisfaction, CQI, and outreach), TAS closed cases will afford TAS the opportunity to effectively gauge its performance in ensuring that all taxpayers whose issues meet TAS criteria receive the appropriate TAS assistance, intervention, and case resolution.

Reporting Procedures and Responsibility
  1. Until TAS redesigns its TAMIS reports to capture and track TAS' new performance measures, the Director, Program Planning and Quality, has the responsibility to record TAS closed case performance for all offices and to convey performance results for the Commissioner's monthly report and to EMSS.

  2. Vision Query (a/k/a Intelligent Query) is the vehicle to capture closed case performance results.

  3. TAS will share with each area office the national composite results, specific area composite performance, and performance results for those offices within the specific area.

  4. Each area office will convey to its local offices the national composite performance, the area's composite performance, and the individual performance specific to the local offices/campuses under the area's jurisdiction.

  5. The implementation of the TAMIS reports redesign will provide for the systemic capture and recording of closed case performance results and their electronic transmission to EMSS.

  6. Each Advocate office will also have TAMIS access to the closed case performance results that it is entitled to view within the parameters and guidelines of the IRS' policy of managing statistics in a balanced measures environment (see IRM 105.4, Managing Statistics in a Balanced Measurement System).

Access Restrictions to Closed Case Statistical Data
  1. IRS restricts access to closed case statistical data, as well as access to other statistical performance results, in accordance with its policy of managing statistics in a balanced measures environment (see IRM 105.4, Managing Statistics in a Balanced Measurement System).

  2. No Advocate office may access and view the closed case statistical data of a peer office (i.e., "horizontal" access restrictions). An Advocate office may access its own closed case data and composite closed case performance results of those offices directly above or below it (i.e., "vertical" access permissions).

  3. The National Office and the ODTA offices have access to all composite and individual closed case performance data.

  4. An area office may access national composite data, its own composite data, and individual data for the local offices within its jurisdiction.

  5. A local office/campus may access national composite data, composite data for the area to which it belongs, and its own individual data.

  6. A group within a local office/campus may view its own group data and the composite data for the local office, area office, and nation.

  7. Plans include reprogramming both TAMIS and Vision Query to ensure systemic adherence to the above-mentioned access permissions and restrictions.
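
  As a purely hypothetical illustration of the vertical rule described above (not the actual TAMIS or Vision Query logic), offices can be modeled as a tree with access allowed only along a direct line up or down that tree:

```python
# Hypothetical sketch of the "vertical" access rule above, modeling offices
# as a tree: nation -> areas -> local offices/campuses -> groups. The actual
# enforcement will live in TAMIS and Vision Query once reprogrammed; the
# blanket access of the National Office and ODTA offices is not modeled.

def may_view(viewer_path, data_path):
    """Allow access along a direct vertical line of the office tree:
    an office sees itself, composites above it, and offices below it,
    but never a peer branch."""
    depth = min(len(viewer_path), len(data_path))
    return viewer_path[:depth] == data_path[:depth]

# An area office views one of its own local offices -> True
assert may_view(("nation", "area2"), ("nation", "area2", "greensboro"))
# A local office views the national composite -> True
assert may_view(("nation", "area2", "greensboro"), ("nation",))
# A local office views a peer local office -> False
assert not may_view(("nation", "area2", "greensboro"),
                    ("nation", "area2", "richmond"))
```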

Closed Case Cycle Time

  1. The text below describes TAS closed case cycle time, one of TAS' business results (quality) measures.

Prior and Current Definitions and Purposes
  1. Closed case cycle time is one of the quality components of TAS' business results measures.

  2. TAS uses it to measure timeliness in resolving effectively all taxpayer issues and concerns that meet TAS case processing criteria.

  3. Previously, TAS defined this measure as the total days to resolve/close regular criteria cases divided by the count of regular criteria closures.

  4. The definition in (3), above, did not fully capture the total time expended to resolve taxpayer issues because it was silent as to reopened case cycle time. A reopened case is an extension of a previously closed case that did not address all taxpayer issues or did not respond to the issues in a manner the taxpayer deemed satisfactory.

  5. TAS has re-defined the closed case cycle time measure as the total days expended to close both regular and reopen criteria cases divided by the count of regular criteria closures.

  6. By adding reopened case cycle time to the original closed case cycle time, the new measure more effectively accounts for and tracks TAS' timeliness in addressing and resolving taxpayer issues and concerns.
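
  A minimal sketch of the redefined computation follows, using illustrative field names rather than actual TAMIS fields:

```python
# Hedged sketch of the redefined measure; "days_open" and "criteria" are
# illustrative field names, not actual TAMIS fields.

def closed_case_cycle_time(closed_cases):
    """closed_cases: records with a "days_open" count and a "criteria"
    value of "regular" or "reopen". Reopened-case days enter the
    numerator, but only regular closures count in the denominator."""
    total_days = sum(c["days_open"] for c in closed_cases)
    regular_closures = sum(1 for c in closed_cases if c["criteria"] == "regular")
    return total_days / regular_closures if regular_closures else 0.0

# Example: two regular closures (30 and 50 days) plus one reopened case
# (20 days) yield (30 + 50 + 20) / 2 = 50 days of cycle time.
cases = [
    {"days_open": 30, "criteria": "regular"},
    {"days_open": 50, "criteria": "regular"},
    {"days_open": 20, "criteria": "reopen"},
]
assert closed_case_cycle_time(cases) == 50.0
```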

Reporting Procedures and Responsibility
  1. The reporting and recording procedures for closed case cycle time performance results are the same as those for TAS closed cases (see Reporting Procedures and Responsibility under Closed Cases, above).

Access Restrictions to Closed Case Cycle Time Data
  1. The restrictions and permissions for viewing and accessing closed case cycle time performance results are the same as those for TAS closed cases (see Access Restrictions to Closed Case Statistical Data, above).

External Customer Satisfaction

  1. TAS conducts an annual sample survey of external customers (taxpayers and their powers of attorney) whose cases were recently closed.

  2. The survey gauges customers' reactions to key service variables such as:

    1. the quality and timeliness of service;

    2. the responsiveness, courtesy and professionalism of the TAS employees with whom they dealt;

    3. the ease of communications; and

    4. the end results (perception of fairness of resolution).

Purpose
  1. TAS collects customer satisfaction data as part of the balanced measurement system of organizational evaluation.

  2. The collected data are for the sole purpose of enabling TAS to assess its customers' perspectives of TAS' delivery of services.

  3. The customer satisfaction survey is part of an iterative process to drive both local and national planning in each successive fiscal year to better meet customer expectations.

Ownership and Responsibilities
  1. The Director, Program Planning and Quality in the National Office owns the process of managing the customer satisfaction survey.

  2. The area offices and their respective subordinate offices own the data and reports produced by the survey for their offices as well as the nation as a whole.

  3. ATAs and LTAs are collectively responsible for analyzing the data using process management techniques and for engaging employees through their representative organizations in identifying local initiatives to improve customer satisfaction.

  4. Use analysis of customer satisfaction survey data integrated with the other balanced measures to plan and recommend specific actions (e.g., changes to procedures, training, equipment, etc.) that you need to take to improve satisfaction scores in subsequent surveys.

  5. Incorporate improvement objectives flowing from survey analysis into the performance expectations of local and area managers and consider this as an integral part of the strategic planning and budgeting process for each office.

Use and Limitations of Customer Satisfaction Survey Data
  1. Customer satisfaction survey data are used only to improve organizational operations.

  2. Taxpayer privacy — Survey reports will not identify individual IRS employees or the taxpayers or practitioners with whom TAS interacted. For this survey, those conducting the survey assure external customers that their responses are confidential and guarantee their privacy by combining their responses with those of other taxpayers and by reporting only statistical totals.

  3. Employee privacy — TAS will never use survey information to identify an individual employee or to evaluate the performance of an individual employee.

  4. Actions necessary to assure privacy — Area offices will scrutinize and sanitize data before sharing them with field offices to ensure that no one can identify taxpayers, powers of attorney, or employees. Area offices will delete from their databases records that could result in employee or taxpayer identification. Where the amount of data produced for an individual office is so small as to jeopardize the privacy of either the taxpayer or employee, we will not report the data for that organization but will combine the data with all the offices in the area.

  5. Access to customer satisfaction survey data — Individual offices have permission to access data and reports only for their own offices, for their areas, and for TAS as a whole. IRS prohibits cross-comparisons of specific individual offices.

  6. Outputs — There are two different outputs from the customer satisfaction surveys.

    1. Quarterly reports — Summaries of survey results are in quarterly reports that provide a synopsis of customer satisfaction responses for each area and for all of TAS. Reports contain an overall customer satisfaction score and cross-tabulated tables that portray customer characteristics (e.g., major issue code and complexity code) against overall satisfaction scores.

    2. Annual reports — Reports for individual offices will be produced at such time as the data are deemed statistically valid. Area offices will receive a database containing the customer satisfaction data pertinent to their local offices, which they may use in developing analytical approaches. In the event that there is an insufficient amount of data to assure statistical validity for an individual office, the data of that office will be combined with that of the overall area. Local offices should use that data in developing their improvement plans.

Employee Satisfaction

  1. The employee satisfaction measure assesses how well management provides employees with the necessary support, resources, and tools needed to accomplish their jobs.

  2. The measure also focuses on the work environment, manager/employee relations, and other factors that affect an employee's ability to do a good job.

  3. Employees' level of satisfaction affects their productivity and the quality of customer service they provide.

Definition of the Employee Satisfaction Measure
  1. IRS and TAS define employee satisfaction as a measure of employee perception of management practices, organizational barriers, and overall work environment that affect employees' efforts to do a good job.

Measuring Employee Satisfaction
  1. IRS designed its employee satisfaction survey to evaluate management's effectiveness and the quality of the work environment, and to identify specific issues that affect work group dynamics and productivity.

  2. Managers and their employees use the results of this survey to identify key areas for improvement.

Accountability
  1. Each manager's performance plan includes employee satisfaction.

  2. Managers use survey results to develop personal commitments.

Roles and Responsibilities — Managers
  1. Managers must promote employee satisfaction as part of their daily operations.

  2. All managers will share the results of the survey with their employees.

  3. Survey results meetings are mandatory for all first-line managers. Managers should use the workgroup report as a starting point for their discussions.

  4. When a manager schedules a meeting that includes bargaining unit employees, he/she must invite a union representative to attend the meeting.

  5. Managers should elevate through the normal chain of command any issues that remain unresolved or that they cannot resolve locally.

  6. Managers should provide sufficient time and resources for all personnel who are involved in planning for and administering the employee satisfaction survey.

Roles and Responsibilities — Employees
  1. Employees should support the employee satisfaction survey process by taking the survey and answering all survey items as candidly and honestly as possible.

  2. They should also participate fully in the survey results meetings to discuss employee satisfaction issues, using the data to plan actions and to follow up on commitments made within the work group.

Partnership with NTEU
  1. NTEU should be a partner in all activities related to the employee satisfaction survey.

  2. These activities include, but are not limited to:

    1. marketing;

    2. developing local survey items;

    3. administering the survey;

    4. communicating survey results;

    5. ensuring steward coverage for workgroup meetings;

    6. selecting and using facilitators;

    7. developing elevated issues; and

    8. developing action plans to improve employee satisfaction.

Outreach Resources Spent versus Plan

  1. Hours and dollars spent on outreach efforts will provide an indication of the amount of effort put into promoting TAS to our internal and external customers. TAS will compare actual hours and dollars spent to planned hours and dollars.

  2. The measure will answer the question — How much energy, time, money, etc., did TAS plan to put, and actually put, into educating the public through outreach programs about the role of the Taxpayer Advocate?

Outreach Plans
  1. TAS expects each local office to develop an outreach plan that meets the needs of taxpayers.

  2. Each plan will be unique in its targeted audiences as well as methods of delivery based on the needs and demographics of the local offices.

  3. The local office will use the outreach plan to identify and complete the activities planned to improve awareness of the TAS program, solicit feedback on IRS problems, and improve customer service.

  4. Outreach plans must include the following information/components:

    1. Office — the local TA office submitting the plan.

    2. Fiscal year — self-explanatory.

    3. Audience — the group or targeted audience you want to reach with your message.

    4. Basis — the reason this particular audience was selected.

    5. Method — the plan to deliver your message to your audience (e.g., speech, targeted mail out, booth at seminar/fair, etc.).

    6. Projected cost — the cost for your office to conduct the identified activity.

    7. Responsible party — the staff member responsible for this activity.

    8. Target date — the date you want this activity to be completed.

    9. Actual date — the date you actually completed the activity.

    10. Comments — Annotated information from the approval process (area use) or local data (local office).

    11. Submitted by — name of person submitting the outreach plan.

    12. Approved by — signature of person (usually ATA) approving the plan. If portions of the plan or costs are not approved, the approving office will annotate this.

      Note:

      Exhibit 13.5.1-9, Outreach Plan Template, provides an example of an outreach plan you may use.

  5. When preparing a local outreach plan, give consideration to other plans, including the national TAS strategic plan, TAS communication goals, etc. Also consider partnerships with other operating/functional divisions within IRS such as TEC, SPEC, etc. When possible, base the plan on some type of research data such as demographic information. See Exhibit 13.5.1-10, Local Outreach Plan Development.

  6. Each area office is responsible for the review and approval of the specific events and associated costs of local outreach plans. Focus each plan on activities that will result in the greatest return based on the needs of the local office.

  7. LTAs will submit an outreach activity plan to their ATA by a designated date. The ATA will review each plan, provide feedback or recommendations for change, and approve the plan.

Reporting Outreach Costs
  1. Report outreach costs by the following methods:

    1. quarterly reporting of the status of the outreach plans;

    2. use of the Project Cost Accounting System (PCAS) code for the Automated Financial System (AFS) and Travel Reimbursement and Accounting System (TRAS); and

    3. time reporting of outreach activities on the Single Entry Time Reporting (SETR) system.

Quarterly Reporting Outreach Plans
  1. Local TAS offices will review and report progress, changes, and results to their ATAs by the 15th day after the end of each quarter.

  2. Include in reports updates to the templates, raw data, and anecdotal data.

  3. Include as raw data the number of outreach activities conducted for the reporting period and the number of people who either heard the message or had the potential to hear the message.

  4. Exhibit 13.5.1-11, Outreach Event Report, is an example of an instrument that may be used to gather data on each event.

  5. Include as anecdotal data any information received regarding the receptiveness of the outreach by the public or internal stakeholders. This may include comments, opinions, suggestions, etc.

  6. The ATA will forward reports to the Director, National TAS Communications and Liaison, by the 25th day after the end of each quarter.

  7. National TAS Communications and Liaison will consolidate data for use in analysis, budget requests, development of communications and marketing plans, etc.

Project Cost Accounting System (PCAS) Code
  1. TAS has established a special code to track travel and other costs related to TAS outreach activities.

  2. The new PCAS code for AFS and TRAS is TASM1.

  3. LTAs must ensure the use of this code when preparing all travel vouchers and any procurement requests relating to outreach.

Time Reporting of Outreach Activities on the Single Entry Time Reporting (SETR) System
  1. SETR has seven Organization Function Program (OFP) codes for use when reporting time spent on outreach activities. These codes are for use by TAS employees:

    1. 36750 — Internal — used to record preparatory time, travel time, and presentation time for outreach activities involving other IRS employees.

    2. 36751 — Congressional Office — used to record preparatory time, travel time, and presentation time for outreach activities involving Senators, Congresspersons, and/or their staff members. This category also includes Congressional liaison meetings and Congressional Affairs Program (CAP) conferences.

    3. 36752 — Tax Practitioner — used to record preparatory time, travel time, and presentation time for outreach activities involving tax practitioners/practitioner groups (e.g., attorneys, certified public accountants, public accountants, enrolled agents, electronic return originators, Tax Executive Institute, etc.).

    4. 36753 — External Meetings/Speeches/Events — used to record preparatory time, travel time, and presentation time for outreach activities involving external groups when the audience cannot be better defined by a more specific outreach category. This category encompasses efforts related to conferences, fairs, education groups/institutions, etc.

    5. 36754 — Media — used to record preparatory time, travel time, and presentation time involving outreach activities and interviews with media, including print, radio, and television.

    6. 36770 — EITC — used to record any time expended on working EITC cases or for preparatory time, travel time, and presentation time for any outreach activity primarily related to EITC.

    7. 36771 — EITC Overtime — used to record any overtime expended on working EITC cases or any overtime granted for preparatory time, travel time, and presentation time for any outreach activity primarily related to EITC.

  2. There are other outreach activities such as the development or distribution of IRS publications, forms, notices, and web site work which do not fit cleanly into one of the SETR codes listed above. Capture the time devoted to these outreach activities using the SETR code which most closely fits the work performed.

  3. Report all time spent preparing for, traveling to, conducting, and performing follow-up activities related to outreach under one of these codes.

  4. Do not report time spent on preparing an outreach plan, reviewing its effectiveness, revising the plan or reporting on progress under these outreach codes. Report these under the "Management and Support" OFP code.

Outreach Effectiveness

  1. Outreach effectiveness/results will be a high-level measure of direct TAS receipts versus total TAS cases received for a particular period of time and will provide data on how first-time TAS users became aware of the TAS program.

  2. This measure ties directly to the input measure — outreach resources spent versus plan (see IRM 13.5.1.5.6.5, Time Reporting of Outreach Activities on the Single Entry Time Reporting (SETR) System). As an output measure, it answers the question — What kind of results did you achieve based on what you did to educate people about TAS?

  3. TAS uses the "how received" indicator on TAMIS to determine whether the case was received directly from a taxpayer or practitioner or indirectly (i.e., as a referral) from an operating or functional division. A sketch of the resulting ratio follows the list below.

  4. TAS also uses a specific field on TAMIS to record how the taxpayer learned about the program. Form 911, Section IV, contains a block entitled "Outreach" for this purpose; a corresponding field on TAMIS captures this data. Possible entries for the Form 911 block and TAMIS field are:

    1. 00 — Default (used for indirect TAS receipts) — this code identifies a case that did not come as a direct contact from a taxpayer. In other words, use this code for cases identified by an operational or functional division employee.

    2. 10 — Repeat customer — the taxpayer has used PRP and/or TAS services before and is already aware of the service provided.

    3. 20 — IRS publications/forms/notices — the taxpayer learned of TAS through information contained in IRS publications, forms, and/or notices and not through a personal presentation or TAS outreach activities.

    4. 30 — Web sites — the taxpayer learned of TAS from a web site.

    5. 40 — Congressional office — the taxpayer learned of TAS through contact with a Senator, Congressperson, and/or a Congressional staff member.

    6. 50 — Tax practitioner — the taxpayer learned of TAS from a tax practitioner/practitioner group such as an attorney, accountant, enrolled agent, electronic return originator (ERO), Tax Executive Institute, etc.

    7. 60 — Media — the taxpayer learned of TAS through the media (e.g., television, radio, newspaper, magazine, radio spot, etc.).

    8. 70 — External meetings/speeches/events — the taxpayer learned of TAS through an outreach event conducted by the IRS such as a conference, fair, education group/institution, etc.

    9. 80 — Other — the taxpayer learned of TAS through another means not defined above.

    10. 90 — Reserved — used by TAS employees when the taxpayer became aware of TAS through a unique outreach effort that the local office wanted to track separately.
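
  The sketch below illustrates the direct-receipts ratio described in (1) through (3) above; the "how_received" values are illustrative stand-ins, not actual TAMIS codes.

```python
# Illustrative only: "how_received" stands in for the TAMIS indicator that
# distinguishes direct receipts from referrals; it is not an actual TAMIS field.

def outreach_effectiveness(cases):
    """Share of total receipts that came directly from taxpayers or
    practitioners rather than as referrals from an operating or
    functional division."""
    if not cases:
        return 0.0
    direct = sum(1 for c in cases if c["how_received"] == "direct")
    return direct / len(cases)

# Example: 3 direct receipts out of 4 total receipts -> 0.75
receipts = [{"how_received": "direct"}] * 3 + [{"how_received": "referral"}]
assert outreach_effectiveness(receipts) == 0.75
```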

Immediate Interventions

  1. This section describes the ODTA balanced measure for immediate interventions.

Definitions
  1. Immediate intervention — an expeditious response to an operational issue identified internally or externally that adversely affects customers when there is not enough time for the normal processes to work. The acceptance criteria are:

    1. The issue must impact a population of customers either locally or nationally (i.e., generally more than one person).

    2. The issue is so highly visible, sensitive, and/or critical that there is no time for the normal corrective processes to work (e.g., EITC refunds that are denied for electronically filed returns due to a programming error).

    3. A resolution can be identified within a week to ten business days of identifying the issue.

  2. Number of immediate interventions — TAS calculates this balanced measure as the number of immediate interventions the ODTA's office began working in a given period of time. For example, if the ODTA's office began working on five immediate interventions in the first quarter of a fiscal year, the measure would be five for that period. TAS considers the work to begin when the immediate intervention is entered into the Service Wide Action Planning (SWAP) database. TAS uses the SWAP "scheduled start date" to determine when work began.

Source of the Data
  1. The SWAP database is the ODTA's source for the number of immediate interventions. TAS uses the "Custom Report for SWAP" to determine the number of immediate interventions worked in a given period of time. This is done by sorting the report by the "scheduled start date" and counting the number that fall within the given time period. This report is number 12 on the main Reports page of the ODTA's web site: http://advocate.no.irs.gov/reports.asp.
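
  A minimal sketch of this counting follows, under the assumption that each report record carries a scheduled start date (the record layout here is hypothetical; the real data comes from the SWAP report). The same counting logic applies to the advocacy projects measure described below.

```python
from datetime import date

# Illustrative record layout; real data comes from the "Custom Report for SWAP".
swap_records = [
    {"issue": "EITC refunds denied by a programming error",
     "scheduled_start_date": date(2001, 10, 15)},
    {"issue": "Misrouted taxpayer notices",
     "scheduled_start_date": date(2002, 1, 8)},
]

def count_started(records, period_start, period_end):
    """Count records whose SWAP "scheduled start date" falls in the period."""
    return sum(1 for r in records
               if period_start <= r["scheduled_start_date"] <= period_end)

# First quarter of fiscal year 2002 (October 1 - December 31, 2001) -> 1
assert count_started(swap_records, date(2001, 10, 1), date(2001, 12, 31)) == 1
```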

Advocacy Projects

  1. This section describes the ODTA's balanced measure for advocacy projects.

Definitions
  1. Advocacy project — a means by which TAS can research an advocacy issue and develop and test a proposed solution for eventual implementation. Through advocacy projects, TAS identifies and addresses systemic and procedural issues, analyzes underlying causes of the problem, and proposes corrective action. TAS classifies the projects into six general categories:

    1. A recommendation for a modification and/or addition to hardware or software such as IDRS, TAMIS, AIMS, ICS, and the Non-Master File.

    2. A recommendation for a change to an existing law or the enactment of a new law.

    3. A change to existing operational procedures or processes.

    4. A change to existing regulations, revenue rulings, or policy statements.

    5. A change in the way the tax law is administered, resulting in the consistent application of the tax law and fair and equitable treatment of taxpayers.

    6. A change to oral and written communications with our customers and stakeholders.

  2. Number of advocacy projects — TAS calculates this balanced measure by counting the advocacy projects the ODTA's office began working in a given period of time. For example, if the ODTA's office began working on 35 advocacy projects in the first quarter of a fiscal year, the measure would be 35 for that period. TAS considers the work to begin when the advocacy project is entered into the SWAP database. TAS uses the SWAP "scheduled start date" to determine when work began.

Source of the Data
  1. The SWAP database is the source for the number of advocacy projects. TAS uses the "Custom Report for SWAP" to determine the number of advocacy projects worked in a given period of time. This is done by sorting the report by the "scheduled start date" and counting the number that fall within the given time period. This report is number 12 on the main Reports page of the ODTA's web site: http://advocate.no.irs.gov/reports.asp.

Exhibit 13.5.1-1, TAS Diagnostic Tools

Image: 32657001.gif

Exhibit 13.5.1-2, TAS Balanced Measures

Image: 32657002.gif

Exhibit 13.5.1-3, Sharing Quality Review Results

Images: 32657003.gif, 32657004.gif, 32657005.gif

Exhibit 13.5.1-4, Quality Sample Sizes for Centralized Quality Review

Images: 32657006.gif, 32657007.gif

Exhibit 13.5.1-5, Selecting Cases for the Centralized Quality Review Sample

Images: 32657008.gif, 32657009.gif, 32657010.gif

Exhibit 13.5.1-6, Documentation Required for TAS Cases

Image: 32657011.gif

Exhibit 13.5.1-7, TAS Casework Quality Index (CQI) Standards

Images: 32657012.gif, 32657013.gif, 32657014.gif

Exhibit 13.5.1-8, DIALOGUE Process

Image: 32657015.gif

Exhibit 13.5.1-9, Outreach Plan Template

Image: 32657016.gif

Exhibit 13.5.1-10, Local Outreach Plan Development

Images: 32657017.gif, 32657018.gif

Exhibit 13.5.1-11, Outreach Event Report

Image: 32657019.gif