13.5.1  TAS Balanced Measure System (Cont. 1)

13.5.1.5 
Taxpayer Advocate Service Balanced Measures

13.5.1.5.4 
External Customer Satisfaction

13.5.1.5.4.3  (10-01-2001)
Use and Limitations of Customer Satisfaction Survey Data

  1. Customer Satisfaction Survey data is used only to improve organizational operations.

  2. Taxpayer privacy — Survey reports will not identify individual IRS employees nor the taxpayers or practitioners with whom TAS interacted. Those conducting the survey assure external customers that their responses are confidential and protect their privacy by combining their responses with those of other taxpayers and by reporting only statistical totals.

  3. Employee privacy — TAS will never use survey information to identify an individual employee or to evaluate the performance of an individual employee.

  4. Actions necessary to assure privacy — Area offices will scrutinize and sanitize data before sharing them with field offices to ensure that no one can identify taxpayers, powers of attorney, or employees. Area offices will delete from their databases records that could result in employee or taxpayer identification. Where the amount of data produced for an individual office is so small as to jeopardize the privacy of either the taxpayer or employee, we will not report the data for that organization but will combine the data with all the offices in the area.

  5. Access to customer satisfaction survey data — Individual offices may access data and reports only for their own offices, their areas, and TAS as a whole. Cross-comparisons of specific individual offices are not appropriate, and IRS prohibits them.

  6. Outputs — There are two different outputs from the customer satisfaction surveys.

    1. Quarterly reports — Summaries of survey results appear in quarterly reports that provide a synopsis of customer satisfaction responses for each area and for all of TAS. Reports contain an overall customer satisfaction score and cross-tabulated tables that compare customer characteristics (e.g., major issue code and complexity code) against overall satisfaction scores (a cross-tabulation sketch follows this list).

    2. Annual reports — Reports for individual offices will be produced once the data are deemed statistically significant. Area offices will receive a database containing the customer satisfaction data pertinent to their local offices, which they may use in developing analytical approaches. If there are insufficient data to assure statistical validity for an individual office, that office's data will be combined with the data for the overall area. Local offices should use that data in developing their improvement plans.
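
The following is a minimal, illustrative sketch of how such a cross-tabulation might be produced. It assumes survey responses are available in a table with hypothetical column names (major_issue_code, complexity_code, overall_satisfaction) and uses the pandas library purely for illustration; it does not describe the actual report production process.

    # Minimal sketch (not the official report process): cross-tabulating
    # hypothetical survey fields against overall satisfaction with pandas.
    import pandas as pd

    # Hypothetical survey extract; real column names and scales may differ.
    responses = pd.DataFrame({
        "major_issue_code": ["A", "A", "B", "B", "C"],
        "complexity_code":  [1, 2, 1, 2, 1],
        "overall_satisfaction": [4, 5, 3, 4, 2],   # e.g., 1 (low) to 5 (high)
    })

    # Average overall satisfaction for each issue code / complexity combination.
    summary = pd.crosstab(
        index=responses["major_issue_code"],
        columns=responses["complexity_code"],
        values=responses["overall_satisfaction"],
        aggfunc="mean",
    )
    print(summary)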

13.5.1.5.5  (10-01-2001)
Employee Satisfaction

  1. The employee satisfaction measure assesses how well management provides employees with the support, resources, and tools they need to accomplish their jobs.

  2. The measure also focuses on the work environment, manager/employee relations, and other factors that affect an employee's ability to do a good job.

  3. Employee satisfaction, as rated by employees, affects their productivity and the quality of customer service they provide.

13.5.1.5.5.1  (10-01-2001)
Definition of the Employee Satisfaction Measure

  1. IRS and TAS define employee satisfaction as a measure of employee perception of management practices, organization barriers, and overall work environment that affect employees' efforts to do a good job.

13.5.1.5.5.2  (10-01-2001)
Measuring Employee Satisfaction

  1. IRS designed its employee satisfaction survey to evaluate management's effectiveness and the quality of the work environment and to identify specific issues that affect work group dynamics and productivity.

  2. Managers and their employees use the results of this survey to identify key areas for improvement.

13.5.1.5.5.3  (10-01-2001)
Accountability

  1. Each manager's performance plan includes employee satisfaction.

  2. Managers use survey results to develop personal commitments.

13.5.1.5.5.4  (10-01-2001)
Roles and Responsibilities — Managers

  1. Managers must promote employee satisfaction as part of their daily operations.

  2. All managers will share the results of the survey with their employees.

  3. Meetings to discuss survey results are mandatory for all first-line managers. Managers should use the workgroup report as a starting point for their discussions.

  4. When a manager schedules a meeting which includes bargaining unit employees, (s)he must invite a union representative to attend the meeting.

  5. Managers should elevate, through the normal chain of command, any unresolved issues and any issues that they cannot resolve locally.

  6. Managers should provide sufficient time and resources for all personnel who are involved in planning for and administering the employee satisfaction survey.

13.5.1.5.5.5  (10-01-2001)
Roles and Responsibilities — Employees

  1. Employees should support the employee satisfaction survey process by taking the survey and answering all survey items as candidly and honestly as possible.

  2. They should also participate fully in the survey results meetings to discuss employee satisfaction issues, using the data to plan actions and to follow up on commitments made within the work group.

13.5.1.5.5.6  (10-01-2001)
Partnership with NTEU

  1. NTEU should be a partner in all activities related to the employee satisfaction survey.

  2. These activities include, but are not limited to:

    1. marketing;

    2. developing local survey items;

    3. administering the survey;

    4. communicating survey results;

    5. ensuring steward coverage for workgroup meetings;

    6. selecting and using facilitators;

    7. developing elevated issues; and

    8. developing action plans to improve employee satisfaction.

13.5.1.5.6  (10-01-2001)
Outreach Resources Spent versus Plan

  1. Hours and dollars spent on outreach efforts will indicate the level of effort devoted to promoting TAS to our internal and external customers. TAS will compare actual hours and dollars spent to planned hours and dollars (a simple comparison sketch follows this list).

  2. The measure will answer the question — How much time, money, and other resources did TAS plan to put, and actually put, into educating the public about the role of the Taxpayer Advocate through outreach programs?
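
The following minimal sketch illustrates the comparison of actual to planned outreach resources. The figures and field names are hypothetical; this is not the official reporting format.

    # Minimal sketch (illustrative only): comparing actual outreach hours and
    # dollars to plan. Field names and figures are hypothetical.
    planned = {"hours": 400, "dollars": 12_000}
    actual  = {"hours": 350, "dollars": 13_500}

    for resource in planned:
        variance = actual[resource] - planned[resource]
        pct_of_plan = 100 * actual[resource] / planned[resource]
        print(f"{resource}: planned {planned[resource]}, actual {actual[resource]}, "
              f"variance {variance:+}, {pct_of_plan:.0f}% of plan")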

13.5.1.5.6.1  (10-01-2001)
Outreach Plans

  1. TAS expects each local office to develop an outreach plan that meets the needs of taxpayers.

  2. Each plan will be unique in its targeted audiences as well as methods of delivery based on the needs and demographics of the local offices.

  3. The local office will use the outreach plan to identify and complete the activities planned to improve awareness of the TAS program, solicit feedback on IRS problems, and improve customer service.

  4. Outreach plans must include the following information/components (a minimal record sketch follows the note below):

    1. Office — the local TA office submitting the plan.

    2. Fiscal year — self-explanatory.

    3. Audience — the group or targeted audience you want to reach with your message.

    4. Basis — the reason this particular audience was selected.

    5. Method — the plan to deliver your message to the audience (e.g., speech, targeted mail out, booth at seminar/fair, etc.).

    6. Projected cost — the cost for your office to conduct the identified activity.

    7. Responsible party — the staff member responsible for this activity.

    8. Target date — the date you want this activity to be completed.

    9. Actual date — the date you actually completed the activity.

    10. Comments — Annotated information from the approval process (area use) or local data (local office).

    11. Submitted by — name of person submitting the outreach plan.

    12. Approved by — signature of person (usually ATA) approving the plan. If portions of the plan or costs are not approved, the approving office will annotate this.

      Note:

      Exhibit 13.5.1-9, Outreach Plan Template, provides an example of an outreach plan you may use.
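
The following is a minimal, illustrative sketch of the plan components above expressed as a single record. It is not the Exhibit 13.5.1-9 template, and the field names and types are assumptions made only for illustration.

    # Minimal sketch (illustrative only) of the outreach plan components above
    # as a record; not the official Exhibit 13.5.1-9 template format.
    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    @dataclass
    class OutreachPlanItem:
        office: str                 # local TA office submitting the plan
        fiscal_year: int
        audience: str               # targeted audience for the message
        basis: str                  # reason this audience was selected
        method: str                 # e.g., speech, targeted mail out, seminar booth
        projected_cost: float
        responsible_party: str
        target_date: date
        actual_date: Optional[date] = None   # filled in when the activity is completed
        comments: str = ""                   # approval-process or local annotations
        submitted_by: str = ""
        approved_by: str = ""                # usually the ATA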

  5. When preparing a local outreach plan, consider other plans, including the national TAS strategic plan, TAS communication goals, etc. Also consider partnerships with other operating/functional divisions within IRS, such as TEC, SPEC, etc. When possible, base the plan on research data such as demographic information. See Exhibit 13.5.1-10, Local Outreach Plan Development.

  6. Each area office is responsible for reviewing and approving the specific events and associated costs in local outreach plans. Focus each plan on activities that will yield the greatest return based on the needs of the local office.

  7. LTAs will submit an outreach activity plan to their ATA by a designated date. The ATA will review each plan, provide feedback or recommendations for changes, and approve the plan.

13.5.1.5.6.2  (10-01-2001)
Reporting Outreach Costs

  1. Report outreach costs by the following methods:

    1. Quarterly reporting of the status of outreach plans;

    2. Use of the Project Cost Accounting System (PCAS) code for the Automated Financial System (AFS) and the Travel Reimbursement and Accounting System (TRAS); and

    3. Time reporting of outreach activities on the Single Entry Time Reporting (SETR) system.

13.5.1.5.6.3  (10-01-2001)
Quarterly Reporting Outreach Plans

  1. Local TAS offices will review and report progress, changes, and results to their ATAs by the 15th day after the end of each quarter.

  2. Include in reports updates to the templates, raw data, and anecdotal data.

  3. Include as raw data the number of outreach activities conducted for the reporting period and the number of people who either heard the message or had the potential to hear the message.

  4. Exhibit 13.5.1-11, Outreach Event Report, is an example of an instrument that may be used to gather data on each event.

  5. Include as anecdotal data any information received regarding the receptiveness of the outreach by the public or internal stakeholders. This may include comments, opinions, suggestions, etc.

  6. The ATA will forward reports to the Director, National TAS Communications and Liaison, by the 25th day after the end of each quarter.

  7. National TAS Communications and Liaison will consolidate data for use in analysis, budget requests, development of communications and marketing plans, etc.

13.5.1.5.6.4  (10-01-2001)
Project Cost Accounting System (PCAS) Code

  1. TAS has established a special code to track travel and other costs related to TAS outreach activities.

  2. The new PCAS code for AFS and TRAS is TASM1.

  3. LTAs must ensure the use of this code when preparing all travel vouchers and any procurement requests relating to outreach.

13.5.1.5.6.5  (10-01-2001)
Time Reporting of Outreach Activities on the Single Entry Time Reporting (SETR) System

  1. SETR has seven Organization Function Program (OFP) codes for TAS employees to use when reporting time spent on outreach activities (a summary lookup sketch follows this list):

    1. 36750 — Internal — used to record preparatory time, travel time, and presentation time for outreach activities involving other IRS employees.

    2. 36751 — Congressional Office — used to record preparatory time, travel time, and presentation time for outreach activities involving Senators, Congresspersons, and/or their staff members. This category also includes Congressional liaison meetings and Congressional Affairs Program (CAP) conferences.

    3. 36752 — Tax Practitioner — used to record preparatory time, travel time, and presentation time for outreach activities involving tax practitioners/practitioner groups (e.g., attorneys, certified public accountants, public accountants, enrolled agents, electronic return originators, Tax Executive Institute, etc.).

    4. 36753 — External Meetings/Speeches/Events — used to record preparatory time, travel time, and presentation time for outreach activities involving external groups when the audience cannot be better defined by a more specific outreach category. This category encompasses efforts related to conferences, fairs, education groups/institutions, etc.

    5. 36754 — Media — used to record preparatory time, travel time, and presentation time involving outreach activities and interviews with media, including print, radio, and television.

    6. 36770 — EITC — used to record any time expended on working EITC cases or for preparatory time, travel time, and presentation time for any outreach activity primarily related to EITC.

    7. 36771 — EITC Overtime — used to record any overtime expended on working EITC cases or any overtime granted for preparatory time, travel time, and presentation time for any outreach activity primarily related to EITC.

  2. There are other outreach activities, such as the development or distribution of IRS publications, forms, and notices, and web site work, which do not fit cleanly into one of the SETR codes listed above. Capture the time devoted to these outreach activities using the SETR code that most closely fits the work performed.

  3. Report all time spent on preparing for, traveling to, conducting, and performing follow-up activities related to outreach under one of these codes.

  4. Do not report time spent on preparing an outreach plan, reviewing its effectiveness, revising the plan, or reporting on progress under these outreach codes. Report these activities under the "Management and Support" OFP code.
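
The following minimal sketch shows the seven outreach OFP codes above as a lookup table, together with a simple tally of reported hours by category. The time-entry format is hypothetical and is used only to illustrate how reported time could be summarized by code; it is not a description of SETR itself.

    # Minimal sketch (illustrative only): the seven outreach OFP codes above as
    # a lookup table for tallying reported time by outreach category.
    OUTREACH_OFP_CODES = {
        "36750": "Internal",
        "36751": "Congressional Office",
        "36752": "Tax Practitioner",
        "36753": "External Meetings/Speeches/Events",
        "36754": "Media",
        "36770": "EITC",
        "36771": "EITC Overtime",
    }

    # Hypothetical time entries: (OFP code, hours reported).
    entries = [("36752", 3.5), ("36754", 1.0), ("36752", 2.0)]

    hours_by_category: dict[str, float] = {}
    for code, hours in entries:
        category = OUTREACH_OFP_CODES.get(code, "Other/Management and Support")
        hours_by_category[category] = hours_by_category.get(category, 0.0) + hours
    print(hours_by_category)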

13.5.1.5.7  (10-01-2001)
Outreach Effectiveness

  1. Outreach effectiveness/results will be a high-level measure of direct TAS receipts versus total TAS cases received for a particular period of time and will provide data on how first-time TAS users became aware of the TAS program (a calculation sketch follows the list below).

  2. This measure ties directly to the input measure — outreach resources spent versus plan (see IRM 13.5.1.5.6, Outreach Resources Spent versus Plan). As an output measure, it answers the question — What kind of results did you achieve based on what you did to educate people about TAS?

  3. TAS uses the "how received" indicator on TAMIS to determine whether the case was received directly from a taxpayer or practitioner or indirectly (i.e., a referral) from an operating or functional division.

  4. TAS also uses a specific field on TAMIS to record how the taxpayer learned about the program. Form 911, Section IV, contains a block entitled "Outreach" for recording this information, and a corresponding field on TAMIS captures the data. Possible entries for the Form 911 block and TAMIS field are:

    1. 00 — Default (used for indirect TAS receipts) — this code identifies a case that did not come as a direct contact from a taxpayer. In other words, use this code for cases identified by an operational or functional division employee.

    2. 10 — Repeat customer — the taxpayer has used PRP and/or TAS services before and is already aware of the service provided.

    3. 20 — IRS publications/forms/notices — the taxpayer learned of TAS through information contained in IRS publications, forms, and/or notices and not through a personal presentation or TAS outreach activities.

    4. 30 — Web sites — the taxpayer learned of TAS from a web site.

    5. 40 — Congressional office — the taxpayer learned of TAS through contact with a Senator, Congressperson, and/or a Congressional staff member.

    6. 50 — Tax practitioner — the taxpayer learned of TAS from a tax practitioner/practitioner group such as an attorney, accountant, enrolled agent, electronic return originator (ERO), Tax Executive Institute, etc.

    7. 60 — Media — the taxpayer learned of TAS through the media (e.g., television, radio, newspaper, magazine, radio spot, etc.).

    8. 70 — External meetings/speeches/events — the taxpayer learned of TAS through an outreach event conducted by the IRS such as a conference, fair, education group/institution, etc.

    9. 80 — Other — the taxpayer learned of TAS through another means not defined above.

    10. 90 — Reserved — used by TAS employees when the taxpayer became aware of TAS through a unique outreach effort that the local office wanted to track separately.
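
The following minimal sketch illustrates the calculation described above: the share of direct receipts among total receipts, and a breakdown of direct receipts by outreach code. The record layout and field names are hypothetical and are not the actual TAMIS schema.

    # Minimal sketch (illustrative only): direct TAS receipts versus total
    # receipts, with direct receipts broken down by a hypothetical outreach
    # code field. Field names are illustrative, not the TAMIS schema.
    cases = [
        {"how_received": "direct",   "outreach_code": "20"},
        {"how_received": "direct",   "outreach_code": "60"},
        {"how_received": "indirect", "outreach_code": "00"},
        {"how_received": "direct",   "outreach_code": "10"},
    ]

    direct = [c for c in cases if c["how_received"] == "direct"]
    print(f"Direct receipts: {len(direct)} of {len(cases)} "
          f"({100 * len(direct) / len(cases):.0f}%)")

    by_code: dict[str, int] = {}
    for case in direct:
        by_code[case["outreach_code"]] = by_code.get(case["outreach_code"], 0) + 1
    print(by_code)   # how first-time users learned about TAS, by outreach code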

13.5.1.5.8  (10-01-2001)
Immediate Interventions

  1. This section describes the ODTA balanced measure for immediate interventions.

13.5.1.5.8.1  (10-01-2001)
Definitions

  1. Immediate intervention — an expeditious response to an operational issue identified internally or externally that adversely affects customers when there is not enough time for the normal processes to work. The acceptance criteria are:

    1. The issue must impact a population of customers either locally or nationally (i.e., generally more than one person).

    2. The issue is so highly visible, sensitive, and/or critical that there is no time for the normal corrective processes to work (e.g., EITC refunds that are denied for electronically filed returns due to a programming error).

    3. The resolution can be identified within a week to ten business days of identification.

  2. Number of immediate interventions — TAS calculates this balanced measure as the number of immediate interventions the ODTA's office began working in a given period of time. For example, if the ODTA's office began working on five immediate interventions in the first quarter of a fiscal year, the measure would be five for that period. TAS considers the work to begin when the immediate intervention is entered into the Service Wide Action Planning (SWAP) database. TAS uses the SWAP "scheduled start date" to determine when work began.

13.5.1.5.8.2  (10-01-2001)
Source of the Data

  1. The SWAP database is the ODTA's source for the number of immediate interventions. TAS uses the "Custom Report for SWAP" to determine the number of immediate interventions worked in a given period of time. This is done by sorting the report by the "scheduled start date" and counting the number that fall within the given time period. This report is number 12 on the main Reports page of the ODTA's web site: http://advocate.no.irs.gov/reports.asp.
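
The following minimal sketch illustrates the counting step described above: filtering records by "scheduled start date" and counting those that fall within the reporting period. The record layout shown is hypothetical, not the actual SWAP or report format.

    # Minimal sketch (illustrative only): count records whose "scheduled start
    # date" falls within the reporting period. Record layout is hypothetical.
    from datetime import date

    swap_records = [
        {"title": "Intervention A", "scheduled_start_date": date(2001, 10, 15)},
        {"title": "Intervention B", "scheduled_start_date": date(2001, 11, 2)},
        {"title": "Intervention C", "scheduled_start_date": date(2002, 1, 8)},
    ]

    # First quarter of the fiscal year (October through December).
    period_start, period_end = date(2001, 10, 1), date(2001, 12, 31)

    count = sum(
        1 for r in swap_records
        if period_start <= r["scheduled_start_date"] <= period_end
    )
    print(f"Immediate interventions begun this quarter: {count}")   # 2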

13.5.1.5.9  (10-01-2001)
Advocacy Projects

  1. This section describes the ODTA's balanced measure for advocacy projects.

13.5.1.5.9.1  (10-01-2001)
Definitions

  1. Advocacy project — An advocacy project is a means by which TAS can research an advocacy issue and develop and test a proposed solution for eventual implementation. Through advocacy projects, TAS identifies and addresses systemic and procedural issues, analyzes the underlying causes of problems, and proposes corrective action. TAS classifies the projects into six general categories:

    1. A recommendation for a modification and/or addition to hardware or software such as IDRS, TAMIS, AIMS, ICS, and the Non-Master File.

    2. A recommendation for a change to an existing law or the enactment of a new law.

    3. A change to existing operational procedures or processes.

    4. A change to existing regulations, revenue rulings, or policy statements.

    5. A change in the way the tax law is administered, resulting in the consistent application of the tax law and fair and equitable treatment of taxpayers.

    6. A change to oral and written communications with our customers and stakeholders.

  2. Number of advocacy projects — TAS calculates this balanced measure by counting the advocacy projects the ODTA's office began working in a given period of time. For example, if the ODTA's office began working on 35 advocacy projects in the first quarter of a fiscal year, the measure would be 35 for that period. TAS considers the work to begin when the advocacy project is entered into the SWAP database. TAS uses the SWAP "scheduled start date" to determine when work began.

13.5.1.5.9.2  (10-01-2001)
Source of the Data

  1. The SWAP database is the source for the number of advocacy projects. TAS uses the "Custom Report for SWAP" to determine the number of advocacy projects worked in a given period of time. This is done by sorting the report by the "scheduled start date" and counting the number that fall within the given time period. This report is number 12 on the main Reports page of the ODTA's web site: http://advocate.no.irs.gov/reports.asp.

Exhibit 13.5.1-1  (10-01-2001)
TAS Diagnostic Tools

Exhibit 13.5.1-2  (10-01-2001)
TAS Balanced Measures

Exhibit 13.5.1-3  (10-01-2001)
Sharing Quality Review Results

Exhibit 13.5.1-4  (10-01-2001)
Quality Sample Sizes for Centralized Quality Review

Exhibit 13.5.1-5  (10-01-2001)
Selecting Cases for the Centralized Quality Review Sample

Exhibit 13.5.1-6  (10-01-2001)
Documentation Required for TAS Cases

Exhibit 13.5.1-7  (10-01-2001)
TAS Casework Quality Index (CQI) Standards

Exhibit 13.5.1-8  (10-01-2001)
DIALOGUE Process

Exhibit 13.5.1-9  (10-01-2001)
Outreach Plan Template

Exhibit 13.5.1-10  (10-01-2001)
Local Outreach Plan Development

Exhibit 13.5.1-11  (10-01-2001)
Outreach Event Report
