6.410.9 Training Evaluation Policy

Manual Transmittal

November 27, 2018

Purpose

(1) This transmits revised Internal Revenue Manual (IRM) 6.410.9 Training Evaluation Policy for Servicewide Learning & Education.

(2) This IRM establishes general policy and guidance needed to conduct Levels 1 - 4 Training Evaluations.

Material Changes

(1) This IRM 6.410.9 has been rearranged to comply with new rules for formatting the internal controls of IRMs

(2) Subsections have been renumbered throughout the IRM

(3) This IRM 6.410.9 contains revised information related to evaluation policy within the IRS

(4) Enterprise Learning Management System (ELMS) has been changed to Learning Management System (LMS) throughout the IRM

(5) Training items have been changed to training events throughout the IRM

(6) Social Security Number (SSN) has been changed to Standard Employee Identifier (SEID) throughout the IRM

(7) 6.410.9.1, Program Scope and Objectives – Added Program Scope and Objectives, to include Purpose, Audience, Policy Owner, Program Owner, Primary Stakeholders and Program Goals, regarding the Training Evaluation Policy IRM

(8) 6.410.9.1.1, Background – Added Background information about Training Evaluation Policy and the Kirkpatrick Model

(9) 6.410.9.1.2, Authority – Added Authority information for the Training Evaluation Policy

(10) 6.410.9.2, Definitions – Revised definitions: (f) removed Dimensions from the entry for New World Kirkpatrick Model and added descriptions for the terms "targeted outcomes" and "support and accountability package"; (h) added updated URL for additional resources

(11) 6.410.9.2.1, Regulations and Resources – Changed section title from Evaluation Policy & Regulations to Regulations and Resources. Updated URLs for regulation and resource references

(12) 6.410.9.3, Level 1 Evaluation Overview – Changed section title from Overview of Level 1 Evaluation to Level 1 Evaluation Overview. Revised to more precisely define the Level 1 evaluation and purpose, and the explanation of New World Kirkpatrick Model Level 1 dimensions

(13) 6.410.9.3.1, Level 1 Evaluation Policies – Changed section title from Level 1 Evaluation to Level 1 Evaluation Policies. Added to explain the policies regarding Level 1 Evaluation

(14) 6.410.9.3.2, Level 1 Evaluation Process – Added to explain the purpose and process for Level 1 Evaluation

(15) 6.410.9.4, Level 2 Evaluation Overview – Changed section title from Overview of Level 2 Evaluation to Level 2 Evaluation Overview. Revised to more precisely define the Level 2 evaluation and purpose, and the explanation of New World Kirkpatrick Model Level 2 dimensions

(16) 6.410.9.4.1, Level 2 Evaluation Policies – Changed section title from Level 2 Evaluation to Level 2 Evaluation Policies. Added to explain the policies regarding Level 2 Evaluation

(17) 6.410.9.4.2, Level 2 Evaluation Process – Added to explain the purpose and process for Level 2 Evaluation

(18) 6.410.9.5, Level 3 Evaluation Overview – Changed section title from Overview of Level 3 Evaluation to Level 3 Evaluation Overview. Revised to more precisely define the Level 3 evaluation and purpose, and the explanation of New World Kirkpatrick Model Level 3 dimensions

(19) 6.410.9.5.1, Level 3 Evaluation Policies – Changed section title from Level 3 Evaluation to Level 3 Evaluation Policies. Added to explain the policies regarding Level 3 Evaluation

(20) 6.410.9.5.2, Level 3 Evaluation Process – Added to explain the purpose and process for Level 3 Evaluation

(21) 6.410.9.6, Level 4 Evaluation Overview – Changed section title from Overview of Level 4 Evaluation to Level 4 Evaluation Overview. Revised to more precisely define the Level 4 evaluation and purpose, and the explanation of New World Kirkpatrick Model Level 4 dimensions

(22) 6.410.9.6.1, Level 4 Evaluation Policies – Added to explain the policies regarding Level 4 Evaluation

Effect on Other Documents

This IRM supersedes IRM 6.410.9 dated February 24, 2016, Training Evaluation Policy.

Audience

All Servicewide Learning & Education employees, including program managers, education specialists and others involved in training activities within each business unit (BU).

Effective Date

(11-27-2018)

Mark W. Scholz, Director, Leadership, Education and Delivery Services

Program Scope and Objectives

  1. This IRM provides general policy and guidance needed to conduct Levels 1 - 4 training evaluations at the IRS.

  2. Purpose: This transmittal revises IRM 6.410.9, Training Evaluation Policy for Servicewide Learning & Education (SL&E).

  3. Audience: This IRM is intended to be used by all divisions and functions.

  4. Policy Owner: Human Capital Office (HCO); Leadership, Education and Delivery Services (LEADS) Division owns this IRM.

  5. Program Owner: HCO-LEADS Servicewide Strategic Training Management (SSTM) is responsible for the administration, policy development and updates related to this IRM.

  6. Primary Stakeholders: SL&E, Embedded L&E and LEADS are the primary stakeholders.

  7. Program Goals: IRM 6.410.9 supports the LEADS goal of supporting a culture of continuous self-development and mission-focused learning for employees by providing tools for measuring and reporting the overall effectiveness of training programs at the Internal Revenue Service (IRS).

Background

  1. This IRM provides essential information on the policy and procedures for developing, delivering and managing Levels 1 – 4 training evaluations at the IRS.

Authority

  1. Per Title 5 of the Code of Federal Regulations (5 CFR 410.202), agencies must evaluate their training programs annually to determine how well such plans and programs contribute to mission accomplishment and meet organizational performance goals. The IRS, through the combined efforts of its Learning & Education (L&E) community, develops and delivers a variety of training courses that are designed to meet these goals. Effectively evaluating these training courses helps us determine course improvement opportunities and the overall value that training is adding to business results and the organization’s mission.

  2. The IRS uses a four-level approach to training evaluation, based on one of the leading industry standards in the field of training. This systematic approach, known as the Kirkpatrick Model, with New World Kirkpatrick Model enhancements, ensures that the IRS gathers complete data that measures the effectiveness of training, while enabling course owners to identify course improvement opportunities. The Kirkpatrick Model effectively demonstrates the strategic value that our training programs add to business results and the organization’s mission.

  3. The IRS follows established procedures outlined in the instructional design model known as the Training Development Quality Assurance System (TDQAS) to administer training evaluations. Specific procedures address the evaluation of learner reaction and achievement and the evaluation of job performance and organizational impact. Adherence to TDQAS processes ensures procedural consistency in the overall quality of our training systems and products. Actions associated with evaluations are addressed in IRM 6.410.1.3.1.6 (Learning & Education Policy) at http://irm.web.irs.gov/Part6/Chapter410/Section1/IRM6.410.1.asp.

Definitions

  1. The following terms are used throughout IRM 6.410.9, and specific training definitions are listed in most subsections of this IRM:

    1. Blended Learning - An integrated strategy for delivering training that involves using more than one delivery method (e.g., classroom, online, self-study and coaching) in a single program to achieve the desired performance.

    2. Critical Training - Any training that is required to meet employee and organizational performance goals.

    3. Curriculum-based training - A series of courses designed to achieve specific training goals based on job competencies for employee occupational groups.

    4. Learning Management System (LMS) - The official system of record for training. It supports the development and delivery of course evaluations. Online access to LMS resources may be found at https://elms.web.irs.gov.

    5. Evaluation Management System (EMS) - The official system of record for managing training evaluation data.

    6. New World Kirkpatrick Model - A four-level blended approach to training evaluation that focuses on transferring learning to behavior and aligning training with organizational goals. The four levels are:

      Level 1: Reaction - The degree to which participants find the training favorable, engaging and relevant to their jobs.

      Level 2: Learning - The degree to which participants acquire the intended knowledge, skills, attitude, confidence and commitment based on their participation in the training.

      Level 3: Behavior - The degree to which participants apply what they learned during training when they are back on the job.

      Level 4: Results - The degree to which targeted outcomes occur as a result of training and the support and accountability package. Targeted outcomes are training programs’ highest-level goals. These should tie into the organization’s overall purpose and its business need.

    7. Mission Critical Occupations (MCOs) - Occupations that comprise the unique core competencies of the IRS and/or have the greatest direct impact on the bureau’s ability to meet its mission. Additional information on MCOs may be found at http://hco.web.irs.gov/workplan/misscritocc.html.

    8. Technology-Enabled Learning (TEL) - TEL, or eLearning, is a broad set of applications and processes that includes web-based learning, computer-based learning, virtual classrooms, and digital learning. It leverages technology within an environment of sound course design. Additional resources may be found at https://portal.ds.irsnet.gov/sites/VL005/Pages/default.aspx/.

    9. Training - The process of providing employees the programs, courses or other instruction they need to develop new skills to perform a task or process and/or enhance or improve current skills in their individual job performance. Effective training may result in observably changed behavior.

    10. Training Event - Instruction conducted in a structured learning environment (both eLearning and non-eLearning) that contains behavioral objectives linked to or derived from job competencies or tasks.

    11. Training Development Quality Assurance System (TDQAS) - TDQAS is the IRS’s educationally benchmarked systems approach to training and instructional systems development. TDQAS is designed to ensure high-quality training products and services. The six phases of TDQAS are assessment, analysis, design, development, implementation and evaluation. For additional information see IRM 6.410.1.3.1 - Training Development Quality Assurance System (TDQAS). Access to TDQAS may be found at https://portal.ds.irsnet.gov/sites/VL005/Pages/default.aspx/.

Regulations and Resources

  1. Regulations and resources are shown below:

    1. U.S. Office of Personnel Management Training & Development (5 CFR 410) at https://www.opm.gov/policy-data-oversight/training-and-development/.

    2. U.S. Office of Personnel Management Training Evaluation Field Guide at https://www.opm.gov/WIKI/uploads/docs/Wiki/OPM/training/Field%20Guide%20to%20Training%20Evaluation_6-8-2011-FINAL.pdf.

    3. National Agreement - Article 30 - Training at http://core.publish.no.irs.gov/docs/pdf/d11678--2015-10-00.pdf.

Level 1 Evaluation Overview

  1. Level 1 evaluation is defined as the degree to which participants find training favorable, engaging and relevant to their jobs. Its purpose is to assess the immediate reaction of learners to the training. The data gathered may be used to improve the quality of future training events.

  2. New World Kirkpatrick Model Level 1 dimensions include:

    • Customer Satisfaction - participants’ satisfaction with the training.

    • Engagement - the degree to which participants are actively involved in contributing to the learning experience.

    • Relevance - the degree to which participants believe they will have the opportunity to use or apply what they learned in training on the job.

  3. Each business unit will access Level 1 evaluation reports from the LMS and EMS and analyze them for appropriate actions based on trainees' feedback.

Level 1 Evaluation Policies

  1. Level 1 evaluation will be conducted on all training events. A training event is defined as instruction conducted in a structured learning environment (both eLearning and non-eLearning) that contains behavioral objectives linked to or derived from job competencies or tasks.

  2. All training events in the LMS must be documented with “Level 1 is Required”.

  3. All training events must use the appropriate standardized online Level 1 evaluation form available in the LMS. The use of the standardized online Level 1 evaluation forms ensures data is consistently captured, compiled, quantified and reported. Feedback from participants, instructors, the L&E community and the National Treasury Employees Union (NTEU) was considered in the development of the standardized online forms.

  4. The completion of Level 1 evaluation by learners is not mandatory, but is strongly encouraged.

  5. Level 1 evaluation will be anonymous and confidential (e.g., no names, SEIDs or other self-identifying data will be used), as illustrated in the example following this list.

  6. Level 1 evaluation should be completed during normal duty hours.

  7. Alternative methods for completing Level 1 evaluation should be considered for employees with limited or restricted access to computers or the internet. These options will be determined by the business units and based on available resources.

  8. Alternative methods for completing Level 1 evaluation will be provided by the business units to visually impaired or motor-skills impaired individuals.

  9. Whenever feasible, Servicewide L&E personnel can use evaluative information from instructors to ensure high-quality, effective and efficient delivery of training.

  10. By agreement, NTEU will be provided information upon request.
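
    Example:

    The anonymity policy in paragraph (5) requires that Level 1 data be handled without names, SEIDs or other self-identifying information. The following is a minimal, illustrative sketch (not an IRS tool) of how a business unit analyst might drop hypothetical identifier columns from a raw evaluation export before analysis; the column names and file layout are assumptions, not actual LMS or EMS export specifications.

        import csv

        # Hypothetical identifier columns; an actual LMS/EMS export layout may differ.
        IDENTIFYING_COLUMNS = {"Name", "SEID", "Email"}

        def anonymize_export(in_path: str, out_path: str) -> None:
            """Copy an evaluation export, dropping columns that could identify a respondent."""
            with open(in_path, newline="", encoding="utf-8") as src, \
                 open(out_path, "w", newline="", encoding="utf-8") as dst:
                reader = csv.DictReader(src)
                kept = [c for c in reader.fieldnames if c not in IDENTIFYING_COLUMNS]
                writer = csv.DictWriter(dst, fieldnames=kept)
                writer.writeheader()
                for row in reader:
                    writer.writerow({c: row[c] for c in kept})

        # Usage (assumed file names):
        # anonymize_export("level1_raw_export.csv", "level1_anonymous.csv")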

Level 1 Evaluation Process

  1. The Level 1 evaluation process enables business units to gather and interpret participants’ reactions to the training they received. Analysis of Level 1 results enables course developers/owners to make initial determinations on course improvement opportunities based on participant feedback.

  2. Level 1 results from mission-critical training should be analyzed from data obtained from the LMS, EMS and/or alternative means used by the business unit. The analysis will be documented in a consolidated report following guidelines established by each business unit.

    Note:

    Based on a business unit’s program priorities, one-time training events may be excluded from analysis.

  3. Documentation includes any or all of the following (see the example following this list for an illustrative calculation of overall satisfaction scores):

    • Analysis and assessment of Level 1 reports with feedback

    • Overall training satisfaction scores

    • Recommendations and/or corrective actions as a result of Level 1 analysis that can be applied to future training programs

    • Annotation if no recommendations are needed
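
    Example:

    The following is a minimal sketch of how overall training satisfaction scores might be compiled for the consolidated report, assuming an anonymized export in which each row holds one participant's 1-5 ratings for the Level 1 dimensions. The column names and file format are illustrative assumptions, not the standardized Level 1 form's actual fields.

        import csv
        from statistics import mean

        # Assumed rating columns on a 1-5 scale; the actual Level 1 form fields may differ.
        DIMENSIONS = ["Satisfaction", "Engagement", "Relevance"]

        def summarize_level1(path: str) -> dict:
            """Return the average rating per Level 1 dimension for one training event."""
            ratings = {dim: [] for dim in DIMENSIONS}
            with open(path, newline="", encoding="utf-8") as f:
                for row in csv.DictReader(f):
                    for dim in DIMENSIONS:
                        value = (row.get(dim) or "").strip()
                        if value:  # skip unanswered items
                            ratings[dim].append(float(value))
            return {dim: round(mean(vals), 2) if vals else None
                    for dim, vals in ratings.items()}

        # Usage (assumed file name):
        # print(summarize_level1("level1_anonymous.csv"))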

Level 2 Evaluation Overview

  1. Level 2 evaluation is defined as the degree to which participants acquire the intended knowledge, skills, attitude, confidence and commitment based on their participation in training. Level 2 evaluation can be either knowledge-based or performance-based.

  2. New World Kirkpatrick Model Level 2 dimensions include:

    • Knowledge - "I know it."

    • Skill - "I can do it right now."

    • Attitude - "I believe this will be worthwhile to do on the job."

    • Confidence - "I think I can do it on the job."

    • Commitment - "I intend to do it on the job."

  3. Level 2 evaluation reports will be accessed from the Servicewide LMS, EMS and/or alternative means, such as SharePoint, and analyzed to determine the following (see the example after this list):

    1. Level 2 compliance rates

    2. Whether trainees acquired the intended skills and knowledge based on test results
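
    Example:

    The following is an illustrative sketch of the two determinations above: a Level 2 compliance rate (the share of participants with a recorded result) and an average test score. The data values and layout are hypothetical and do not reflect an actual LMS or EMS report format.

        # Hypothetical Level 2 results for one training event; None means no result was recorded.
        results = [85.0, 92.5, None, 78.0, 88.0, None, 95.0]

        recorded = [score for score in results if score is not None]

        compliance_rate = len(recorded) / len(results) * 100  # percent of participants with a result
        average_score = sum(recorded) / len(recorded)         # mean of recorded scores

        print(f"Level 2 compliance rate: {compliance_rate:.1f}%")
        print(f"Average test score: {average_score:.1f}")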

Level 2 Evaluation Policies

  1. Level 2 evaluation should focus on mission-critical training programs. On an annual basis, each L&E organization, through discussion with their business unit customer(s), will identify mission-critical training for which Level 2 evaluations will be required and document the status in the LMS.

  2. If a Level 2 is required, a test or other assessment must be developed and administered during training.

  3. Level 2 can be quantitative or qualitative in nature. Examples include, but are not limited to:

    1. knowledge tests

    2. knowledge checks

    3. case studies

    4. team exercises

    5. simulations

    6. presentations

    7. observations

    8. work reviews

    9. self-assessments

    10. pass/fail assessments

    11. role plays

    12. interviews

    13. action learning

  4. Methods for capturing Level 2 results should be developed and included in the course.

  5. If a Level 2 is required, but not developed, the appropriate reason code should be recorded in the LMS.

  6. L&E will ensure that when using vendor-developed training products, the contract between the Service and the vendor stipulates that the results of Level 2 evaluations will be forwarded to L&E.

    Note:

    Continuing Professional Education (CPE) courses where accreditation (continuing education units or credits) is sought must adhere to test requirements and standards of the accrediting authority (i.e., American Institute of Certified Public Accountants or National Association of State Boards of Accountancy).

    Criminal Investigation (CI) training courses must adhere to the standards set forth by the Federal Law Enforcement Training Accreditation Board.

Level 2 Evaluation Process

  1. Events created in the LMS should have the Level 2 field populated with “Required” or “Optional.”

  2. If an event has a Level 2 requirement, a test or assessment must be developed and administered.

  3. Procedures to capture Level 2 scores or results must be developed and included in the course.

  4. Once the Level 2 is administered, the course instructor or designee should compile Level 2 results, document them on the Form 14156, Instructor Data Capture Form (IDCF) and submit the form to Centralized Delivery Services (CDS) for input into the LMS.

  5. If a Level 2 evaluation is required but not administered, the instructor or designee will indicate the reason the Level 2 was not administered on the IDCF and submit the form to CDS for input to the LMS.

  6. Level 2 evaluation reports will be anonymous and confidential (e.g., no names, SEIDs, or other self-identifying data will be recorded).

  7. Level 2 results from mission-critical training that is identified as “Required” should be analyzed from data obtained from the LMS, EMS, and/or alternative means and documented in a consolidated report following guidelines established by each business unit.

    Note:

    Based on a business unit's program priorities, one-time training events may be excluded from analysis.

  8. Documentation includes any or all of the following (see the example following this list for an illustrative test item analysis):

    • Test item analysis

    • Average test scores

    • Performance-based results

    • Level 2 final results from pass/fail, complete/not complete assessments, etc.

    • Recommendations and/or corrective actions as a result of Level 2 analysis that can be applied to future training programs

    • Annotation if no recommendations are needed
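
    Example:

    The following is a minimal sketch of a test item analysis and average test score calculation, assuming scored knowledge-test responses are available in tabular form. The data structure is a hypothetical illustration, not an EMS format; the per-item difficulty index shown here is simply the proportion of trainees who answered each item correctly.

        from statistics import mean

        # Hypothetical scored responses: one row per trainee, one value per test item (True = correct).
        responses = [
            [True,  True,  False, True],
            [True,  False, False, True],
            [True,  True,  True,  True],
            [False, True,  False, True],
        ]

        num_items = len(responses[0])

        # Difficulty index per item: proportion of trainees who answered it correctly.
        difficulty = [mean(row[i] for row in responses) for i in range(num_items)]

        # Each trainee's score as a percentage, then the group average.
        scores = [sum(row) / num_items * 100 for row in responses]

        for i, p in enumerate(difficulty, start=1):
            print(f"Item {i}: {p:.0%} answered correctly")
        print(f"Average test score: {mean(scores):.1f}%")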

Level 3 Evaluation Overview

  1. Level 3 evaluation is defined as the degree to which participants apply what they learned during training when they are back on the job.

  2. New World Kirkpatrick Model Level 3 dimensions include:

    • Required Drivers - processes and systems that reinforce, monitor, encourage and reward performance. Examples of required drivers include work review checklists, job aids, recognition, coaching and mentoring, etc.

    • Critical Behaviors - the few, specific actions that will have the biggest impact on the desired results if performed consistently on the job. For the IRS, critical behaviors are the same as enabling learning objectives (ELOs).

    • On-the-Job Learning - a culture and expectation that individuals are responsible for maintaining the knowledge and skills to enhance their own performance.

  3. Level 3 evaluation should be conducted after completion of training and after employees have been given an appropriate time to demonstrate application of learning. Level 3 evaluations are typically delivered one to six months after training.

  4. The agency recommends the use of the Servicewide EMS to develop and administer Level 3 evaluations.

  5. Level 3 evaluation reports will be accessed from the Servicewide EMS or alternative means such as SharePoint and analyzed for appropriate actions based on trainees’ feedback.

Level 3 Evaluation Policies

  1. Level 3 evaluation should focus on mission-critical training programs. On an annual basis, each L&E organization, through discussion with their business unit customer(s), will identify mission-critical training for which Level 3 evaluations will be required and document the status in the LMS.

  2. If a Level 3 is required, corresponding participant and manager evaluation instruments must be developed and administered within the appropriate timeframe.

  3. It is recommended that Level 1 and Level 2 evaluations be administered before a Level 3 evaluation is conducted.

  4. Alternative methods for completing Level 3 evaluation should be considered for employees with limited or restricted access to computers or the internet. These options will be determined by the business units and based on available resources.

  5. Level 3 evaluation instruments that are created using the Servicewide EMS will be reviewed/approved by the Evaluation Program Manager in Servicewide Strategic Training Management (SSTM) prior to delivery.

  6. Managers and learners will complete Level 3 evaluations during normal duty hours.

  7. Level 3 evaluation reports will be anonymous and confidential (i.e., no names, SEIDs or other self-identifying data will be recorded).

  8. Completion of Level 3 evaluation by learners is voluntary, but strongly encouraged.

  9. Completion of the corresponding manager Level 3 evaluation by the learner’s manager is mandatory.

  10. No adverse actions will be taken against learners or managers based on results of Level 3 data reports.

  11. Learners choosing not to complete the evaluation will not be adversely affected.

  12. Information gathered through Level 3 evaluation will not be used to evaluate learners on their annual performance appraisals.

Level 3 Evaluation Process

  1. Events created in the LMS should have the Level 3 field populated with “Required” or “Optional.”

  2. If an event has a required Level 3, corresponding participant and manager surveys should be developed and administered within appropriate timeframes.

  3. All Level 3 surveys should include the mandatory training transfer questions to provide results for Servicewide reporting.

  4. Once developed, all surveys will be sent to the Evaluation Program Manager in SSTM for review and approval, prior to delivery.

  5. Decisions on the number of participants to receive surveys (i.e., all participants vs. a sample of participants) should be based on discussions between L&E and the business unit customer(s). The number of recipients for each survey should be sent to the Evaluation Program Manager in SSTM for documentation purposes. An illustrative sampling sketch appears at the end of this subsection.

  6. Corresponding manager surveys should be sent to all surveyed participants’ managers to capture comparative feedback.

  7. Level 3 results for mission-critical training that is identified as “Required” should be analyzed from data obtained from the LMS, EMS, and/or alternative means and documented in a consolidated report following guidelines established by each business unit.

    Note:

    Based on a business unit's program priorities, one-time training events may be excluded from analysis.

  8. Documentation includes any or all of the following:

    • Analysis of Level 3 reports with feedback

    • Recommendations and/or corrective actions as a result of Level 3 analysis that can be applied to future training programs

    • Annotation if no recommendations are needed.


    Note:

    The EMS team can be consulted on an alternative process when program managers want to capture results from alternative Level 3 instruments, such as focus group interviews.
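
    Example:

    Paragraph (5) above leaves the survey sampling decision to L&E and the business unit. The following minimal sketch illustrates one way to draw a random sample of participant survey recipients and identify their managers for the corresponding manager surveys; the roster, field layout and sample size are assumptions for illustration, not prescribed values.

        import random

        # Hypothetical participant roster: (participant, manager) pairs for a completed training event.
        roster = [
            ("Participant A", "Manager 1"),
            ("Participant B", "Manager 1"),
            ("Participant C", "Manager 2"),
            ("Participant D", "Manager 3"),
            ("Participant E", "Manager 2"),
        ]

        SAMPLE_SIZE = 3  # assumed value; set through L&E and business unit discussion

        sampled = random.sample(roster, k=min(SAMPLE_SIZE, len(roster)))

        participant_recipients = [participant for participant, _ in sampled]
        manager_recipients = sorted({manager for _, manager in sampled})  # each manager surveyed once

        print("Participant survey recipients:", participant_recipients)
        print("Corresponding manager survey recipients:", manager_recipients)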

Level 4 Evaluation Overview

  1. Level 4 evaluation is defined as the degree to which targeted outcomes occur as a result of training. It assesses the impact of the improvements in the trainees' performance on the mission of the organization. Departments must be able to identify and provide data on measures that have organizational impact.

  2. New World Kirkpatrick Model Level 4 dimensions include leading indicators, which are short-term observations and measurements suggesting that critical behaviors are on track to create a positive impact on desired results. Examples of leading indicators include balanced measure ratings, compliance ratings, customer response ratings, etc.

  3. Support and accountability are subsequent reinforcements found in the on-the-job environment. There is often a disconnect between what is learned in training and the transfer of that learning to on-the-job behavior after training takes place. Thus, support and accountability are important components of achieving Level 4. Ensure that Level 4 results contain illustrations of 1) organizational support and 2) employee and managerial accountability for putting new training behaviors into practice.

  4. Level 4 evaluation results will be accessed from the Servicewide EMS or alternative means, such as SharePoint, and analyzed for appropriate actions.

Level 4 Evaluation Policies

  1. L&E will confer with their assigned business unit executives or designee(s) during the TDQAS Assessment Phase to determine:

    1. program expectations including the customer’s desired results of the training,

    2. if a Level 4 evaluation is warranted and/or feasible,

    3. whether resources and funding are available and will be committed,

    4. the extent the training impact can be isolated against other factors, and

    5. if customer-valued data on business results is available.

  2. Once the need for a Level 4 is determined, the status should be documented as “Required” in the LMS.

  3. It is recommended that Level 1, 2 and 3 evaluations be administered before developing the Level 4 evaluation. Feedback from these levels will provide important quantitative and qualitative data supporting conclusions that link training to business results.

  4. Results for training programs qualifying for a Level 4 evaluation must be documented in EMS.

  5. Level 4 executive reports will be designed following guidelines established by each business unit.