2.127.2 IT Testing Process and Procedures

Manual Transmittal

May 17, 2017

Purpose

(1) This transmits revised Internal Revenue Manual (IRM) 2.127.2, Testing Standards and Procedures, IT Testing Process and Procedures.

Material Changes

(1) IRM 2.127.2 was updated to remove the word "software" from the IRM title and to remove software-specific references, as applicable.

(2) Direction is provided for Agile/Iterative testing methodology.

(3) Editorial changes were made throughout the IRM to improve clarity and consistency.

Effect on Other Documents

IRM 2.127.2, dated March 17, 2015, is superseded.

Audience

This process description is applicable to all Information Technology (IT) organizations, contractors, and stakeholders performing testing.

Effective Date

(05-17-2017)

S. Gina Garza
Chief Information Officer

Process Description

  1. IT Testing

Introduction

Administration
  1. All proposed changes to this document should be directed to Enterprise Systems Testing (EST), the owner of this process description, and pursued via the Integrated Process Management (IPM) process to clearly define interfaces, roles, and responsibilities, and to coordinate participation and collaboration among stakeholders.

Purpose of Process Description
  1. This process description explains what happens within the IT Testing process and provides an operational definition of the major components of the process. It specifies, in a complete, precise, and verifiable manner, the requirements, design, and behavior characteristics of the IT Testing process. A process description (PD) is a documented expression of a set of activities performed to achieve a given purpose. Tailoring of this process to meet the individual needs of each project is covered in the Tailoring Guidelines section of this document.

  2. For the purpose of this document, roles such as Business Lead, Project Manager, Test Manager, etc. are provided to describe a set of responsibilities for performing a particular set of related activities.

Document Overview
  1. This document describes a set of interrelated activities that transform inputs into outputs to achieve a given purpose, and states the guidelines that all projects should follow regarding the IT Testing process. The format and definitions used to describe each step of the IT Testing process are given below:

    • Purpose – The objective of the process step

    • Roles and Responsibilities – The responsibilities of the individuals or groups for accomplishing a process step

    • Entry Criteria – The elements and conditions (state) necessary to trigger the beginning of a process step

    • Input – Data or material needed to perform the process step. Input can be modified to become an output

    • Process Activity – The list of activities that make up the process step

    • Output – Data or material that are created (artifacts) as part of, produced by, or resulting from performing the process step

    • Exit Criteria – The elements or conditions (state) necessary to trigger the completion of a process step
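
    The structure above can be illustrated with a minimal sketch. The `ProcessStep` record and its field names are hypothetical, not part of any EST template; the sketch simply shows how entry criteria gate the start of a step.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class ProcessStep:
        """One step of the IT Testing process, using the fields defined above."""
        name: str
        purpose: str
        roles: list[str] = field(default_factory=list)
        entry_criteria: list[str] = field(default_factory=list)
        inputs: list[str] = field(default_factory=list)
        activities: list[str] = field(default_factory=list)
        outputs: list[str] = field(default_factory=list)
        exit_criteria: list[str] = field(default_factory=list)

        def can_start(self, satisfied: set[str]) -> bool:
            """A step may begin only when every entry criterion is satisfied."""
            return all(criterion in satisfied for criterion in self.entry_criteria)

    planning = ProcessStep(
        name="Perform Planning",
        purpose="Identify the activities required to perform test planning",
        entry_criteria=["Funding approved", "Requirements received", "Test team established"],
    )
    # One entry criterion is still outstanding, so the step cannot begin.
    print(planning.can_start({"Funding approved", "Requirements received"}))  # False
    ```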

Process Overview

Work Products
  1. This section describes the work products needed to execute the process (known as inputs) as well as those produced by the IT Testing process (known as outputs).

Work Products Used by this Process (Inputs)
  1. The following work products are used to assist in the implementation of the IT Testing process:

    • Project documentation (Project Management Plan (PMP), Tailoring Plan, Project Charter, etc.)

    • Previous test artifacts (lessons learned, test plans, test cases, test data, End of Test Completion Report (EOTCR), End of Test Report (EOTR), etc.)

    • Requirements documentation (Business System Report (BSR), Unified Work Request (UWR), Statement of Work (SOW), Change Request (CR), etc.)

    • Functional documentation (Design Specification Package (DSP), Functional Specification Package (FSP), Interface Control Document (ICD), Program Requirements Package (PRP), Design Specification Report (DSR), Business Traceability Model (BTM), Core Record Layout (CRL), etc.)

    • Operational documentation (Computer Operator Handbook (COH), Computer Program Book (CPB), User Manual, Business Process Model (BPM), etc.)

    • Security documentation (Risk Assessment, Impact Assessment, etc.)

    • Privacy documentation (Privacy Impact Assessment (PIA), System of Records Notice (SORN), Privacy Requirements, Privacy Testing Guidance, etc.)

    • Projects following the Agile/Iterative path must adhere to the instructions provided in IRM 2.16.1, Enterprise Life Cycle (ELC) - Enterprise Life Cycle Guidance and the Enterprise Systems Testing (EST) Iterative Development and Testing Process Description (see IRM 2.127.2.4.2)

    Note:

    Abbreviations and Acronyms can be found in Exhibit B. Examples of Requirements, Functional, Operational and Security Documentation can be found in Exhibit C.

Work Products Produced by this Process (Outputs)
  1. The following work products (artifacts) are produced by the IT Testing process and may be used as inputs to other processes.

    • Approved Test Plan (Unit, Systems Acceptability Testing (SAT) Plan, System Test Plan (STP), etc.)

    • Approved End of Test Report (Test log, EOTR, EOTCR, etc.)

    • Completed project folder checklist

Roles and Responsibilities
  1. Many roles are involved in the IT Testing process. This section defines the roles used throughout this document in terms of their responsibilities. The responsibilities for each role may vary based upon project structure.

  2. Roles and Responsibilities

    Role Description / Definition of Responsibility
    Business Lead
    • Create, communicate, coordinate, and interpret the business requirements

    • Approve various artifacts

    Developer
    • Coordinate identified issues/problems/defects with other testing or project stakeholders or provide a workaround

    • Document all coding

    • Participate in peer reviews of coding and documentation

    • Perform unit testing on the created/changed code

    • Notify project manager of testing status

    • Provide appropriate artifacts to the next phase of testing/deployment

    • Create, update, and maintain appropriate artifacts for testing phases

    Project Manager
    • Ensure team understanding of the business requirements

    • Develop the high level strategies needed to support the development life cycle

    • Ensure that Verification and Validation methods will be planned, documented, and performed

    • Ensure process activities are performed timely

    • Ensure coordination activities are held

    • Ensure issues/problems/defects not adequately addressed are raised to the appropriate level for resolution

    • Ensure all milestones are met for agreed to changes

    • Approve various artifacts

    • Ensure ELC project deliverables and work products have been completed

    Test Analyst
    • Create test related work products (test cases/scripts, test datasets, etc.)

    • Prepare any required reporting documentation for the respective testing activities

    • Execute and document test activities

    • Manage testing requirements, create, duplicate, and execute test cases/scripts, identify and document testing problems, and report testing status

    • Analyze appropriate documentation to extract project requirements

    Test Lead
    • Ensure that all work products are completed (Test Plan, EOTR, etc.)

    • Ensure the verification and acceptance of all test plans and documentation

    • Triage open testing problems, update problem status, and provide solutions or workarounds for test issues/problems/defects

    • Create, update, and maintain appropriate artifacts for testing phases

    Test Manager
    • Provide guidance on test strategy and scope, and approve test plans in accordance with standards and procedures

    • Manage test issues/problems/defects logged by testers, generate problem reports, and ensure issues/problems/defects are assigned to the appropriate developer for resolution

Process Flow Diagrams
  1. IT Testing Flow Diagram

    Figure 2.127.2-1

    This is an Image: 60808001.gif

IT Testing Process Steps

Step 1: Perform Planning

Purpose
  1. The purpose of this process step is to identify the activities required to perform test planning within the IT organization.

Roles and Responsibilities
  1. The project manager is responsible for assigning team responsibilities; facilitating team understanding of the business requirements; identifying and providing the ELC testing artifacts at the Milestone Readiness Review; and developing the high level strategies needed to deliver the solution.

  2. The test manager is responsible for developing the test plan; developing test strategies; conducting peer reviews; verifying requirements are clear and testable; and confirming that entrance and exit criteria are met.

  3. The business lead is responsible for defining business requirements.

  4. The test lead is responsible for analyzing documentation and creating the project folder.

Entry Criteria
  1. Generally, the Perform Planning step occurs after the following events have occurred:

    • Funding approved

    • Requirements received

    • Test team established

Input
  1. The following are inputs to this process step:

    • Previous testing artifacts (lessons learned, test plans, test cases, test data, end of test report, etc.)

    • Requirements Documentation (BSR, UWR, SOW, CR, etc.)

    • Functional Documentation (DSP, FSP, ICD, PRP, DSR, etc.)

    • Operational Documentation (COH, CPB, User Manual, etc.)

    • Security Documentation (Risk Assessment, Impact Assessment, etc.)

    Note:

    Documentation is due 30 days prior to the planned test start date or as mutually agreed upon by the key stakeholders involved.
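
    The 30-day lead time in the note above amounts to simple date arithmetic. A minimal sketch (the function name and example date are illustrative only):

    ```python
    from datetime import date, timedelta

    def documentation_due(test_start: date, lead_days: int = 30) -> date:
        """Documentation is due `lead_days` days before the planned test start
        (30 by default, unless the key stakeholders agree otherwise)."""
        return test_start - timedelta(days=lead_days)

    # For a test planned to start June 30, 2017, documentation is due May 31, 2017.
    print(documentation_due(date(2017, 6, 30)))  # 2017-05-31
    ```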

Process Activity
  1. Assess Requirement(s)

  2. Establish Test Environment

  3. Train Test Team

  4. Develop Test Plan(s)

Output
  1. The following are outputs of this process step:

    • Test plan

    • Test schedule

    • Project folder

    • Project folder checklist

    • Sensitive But Unclassified (SBU) Data Use Questionnaire - Form 14664, and Sensitive But Unclassified (SBU) Data Use Request - Form 14665, as applicable. See IRM 10.5.8, Sensitive But Unclassified (SBU) Data Policy: Protecting SBU in Non-Production Environments

      Note:

      A new policy contained in IRM 10.5.8 entitled, “Information Technology (IT) Security, Sensitive But Unclassified (SBU) Data Policy: Protecting SBU in Non-Production Environments”, replaces the previous Live Data policy and process. Refer to IRM 10.5.8 for complete details, as this process varies greatly from the previous process.

Exit Criteria
  1. This process step is complete when:

    • Requirements baselined

    • Test environment(s) established

    • Draft test plan created

    • Approved repository established (e.g., Rational Quality Manager (RQM), Document Management for Information Technology (DocIT), etc.)

    • Project folder established

Step 2: Perform Preparation

Purpose
  1. The purpose of this process step is to outline the activities required to perform test preparation.

Roles and Responsibilities
  1. The test manager is responsible for providing guidance on test strategy and scope, and approving the test plans in accordance with standards and procedures.

  2. The test analyst is responsible for reviewing requirements; developing test data; creating test cases/scripts; and identifying and documenting problems.

  3. The business lead is responsible for clarifying business requirements.

  4. The test lead is responsible for status reporting and ensuring that work products are complete.

Entry Criteria
  1. Generally, the Perform Preparation step occurs after the following events have occurred:

    • Scope of test agreed upon

    • Draft test plan created

    • Requirements baselined

    • Test environment established

    • Approved repository established

Input
  1. The following are inputs to this process step:

    • Test schedule

    • Test plan

    • Project folder checklist

    • Documentation received (BSR, UWR, Requirements Traceability Matrix (RTM), FSP, DSR, COH, etc.)

    • Sensitive But Unclassified (SBU) Data Use Questionnaire - Form 14664, and Sensitive But Unclassified (SBU) Data Use Request - Form 14665, as applicable. See IRM 10.5.8, Sensitive But Unclassified (SBU) Data Policy: Protecting SBU in Non-Production Environments

Process Activity
  1. Verify Test Environment

  2. Review Documentation

  3. Prepare Test Cases/Scripts/Data

  4. Conduct Test Readiness Review

Output
  1. The following are outputs of this process step:

    • Test plan

    • Test cases, test scripts and test data

    • Test Readiness Review (TRR) checklist and memorandum

    • Project folder checklist

    • Requirements Traceability Document (RTM, Requirements Traceability Verification Matrix (RTVM), etc.)

    • Status reports

    • Peer Review Defect and Resolution Report, if applicable
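
    A requirements traceability document such as the RTM or RTVM maps each requirement to the test cases that verify it. The sketch below, using hypothetical requirement and test-case IDs, shows the coverage check such a matrix supports:

    ```python
    # Hypothetical requirement and test-case IDs, for illustration only.
    rtm = {
        "REQ-001": ["TC-101", "TC-102"],
        "REQ-002": ["TC-103"],
        "REQ-003": [],  # not yet traced to any test case
    }

    def uncovered(rtm: dict[str, list[str]]) -> list[str]:
        """Return the requirements that have no test case traced to them."""
        return [req for req, cases in rtm.items() if not cases]

    # Any requirement listed here must gain coverage before the TRR.
    print(uncovered(rtm))  # ['REQ-003']
    ```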

Exit Criteria
  1. This process step is complete when:

    • TRR completed

    • Test plan finalized and issued

    • Test cases, test scripts and test data developed

    • Environments verified

    • Repository confirmed

Step 3: Execute and Document Test

Purpose
  1. The purpose of this process step is to outline the activities required to execute and document tests.

Roles and Responsibilities
  1. The test manager is responsible for developing and managing the test plan and test schedule; conducting peer reviews; monitoring and redistributing the test workload; verifying requirements; validating entrance and exit criteria; reporting testing status; and developing test artifacts.

  2. The test lead is responsible for ensuring all test products are completed and documented.

  3. The test analyst is responsible for developing test data; creating and executing test cases/scripts; identifying and documenting problems; reporting testing status; generating problem reports; and developing test artifacts.

  4. The developer is responsible for verifying requirements, resolving problems, and updating technical documentation.

Entry Criteria
  1. Generally, the Execute and Document Test step occurs after the following events have occurred:

    • TRR acceptance

    • Program transmitted to the testing environments

    • Hardware installed if applicable

    • Repository confirmed

Input
  1. The following are inputs to this process step:

    • Project folder checklist

    • Test cases, test scripts and test data ready for execution

Process Activity
  1. Execute Test Cases/Scripts

  2. Document Results

  3. Report Test Status

Output
  1. The following are outputs of this process step:

    • Test status report(s)

    • Test case waiver or deferral, if applicable

    • Problem tickets, if applicable

Exit Criteria
  1. This process step is complete when:

    • Test cases dispositioned

    • Defects dispositioned

    • Repository updated with test results

    • Test status reports updated to reflect results
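
    The execute-and-document cycle above can be sketched in miniature: run each test case, disposition it by comparing actual output to expected results, and roll the dispositions up into a status report. The case structure and IDs below are hypothetical:

    ```python
    def execute_case(case):
        """Run one test case and disposition it by comparing actual to expected output."""
        actual = case["run"]()
        return "Passed" if actual == case["expected"] else "Failed"

    # Hypothetical test cases; real cases would exercise the system under test.
    cases = [
        {"id": "TC-101", "run": lambda: 2 + 2, "expected": 4},
        {"id": "TC-102", "run": lambda: "N/A", "expected": "OK"},
    ]

    # Disposition every case, then summarize for the test status report.
    results = {c["id"]: execute_case(c) for c in cases}
    summary = {
        "total": len(results),
        "passed": sum(1 for s in results.values() if s == "Passed"),
        "failed": sum(1 for s in results.values() if s == "Failed"),
    }
    print(results)  # {'TC-101': 'Passed', 'TC-102': 'Failed'}
    print(summary)  # {'total': 2, 'passed': 1, 'failed': 1}
    ```

    A failed disposition would feed the defect repository (ClearQuest, KISAM, etc.) described in the procedure below.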

Step 4: Closeout Test

Purpose
  1. The purpose of this process step is to outline the activities required to close out the test. This process step provides guidance on the documentation that must be completed for the work accomplished and summarizes the actual results throughout the test.

Roles and Responsibilities
  1. The project manager is responsible for ensuring the ELC project deliverables and work products have been completed.

  2. The test manager is responsible for ensuring all tests have been dispositioned and all required deliverables and work products (e.g., EOTR, Project Folder Checklist, etc.) have been completed.

  3. The test lead is responsible for developing the deliverables and work products as well as conducting closeout meetings.

  4. The test analyst is responsible for ensuring all required test artifacts are in the project folder.

  5. The business lead is responsible for confirming that all requirements have been dispositioned.

Entry Criteria
  1. Generally, the Closeout Test step occurs after the following events have occurred:

    • All test cases have been dispositioned

    • Defects have been dispositioned

    • Test status reports have been updated to reflect results

Input
  1. The following are inputs to this process step:

    • Test status reports

    • Test Case Waiver or Deferral, if applicable

    • Updated repositories (e.g., DocIT, RQM, RequisitePro (ReqPro), Requirements and Demand Management (RADM) tool, etc.)

Process Activity
  1. Finalize Test Artifacts

  2. Issue End of Test Reports

  3. Conduct Closeout Meetings

  4. Finalize Project Folder

Output
  1. The following are outputs of this process step:

    • Approved end of test reports

    • Approved project folder checklist

    • Lessons learned report

Exit Criteria
  1. This process step is complete when:

    • Closeout meetings are concluded (Post Implementation Review (PIR), Lessons Learned, etc.)

Process Measurement
  1. Management will regularly review quantifiable data related to different aspects of the IT Testing process in order to make informed decisions and take appropriate action, if necessary.

Training
  1. Training will be conducted as appropriate for each test team. Training will be provided according to the project's training plan.

Procedure

  1. Introduction

Administration
  1. All proposed changes to this document should be directed to the EST Test Support and Automation Branch, the owner of this procedure, and pursued via the IPM process to clearly define interfaces, roles, and responsibilities, and to coordinate participation and collaboration among stakeholders.

Purpose of Procedure
  1. This document defines step-by-step instructions for conducting the activities used to implement the IT Testing procedure. The purpose of a procedure document is to institutionalize and formalize the preferred method staff use to perform tasks. The objective is to have everyone use the same tools and techniques and follow the same repeatable steps. This consistency allows the organization to quantify how well the procedure is working and ensure optimum efficiency.

  2. Tailoring of this procedure in order to meet the individual needs of each project is covered in the Tailoring Guidelines section of this document.

  3. For the purpose of this document, roles such as Project Manager, Test Manager, Business Lead, etc. are provided to describe a set of responsibilities for performing a particular set of related activities.

Procedure Overview
  1. IT Testing

Purpose
  1. The purpose of this procedure is to outline the steps required to standardize testing to ensure that customer requirements are met. This procedure provides guidance on the test activities to perform beginning with the initial preparation and ending with the stakeholder's concurrence.

Related Process Artifacts
  1. Related Artifacts are:

    • ELC templates (STP, Test Plan (TP), EOTR, EOTCR, System Deployment Plan (SDP))

    • Project Folder Checklist

    • IT Peer Review Procedure

    • IT Test Reference Guide

    • IT Test Type Identification Guide

    • Documentation (e.g., PMP, design documents, reports)

Related Directives
  1. Related Directives are:

    • IRM 1.11, Internal Management Documents System

    • IRM 2.110.1, Engineering Directive

    • IRM 2.127.1, IT Test Policy Directive

    • IRM 1.15, Records and Information Management, Document 12990

Entry Criteria
  1. Generally, the IT Testing procedure occurs after the following events have occurred:

    • Funding approved

    • Scope of test agreed upon

    • Test team established

    • Requirement(s) received

Input
  1. The following are inputs to this IT Testing procedure:

    • Previous testing artifacts (lessons learned, test plans, test cases, test data, end of test report, etc.)

    • Requirements Documentation (BSR, UWR, SOW, CR, etc.)

    • Functional Documentation (DSP, FSP, ICD, PRP, DSR, etc.)

    • Operational Documentation (COH, CPB, User Manual, etc.)

    • Security Documentation (Risk Assessment, Impact Assessment, etc.)

Activities
  1. This procedure covers the following activities:

  2. Activities

    A1 Perform Planning
    A2 Perform Preparation
    A3 Execute and Document Test
    A4 Closeout Test
Output
  1. The primary outputs of this IT Testing procedure are:

    • Approved Documentation (EOTR, EOTCR, etc.)

    • Completed Test schedule

    • Completed Project Folder

    • Lessons Learned report

    • Approved Project Folder Checklist

    • Sensitive But Unclassified (SBU) Data Use Questionnaire - Form 14664, and Sensitive But Unclassified (SBU) Data Use Request - Form 14665, as applicable. See IRM 10.5.8, Sensitive But Unclassified (SBU) Data Policy: Protecting SBU in Non-Production Environments

Exit Criteria
  1. This IT Testing procedure is exited when:

    • Test cases have been dispositioned

    • Defects have been dispositioned

    • Closeout meetings are concluded (PIR, lessons learned, etc.)

Procedure Flow Diagram
  1. A1 - Perform Planning

    Figure 2.127.2-2

    This is an Image: 60808002.gif

  2. A2 - Perform Preparation

    Figure 2.127.2-3

    This is an Image: 60808003.gif

  3. A3 - Execute and Document Test

    Figure 2.127.2-4

    This is an Image: 60808004.gif

  4. A4 - Closeout Test

    Figure 2.127.2-5

    This is an Image: 60808005.gif

Activity and Steps
  1. This section delineates the activity steps, including roles and tools or templates, needed to perform each step of this Procedure.

    A1: Perform Planning
    (See Figure 2.127.2–2)
    Steps / Roles
    1. Assess Requirements

    • Review requirements documentation (e.g., UWR, BSR, CR)

    • Review design documentation (e.g., DSR, FSP, PRP)

    • Conduct requirements analysis, document testable requirements, and determine test types

      See IRM 2.127.2.4.2 for Test Type Identification Guide and IT Test Reference Guide

    • Conduct walkthrough meeting to approve final testable requirements

    • Brief project stakeholders on which types of tests will be performed

    • Develop the Work Breakdown Structure (WBS)

    • Establish project folder

    Test Analyst
    Test Lead
    Test Manager
    Project Manager
    2. Establish Test Environment

    • Submit request for testing services

    • Identify the test environment(s)

    • Identify test tools

    • Submit request for Enterprise File Transfer Utility (EFTU) support, Request for Computer Services (RCS), etc., if applicable

    • Sensitive But Unclassified (SBU) Data Use Questionnaire - Form 14664, and Sensitive But Unclassified (SBU) Data Use Request - Form 14665, as applicable. See IRM 10.5.8, Sensitive But Unclassified (SBU) Data Policy: Protecting SBU in Non-Production Environments

    Test Manager
    Project Manager
    3. Train Test Team

    • Train test team on test processes, test tools, and test environments (RQM, Rational Functional Tester, RADM Tool, Rational Requirements Composer (RRC), Knowledge Incident/Problem Service Asset Management (KISAM), etc.), if applicable

    Test Lead
    4. Develop Test Plan(s)

    • Review previous Lessons Learned and/or PIR documents

    • Develop the test schedule

    • Develop the test plan using template

      See IRM 2.127.2.4.2 for TP Template

    Test Lead
    Test Manager
    A2: Perform Preparation
    (See Figure 2.127.2–3)
    Steps / Roles
    1. Verify Test Environment

    • Verify the type of equipment

      • The test environment should simulate the production environment

      • Verify test environment is functional

    • Coordinate Interface Database Agreements/Files Communication Status Report (FCSR), if applicable

    • Verify all test items listed in the test plan are available, which includes Job Control Language (JCL)/ Executive Control Language (ECL), test tools, and test data files

    • Update execution control language for test environment:

      • JCL

      • ECL

      • JavaScript

      • Active Server Pages, etc.

    Test Analyst
    Test Lead
    Test Manager
    2. Review Documentation

    • Perform requirements analysis

    • Review all received related design documentation

    • Conduct peer review, as applicable

      See IRM 2.127.2.4.2 for IT Peer Review Procedure

    Test Analyst
    Test Lead
    3. Prepare Test Cases/Scripts/Data

    • Review external documentation

    • Create test cases/scripts

    • Create test data

    • Review internal documentation (e.g., test plan, interface database agreement, test cases/scripts, etc.)

    • Report test status

    • Conduct peer review, as applicable

    Test Analyst
    4. Conduct Test Readiness Review (TRR)

    • Identify TRR participants

    • Prepare TRR checklist and memorandum

    • Conduct TRR meeting

      See IRM 2.127.2.4.2 for TRR Procedure

    Test Lead
    Test Manager
    A3: Execute and Document Test
    (See Figure 2.127.2–4)
    Steps / Roles
    1. Execute Test Cases/Scripts

    • Execute test cases/scripts

    • Validate processing of data

    • Determine pass/fail status by comparing output to expected results

    Test Analyst
    2. Document Results

    • Document results in approved traceability repository (RADM, RQM, ReqPro, etc.)

    • If test case/script failed,

      Document results in defect repository (ClearQuest, KISAM, etc.)

      See IRM 2.127.2.4.2 for Problem Reporting Procedure

    • Prepare test case deferral or waiver form, if applicable

      See IRM 2.127.2.4.2 for Test Case Deferral and Waiver Procedure

    Test Analyst
    Test Lead
    3. Report Test Status

    • Conduct peer review, as applicable

    • Develop and provide status reports (e.g., daily, weekly, Release Readiness, test type, project, etc.)

    Test Manager
    A4: Closeout Test
    (See Figure 2.127.2–5)
    Steps / Roles
    1. Finalize Test Artifacts

    • Update repository/folder final results using project folder checklist

    • Update and finalize test schedule with actual results (e.g., WBS)

    • Resolve outstanding issues (problems, backlogs, etc.)

    • Complete work products

    Test Analyst
    Test Lead
    Test Manager
    2. Issue End of Test Reports

    • Develop end of test report for each test type. See IRM 2.127.2.4.2 for EOTR and EOTCR links

      See IRM 2.127.2.4.2 for IT Test Reference Guide link

    • Submit end of test report for approval and concurrence

    • Distribute end of test report electronically to Stakeholders

    Test Lead
    Test Manager
    Project Manager
    3. Conduct Closeout Meetings

    • Conduct meeting (Lessons Learned, PIR, etc.)

      See the Lessons Learned Guide: Strategy & Planning ELC Lessons Learned Guide

    • Document meetings and distribute to stakeholders

    • Attend governance meetings (Executive Steering Committee (ESC), Management Level Governance Board (MLGB), etc.)

    Test Lead
    Test Manager
    4. Finalize Project Folder

    • Place test related documents in project folder

    • Finalize project folder checklist

    • Approve final project folder checklist

    • Place final project folder checklist in project folder

    Test Lead
    Test Analyst
    Test Manager
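
    The closeout activities above center on the project folder checklist: the folder is not finalized until every required artifact is present. A minimal sketch, with a hypothetical artifact list (the actual checklist is defined by EST templates):

    ```python
    # Hypothetical required artifacts; an actual Project Folder Checklist is
    # defined by EST templates and tailored per project.
    REQUIRED_ARTIFACTS = {"Test Plan", "Test Cases", "Test Results", "EOTR", "Lessons Learned"}

    def checklist_gaps(folder_contents: set[str]) -> set[str]:
        """Return the artifacts still missing from the project folder before closeout."""
        return REQUIRED_ARTIFACTS - folder_contents

    folder = {"Test Plan", "Test Cases", "Test Results", "EOTR"}
    print(sorted(checklist_gaps(folder)))  # ['Lessons Learned']
    ```

    An empty result would correspond to an approved project folder checklist, one of the outputs of the Closeout Test step.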

Tailoring Guidelines

  1. This process may be tailored to meet specific project requirements. If tailoring is permitted, follow the tailored approach as documented in the Tailoring Plan. At a minimum, acceptable execution of the IT Testing process requires that the following mandatory activities be completed:

    • The execution of each STEP identified in the Process Description

    • The Activities within each STEP may be adjusted to meet the individual project's scope, release type, requirements, path, and other business requirements or constraints. Executing each STEP of the PD with project-specific activities is expected to produce plans customized to the specific project/release, with the required data elements entered into the appropriate areas of the Test Plan, SDP, STP, EOTR, and EOTCR


    All tailoring requests, with supporting rationale, should be submitted in writing to and approved by EST.

CMMI, ITIL, PMI Compliance

  1. The Capability Maturity Model Integration (CMMI) can be used to judge the maturity of an organization's processes, procedures, and process assets, and to plan further improvements. CMMI sets the standard for the essential elements of effective, mature processes and guides their improvement in quality and efficiency.

  2. The Information Technology Infrastructure Library (ITIL) contains a collection of best practices, enabling organizations to build an efficient framework for delivering IT Service Management (ITSM) and ensuring that they are meeting business goals and delivering benefits that facilitate business change, transformation, and growth.

  3. The Project Management Institute (PMI) organization advances the project management profession through globally recognized standards and certifications.

  4. This process asset indicates that all artifacts are developed or acquired in accordance with CMMI, ITIL, and PMI requirements to meet the business objectives of the organization, and that they represent investments expected to provide current and future business value to the IRS.

Definitions, References

Definitions

  1. A Glossary is available on the IT Process Asset Library (PAL).

References

  1. The following is a list of Reference Documents addressed in this IRM:

    • End of Test Completion Report (EOTCR)

    • End of Test Report (EOTR)

    • IT Peer Review Procedure

    • IT Test Case Deferral and Waiver Procedure

    • IT Test Policy Directive

    • IT Test Readiness Review (TRR) Guide

    • IT Test Reference Guide

    • IT Test Type Reference Guide

    • Iterative Development and Testing Process Description

    • Problem Reporting Procedure

    • Project Folder Checklist

    • System Deployment Plan (SDP)

    • System Test Plan (STP)

    • Test Plan (TP)

  2. For your convenience, the Reference Documents can be viewed at the following sites:

    • EST Test Program Management & Center of Excellence (TPMCE) Section 2 - Click the link below to access the TPMCE Section 2 website for document selection http://it.web.irs.gov/es/est/Reports/TSA_TSSAssets.htm

    • SharePoint - Click the link below to access the TPMCE Section 2 site. Select Customer Corner IRM 2.127 & Related Documents (under Libraries) for documentation selection https://organization.ds.irsnet.gov/sites/ESESTtpm/section2/default.aspx

    • DocIT - Click the link below to access the EST TPMCE Section 2 site. Select Customer Corner IRM 2.127 & Related Documents for documentation selection http://docit.web.irs.gov/docit/drl/objectId/0b0075628053a259

    • ELC and IT PAL - ELC specific documents only. See ELC and IT PAL websites for further guidance

  3. The following resources are either in this document or were used to create it.

    • Enterprise Life Cycle Process Management Office (ELCPMO) System Development Phase Guide (New Products)

    • KISAM Procedure

    • RADM Repository

    • Sensitive But Unclassified (SBU) Data Use Questionnaire - Form 14664, and Sensitive But Unclassified (SBU) Data Use Request - Form 14665, as applicable. See IRM 10.5.8, Sensitive But Unclassified (SBU) Data Policy: Protecting SBU in Non-Production Environments

    • Privacy Testing Guidance

    • ELC Lessons Learned

Glossary

Application: A collection of software programs that automates a business function. Each program may be part of more than one application and can run on one or more servers or other hardware.
Capability Maturity Model Integration (CMMI): CMMI is a process improvement approach developed by the Software Engineering Institute (SEI). It can be used to guide process improvement across a project, a division, or an entire organization. CMMI helps integrate traditionally separate organizational functions, set process improvement goals and priorities, provide guidance for quality processes, and provide a point of reference for appraising current processes.
Compliance: Ensuring that a standard or set of guidelines is followed, or that proper, consistent accounting or other practices are being employed.
Deployment: The activity responsible for movement of new or changed hardware, software, documentation, processes, etc. to the Live Environment (Production). Deployment is part of the Release and Deployment Management Process.
End Of Test Completion Report (EOTCR): This document can be used by systems classified as New Development or Planned Maintenance. The EOTCR is an ELC requirement. The purpose of the EOTCR is to provide a standard artifact to summarize the complete test effort for the release. The EOTCR also allows the PM an opportunity to mitigate risks that may cause delays to project implementation.
End Of Test Report (EOTR): The EOTR is a requirement for all testing and may be used as an ELC functional equivalent for the EOTCR. The purpose of the EOTR is to provide a standard artifact to summarize the complete test effort for the test type(s). The EOTR also allows the test managers an opportunity to mitigate risks that may cause delays to project implementation.
Integrated Project Team: Any group of people with complementary skills and expertise who are committed to delivering specified work products in timely collaboration. Integrated team members provide skills and advocacy appropriate to all phases of the work product's life and are collectively responsible for delivering the work products as specified. An integrated team should include empowered representatives from organizations, disciplines, and functions that have a stake in the success of the work product.
Knowledge Incident/Problem Service Asset Management (KISAM): KISAM maintains the complete inventory of IT and non-IT assets, including computer hardware and software. It is also the official reporting tool for problem management for all IRS-developed applications, and it shares information with the Enterprise Service Desk (ESD).
Process: A structured set of activities designed to accomplish a specific objective. A process takes one or more defined inputs and turns them into defined outputs. A process may include any of the roles and responsibilities, tools, and management controls required to reliably deliver the output. A process may define policies, standards, guidelines, activities, and work instructions if they are needed.
Process Owner: A role responsible for ensuring that a process is fit for its purpose. The Process Owner's responsibilities include sponsorship, design, change management, and continual improvement of the process and its assets.
Project Folder: The Project Folder is a requirement for every test and must be stored in an approved repository. See IRM 1.15 for IRS retention requirements. The Project Folder provides a history of the test. It is a useful source document for auditing purposes and can be used for future project planning, allocation of resources, and process improvement. The Project Folder contains copies of all required items pertinent to the specific test, including all critical test documentation and work products. It is the responsibility of the Test Manager to review and approve the Project Folder produced by their team.
Requirements: A requirement describes a condition or capability to which a system must conform; it is either derived directly from user needs or stated in a contract, standard, specification, or other formally imposed document. A requirement is a desired feature, property, or behavior of a system.
System Deployment Plan (SDP): The SDP is an ELC requirement. The purpose of the SDP is to provide a standard artifact to summarize the planned deployment activities for the release. The SDP also allows the PM an opportunity to mitigate risks that may cause delays to project implementation.
System Test Plan (STP): The STP is an ELC requirement. The purpose of the STP is to provide a standard artifact to summarize the complete test effort for the release. The STP also allows the PM an opportunity to mitigate risks that may cause delays to project implementation.
Test Plan (TP): The TP is a requirement for all testing and may be used as an ELC functional equivalent for the STP. The purpose of the TP is to provide a standard artifact to summarize the complete test effort for the test type(s). The TP also allows the test managers an opportunity to mitigate risks that may cause delays to project implementation.
Test Types: A test type is a named group of test activities intended to check the system with respect to a number of correlated quality characteristics. During testing, various quality characteristics are reviewed. Examples include: functionality, accessibility, performance, security, and continuity.
Triage: A process in which things are ranked in terms of importance or priority in order to allocate scarce resources. It also applies to different types of business process or workflow situations. In an IT department, IT issues can be categorized on a predefined probability scale factoring in risks and business impacts. Triage is used in EST to manage the allocation of resources to fix testing anomalies.
Validation: Per CMMI for Development v1.3, validation is the process whose purpose is to demonstrate that a product or product component fulfills its intended use when placed in its intended environment.
In testing, validation is the process of evaluating software at the end of the development process to ensure compliance with requirements from the business. Example: A data capture validation test consists of a partial run simulating the production cycle that occurred while the data was being captured. The successful execution of processing indicates the required data was captured and loaded properly on the test platforms.
Verification: Per CMMI for Development v1.3, verification is the process for ensuring that selected work products meet their specified requirements.
In testing, verification is the process performed at the end of a test cycle phase with the objective of ensuring that the requirements established during the previous phase have been met. It is an overall software evaluation activity that includes reviewing, inspecting, testing, checking, and auditing.
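
The triage ranking described in the Glossary above can be illustrated with a brief sketch. The 1-3 scales, score cutoffs, and anomaly identifiers below are illustrative assumptions for demonstration only; they are not an IRS or EST standard:

```python
# Hypothetical triage sketch: rank testing anomalies by a simple
# risk x business-impact score. The 1-3 scales and the High/Medium/Low
# cutoffs are illustrative assumptions, not an EST-defined scale.

def triage_score(risk, business_impact):
    """Combine risk and business impact (each rated 1=low .. 3=high)."""
    return risk * business_impact

def triage_category(score):
    """Map a combined score to a priority bucket."""
    if score >= 6:
        return "High"
    if score >= 3:
        return "Medium"
    return "Low"

# (id, risk, business impact) -- sample anomalies, hypothetical IDs.
anomalies = [
    ("ANOM-001", 3, 3),
    ("ANOM-002", 1, 2),
    ("ANOM-003", 2, 3),
]

# Highest-priority anomalies first, so scarce resources go to them.
ranked = sorted(anomalies, key=lambda a: triage_score(a[1], a[2]), reverse=True)
for anomaly_id, risk, impact in ranked:
    score = triage_score(risk, impact)
    print(anomaly_id, score, triage_category(score))
```

In practice the probability scale and cutoffs would come from the project's predefined triage criteria; the point of the sketch is only that each issue receives a repeatable score that drives the allocation of resources.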

Acronyms

Acronyms Definition
ASP Analysis Specification Package
BSR Business System Report
CMMI Capability Maturity Model Integration
COH Computer Operator Handbook
CPB Computer Program Book
CR Change Request
DocIT Document Management for Information Technology
DSP Design Specification Package
DSR Design Specification Report
ECL Executive Control Language
EFTU Enterprise File Transfer Utility
ELC Enterprise Life Cycle
ELCPMO Enterprise Life Cycle Process Management Office
EOTCR End Of Test Completion Report
EOTR End Of Test Report
ESC Executive Steering Committee
ESD Enterprise Service Desk
EST Enterprise Systems Testing
FCSR File Communication Status Report
FSP Functional Specification Package
ICD Interface Control Document
IPM Integrated Process Management
IT Information Technology
IT PAL Information Technology Process Asset Library
ITIL Information Technology Infrastructure Library
ITSM Information Technology Service Management
JCL Job Control Language
KISAM Knowledge Incident/Problem Service Asset Management
MLGB Management Level Governance Board
PD Process Description
PIA Privacy Impact Assessment
PIR Post Implementation Review
PMI Project Management Institute
PMP Project Management Plan
PRP Program Requirements Package
RADM Requirements and Demand Management
RCS Request for Computer Services
ReqPro RequisitePro
RQM Rational Quality Manager
RRC Rational Requirements Composer
RTM Requirements Traceability Matrix
RTVM Requirements Traceability Verification Matrix
SAT Systems Acceptability Testing
SDP System Deployment Plan
SEI Software Engineering Institute
SORN System of Records Notice
SOW Statement Of Work
STP System Test Plan
TP Test Plan
TRR Test Readiness Review
UWR Unified Work Request
WBS Work Breakdown Structure

Examples of Requirements, Functional, Operational, and Security Documentation

Requirements Documentation

    • Business Requirements Document (Iterative)

    • Business System Report (BSR)

    • Capabilities (Iterative)

    • Change Request (CR)

    • Configuration Control Board (CCB) Request(s)

    • Cost and/or Technical Proposals

    • Final Integration Test (FIT) Request

    • Memoranda of Agreement (MOA)

    • Requirements Traceability Verification Matrix (RTVM)

    • Service Level Agreement (SLA)

    • Sprint Backlog

    • Statement of Work (SOW)

    • Systems Acceptability Test (SAT) Request

    • Unified Work Request (UWR)

Functional Documentation

    • Analysis Specification Package (ASP)

    • Configuration Item/Configuration Unit (CI/CU)

    • Data Model View

    • Design Document (Iterative)

    • Design Specification Package (DSP)

    • Design Specification Report (DSR)

    • Functional Design Specification (FDS)

    • Functional Specification Package (FSP)

    • Interface Control Document (ICD)

    • Logical Design Description Document

    • Program Requirements Package (PRP)

    • Systems Requirement Report

    • System Test Plan

    • Use Case Model

Operational Documentation

    • Computer Program Book (CPB)

    • Computer Operator Handbook (COH)

    • Internal Revenue Manual (IRM)

    • Iterative Development and Testing Process Description

    • Manager Guides

    • System Administrator Guides

    • User Guides

Security Documentation

    • Information Technology Contingency Plan (ITCP)

    • Interconnection Security Agreement (ISA)

    • Risk Assessment

    • Security Assessment & Authorization (SA&A)

    • Security Controls Assessment (SCA)

    • Security Risk Assessment (SRA)

    • System Security Plan (SSP)

Privacy Documentation

    • Sensitive But Unclassified (SBU) Data Use Questionnaire - Form 14664, and Sensitive But Unclassified Data Use Request - Form 14665, as applicable. See IRM 10.5.8, Sensitive But Unclassified (SBU) Data Policy: Protecting SBU in Non-Production Environments

    • Privacy Impact Assessment (PIA)

    • Privacy Requirements

    • Privacy Testing Guidance

    • System of Records Notice (SORN)