Chapter 40 Exercises: Testing, Quality Assurance, and Deployment
These exercises cover the full spectrum of testing practices for COBOL systems, from unit testing individual paragraphs to managing production deployments in mainframe environments. Banking and financial processing scenarios are used throughout.
Tier 1 — Recall
Objective: Confirm understanding of key testing and deployment terminology and concepts.
Exercise 1.1 — Testing Vocabulary
Define each of the following terms in one to two sentences:
- Unit testing
- Integration testing
- Regression testing
- System testing
- User acceptance testing (UAT)
Expected outcome: Five concise, accurate definitions that distinguish each testing level.
Exercise 1.2 — Test Case Components
List the five essential components of a well-written test case for a COBOL batch program.
Expected outcome: Identification of (1) test case ID, (2) description/objective, (3) preconditions and input data, (4) execution steps, and (5) expected results.
Exercise 1.3 — Code Review Checklist
List at least eight items that should appear on a COBOL code review checklist for a banking application.
Expected outcome: Items such as: proper SQLCODE handling, correct file status checking, boundary condition handling, data validation, naming conventions, paragraph structure, WORKING-STORAGE initialization, PERFORM nesting depth, abend handling, security of sensitive data, and adherence to shop standards.
Exercise 1.4 — Deployment Terminology
Match each term to its correct definition:
| Term | Definition |
|---|---|
| Change management | ? |
| Promotion path | ? |
| Smoke test | ? |
| Rollback plan | ? |
| Release manifest | ? |
Expected outcome: Five correct matches (e.g., change management = the process of controlling and tracking modifications to production systems; promotion path = the sequence of environments a change traverses from development to production).
Exercise 1.5 — Test Data Categories
Describe the difference between the following categories of test data:
- Positive test data
- Negative test data
- Boundary test data
- Edge case test data
Provide one example of each for a COBOL program that processes deposit transactions with an amount field of PIC 9(7)V99.
Expected outcome: Four descriptions with concrete examples (e.g., boundary: 0.01 and 9999999.99).
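As a concrete anchor for the four categories, here is a minimal sketch against the PIC 9(7)V99 amount field; the data name and sample values are illustrative assumptions, not part of the exercise:
01 WS-TXN-AMOUNT PIC 9(7)V99.
*> Positive:  100.00 - a routine, clearly valid deposit
*> Negative:  0.00 (or non-numeric input) that the program should reject
*> Boundary:  0.01 and 9999999.99 - the smallest and largest
*>            representable amounts
*> Edge case: 10000000.00 - one cent past the maximum; a plain MOVE
*>            silently truncates the high-order digit, so the input
*>            edit must catch the overflow before populating the field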
Exercise 1.6 — Mainframe Deployment Tools
Name at least four tools or utilities commonly used in mainframe COBOL deployment pipelines and briefly describe each one's role.
Expected outcome: Tools such as: (1) ISPF/SCLM or Endevor for source management, (2) JCL for job execution, (3) IEBCOPY or SMP/E for library management, (4) File-AID or similar for test data management, and optionally (5) Jenkins/UrbanCode Deploy for CI/CD orchestration.
Exercise 1.7 — COBOL Debugging Techniques
List five debugging techniques available for COBOL programs on the mainframe, and describe when each is most appropriate.
Expected outcome: Techniques such as DISPLAY statements, COBOL interactive debugger (IBM Debug Tool / IDz), abend dump analysis (CEEDUMP, SYSUDUMP), file and database trace facilities, and COBOL USE AFTER STANDARD ERROR declaratives.
Exercise 1.8 — Quality Metrics
Define the following software quality metrics and explain why each is relevant to a COBOL banking application:
- Code coverage
- Defect density
- Cyclomatic complexity
- Mean time to failure (MTTF)
Expected outcome: Four definitions with banking-relevant justifications (e.g., high cyclomatic complexity in interest calculation paragraphs increases the risk of untested paths that could produce incorrect financial results).
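A worked example may help with the complexity metric: in the common decision-count form, cyclomatic complexity is M = D + 1, where D is the number of binary decision points, so a paragraph containing nine IF tests has M = 10 and needs at least ten test cases for basis-path coverage.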
Tier 2 — Comprehension
Objective: Interpret testing scenarios, understand test strategies, and explain deployment processes.
Exercise 2.1 — Test Strategy Evaluation
A banking project has the following test plan for a new loan origination COBOL program:
- Developer runs the program once with a sample loan application.
- QA team runs the program with 10 pre-built test cases.
- The program is promoted to production.
Identify at least five deficiencies in this test strategy and explain the risk each one poses.
Expected outcome: Deficiencies such as: no unit testing of individual paragraphs, insufficient test case volume, no negative/boundary testing, no regression testing of existing programs, no UAT sign-off, no performance testing, and no deployment verification (smoke test).
Exercise 2.2 — Understanding a Test Harness
Study the following test harness structure for a COBOL paragraph:
IDENTIFICATION DIVISION.
PROGRAM-ID. TEST-INTEREST-CALC.
DATA DIVISION.
WORKING-STORAGE SECTION.
01 WS-PRINCIPAL PIC 9(9)V99.
01 WS-RATE PIC 9(3)V9(5).
01 WS-DAYS PIC 9(3).
01 WS-INTEREST PIC 9(9)V99.
01 WS-EXPECTED PIC 9(9)V99.
01 WS-TEST-COUNT PIC 9(3) VALUE 0.
01 WS-PASS-COUNT PIC 9(3) VALUE 0.
01 WS-FAIL-COUNT PIC 9(3) VALUE 0.
PROCEDURE DIVISION.
MAIN-LOGIC.
PERFORM TEST-CASE-001
PERFORM TEST-CASE-002
PERFORM TEST-CASE-003
DISPLAY 'TESTS RUN: ' WS-TEST-COUNT
DISPLAY 'PASSED: ' WS-PASS-COUNT
DISPLAY 'FAILED: ' WS-FAIL-COUNT
STOP RUN.
TEST-CASE-001.
MOVE 10000.00 TO WS-PRINCIPAL
MOVE 5.00000 TO WS-RATE
MOVE 30 TO WS-DAYS
MOVE 41.10 TO WS-EXPECTED
PERFORM CALCULATE-INTEREST
PERFORM VERIFY-RESULT.
VERIFY-RESULT.
ADD 1 TO WS-TEST-COUNT
IF WS-INTEREST = WS-EXPECTED
ADD 1 TO WS-PASS-COUNT
DISPLAY 'PASS - TEST ' WS-TEST-COUNT
ELSE
ADD 1 TO WS-FAIL-COUNT
DISPLAY 'FAIL - TEST ' WS-TEST-COUNT
' EXPECTED: ' WS-EXPECTED
' GOT: ' WS-INTEREST
END-IF.
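*> NOTE: TEST-CASE-002, TEST-CASE-003, and the CALCULATE-INTEREST
*> paragraph under test are not shown in this excerpt.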
Answer the following:
- What design pattern does this harness follow?
- How does the harness report results?
- What is missing from this harness that a production-quality test framework would include?
- How would you add a new test case?
Expected outcome: Identification of the arrange-act-assert pattern, summary reporting via counters, and missing features such as setup/teardown, test isolation, test data files, and logging to a file rather than DISPLAY.
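For the last question, a minimal sketch of a new test case, following the harness's own arrange-act-assert shape. The expected value 68.49 assumes a simple-interest formula of principal x rate / 100 x days / 365 (consistent with TEST-CASE-001's expected 41.10) and is illustrative only; a matching PERFORM TEST-CASE-004 would also be added to MAIN-LOGIC before the summary DISPLAYs:
TEST-CASE-004.
    MOVE 25000.00 TO WS-PRINCIPAL
    MOVE 5.00000 TO WS-RATE
    MOVE 20 TO WS-DAYS
    MOVE 68.49 TO WS-EXPECTED
    PERFORM CALCULATE-INTEREST
    PERFORM VERIFY-RESULT.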
Exercise 2.3 — Deployment Pipeline Stages
Describe what happens at each stage of the following mainframe deployment pipeline:
- Source check-in to version control
- Automated build (compile, link-edit, bind)
- Unit test execution
- Promotion to QA environment
- Integration/regression test execution
- UAT sign-off
- Production deployment
- Post-deployment verification
Expected outcome: Eight clear descriptions demonstrating understanding of the end-to-end deployment lifecycle.
Exercise 2.4 — Regression Test Impact Analysis
A developer modifies the interest calculation paragraph in a shared copybook INTCALC.cpy used by 12 different COBOL programs.
- What is the minimum set of programs that must be regression tested?
- How would you identify all affected programs?
- What testing strategy would minimize risk while keeping the testing effort manageable?
Expected outcome: All 12 programs must be retested since they COPY the modified code. Identification via copybook cross-reference listings or source management tools. Strategy: prioritize programs by risk/volume, run automated regression suites, and perform targeted manual testing for high-risk programs.
Exercise 2.5 — Understanding Abend Codes
A COBOL program abends in production with the following information:
CEE3204S THE SYSTEM DETECTED A PROTECTION EXCEPTION (SYSTEM COMPLETION CODE=0C4)
- What does system completion code 0C4 indicate?
- Name three common COBOL programming errors that can cause this abend.
- What diagnostic information would you collect first?
Expected outcome: 0C4 = protection exception (addressing violation). Common causes: subscript out of range, uninitialized pointer, referencing a field beyond the end of a record. Collect: CEEDUMP or SYSUDUMP, the failing instruction offset, the COBOL listing cross-referenced to the offset.
Exercise 2.6 — Code Coverage Interpretation
A COBOL program has 500 executable statements. After running the test suite, the code coverage tool reports:
- Statement coverage: 72%
- Branch coverage: 58%
- Paragraph coverage: 90%
Interpret these results and explain what actions should be taken before the program is promoted to production.
Expected outcome: 72% statement coverage means 140 statements were never executed — potential untested logic. 58% branch coverage means many IF/EVALUATE branches were not exercised. 90% paragraph coverage is better but still has gaps. Actions: identify uncovered statements/branches, write additional test cases targeting those paths, and prioritize coverage of error-handling and boundary-condition code.
Exercise 2.7 — Change Management Process
Describe a typical change management process for modifying a production COBOL program in a bank, including:
- Change request initiation
- Impact analysis
- Approval workflow
- Implementation and testing
- Production deployment
- Post-implementation review
Expected outcome: Six detailed process steps reflecting enterprise change management practices, including references to change advisory boards, testing gates, and documentation requirements.
Exercise 2.8 — Mock Objects in COBOL Testing
Explain the concept of mocking in the context of COBOL testing. How would you mock a DB2 database call when unit testing a COBOL paragraph that retrieves account information?
Expected outcome: Explanation of mocking as replacing real dependencies with controlled substitutes. For DB2 mocking: create a test version of the program where the EXEC SQL call is replaced with code that populates host variables from predetermined test data, or use a testing framework that intercepts the SQL calls. Discuss the trade-off between test fidelity and isolation.
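A minimal sketch of the stub-substitution approach. All names (GET-ACCOUNT-INFO, WS-BALANCE, WS-STATUS, WS-ACCT-NO, the ACCOUNT table) are hypothetical, and the sketch assumes the unit-test build defines SQLCA in WORKING-STORAGE itself, since the DB2 precompiler is not involved:
*> Production version (simplified):
GET-ACCOUNT-INFO.
    EXEC SQL
        SELECT ACCT_BALANCE, ACCT_STATUS
          INTO :WS-BALANCE, :WS-STATUS
          FROM ACCOUNT
         WHERE ACCT_NO = :WS-ACCT-NO
    END-EXEC.
*> Mocked version substituted in the unit-test compile:
GET-ACCOUNT-INFO.
    EVALUATE WS-ACCT-NO
        WHEN '1234567890'
            MOVE 5000.00 TO WS-BALANCE
            MOVE 'A' TO WS-STATUS
            MOVE ZERO TO SQLCODE
        WHEN OTHER
            MOVE +100 TO SQLCODE  *> simulate "row not found"
    END-EVALUATE.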
Tier 3 — Application
Objective: Create test cases, test harnesses, and deployment artifacts for COBOL banking programs.
Exercise 3.1 — Write Unit Test Cases
A COBOL paragraph VALIDATE-ACCOUNT-NUMBER checks the following rules:
- The account number must be exactly 10 digits.
- The first two digits represent the branch code (01-99).
- The last digit is a check digit calculated using the Luhn algorithm.
Write at least 10 test cases covering positive, negative, and boundary conditions.
Expected outcome: A table with columns for test case ID, input, expected result (VALID/INVALID), and rationale. Should include valid accounts, too-short inputs, too-long inputs, alphabetic characters, branch code 00, branch code 99, correct and incorrect check digits.
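To ground the check-digit cases, here is a hedged sketch of a Luhn validation in COBOL. The data and paragraph names are hypothetical, and it assumes the caller has already confirmed that WS-ACCT-NO IS NUMERIC; the last digit is the check digit, so with a 10-digit field the digits to double (every second digit counting from the right) fall on the odd subscripts:
01 WS-ACCT-NO PIC X(10).
01 WS-ACCT-NO-R REDEFINES WS-ACCT-NO.
    05 WS-DIGIT PIC 9 OCCURS 10 TIMES.
01 WS-IDX PIC 9(2).
01 WS-WORK PIC 9(2).
01 WS-LUHN-SUM PIC 9(3).
01 WS-LUHN-FLAG PIC X VALUE 'N'.
    88 LUHN-VALID VALUE 'Y'.
CHECK-LUHN-DIGIT.
    MOVE 'N' TO WS-LUHN-FLAG
    MOVE 0 TO WS-LUHN-SUM
    PERFORM VARYING WS-IDX FROM 1 BY 1 UNTIL WS-IDX > 10
        MOVE WS-DIGIT (WS-IDX) TO WS-WORK
        IF FUNCTION MOD (WS-IDX, 2) = 1
*>          double this digit; subtract 9 if the result exceeds 9
            COMPUTE WS-WORK = WS-WORK * 2
            IF WS-WORK > 9
                SUBTRACT 9 FROM WS-WORK
            END-IF
        END-IF
        ADD WS-WORK TO WS-LUHN-SUM
    END-PERFORM
*>  the number (including its check digit) is valid when the sum
*>  is an exact multiple of 10
    IF FUNCTION MOD (WS-LUHN-SUM, 10) = 0
        SET LUHN-VALID TO TRUE
    END-IF.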
Exercise 3.2 — Build a Test Harness
Write a complete COBOL test harness program that tests the following paragraph:
CALCULATE-MONTHLY-PAYMENT.
COMPUTE WS-MONTHLY-PAYMENT ROUNDED =
(WS-PRINCIPAL * (WS-MONTHLY-RATE *
(1 + WS-MONTHLY-RATE) ** WS-NUM-PAYMENTS)) /
((1 + WS-MONTHLY-RATE) ** WS-NUM-PAYMENTS - 1).
The harness should:
- Define at least five test cases with known correct results.
- Call the paragraph for each test case.
- Compare actual output to expected output with a tolerance of $0.01.
- Report PASS/FAIL for each test case.
- Display a summary at the end.
Expected outcome: A complete, compilable COBOL program of approximately 80-120 lines that exercises the monthly payment formula with various principal amounts, interest rates, and loan terms.
Hint: Test cases should include a standard 30-year mortgage, a short-term personal loan, a zero-interest loan (edge case requiring special handling), and a high-interest scenario.
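Two details tend to trip up first attempts, so a hedged sketch may help: an absolute-difference comparison for the $0.01 tolerance, and a guard in the harness driver for the zero-interest edge case mentioned in the hint. Data names beyond those in the formula are hypothetical. As a sanity check for one test case, a $200,000 principal at a 0.5% monthly rate over 360 payments works out to roughly $1,199.10 under this formula.
01 WS-DIFFERENCE PIC S9(9)V99.
RUN-ONE-CASE.
    IF WS-MONTHLY-RATE = ZERO
*>      the formula divides by zero when the rate is zero, so a
*>      zero-interest loan is simply principal / number of payments
        COMPUTE WS-MONTHLY-PAYMENT ROUNDED =
            WS-PRINCIPAL / WS-NUM-PAYMENTS
    ELSE
        PERFORM CALCULATE-MONTHLY-PAYMENT
    END-IF
*>  compare within a one-cent tolerance using the absolute difference
    COMPUTE WS-DIFFERENCE = WS-MONTHLY-PAYMENT - WS-EXPECTED
    IF WS-DIFFERENCE < ZERO
        COMPUTE WS-DIFFERENCE = ZERO - WS-DIFFERENCE
    END-IF
    IF WS-DIFFERENCE <= 0.01
        ADD 1 TO WS-PASS-COUNT
    ELSE
        ADD 1 TO WS-FAIL-COUNT
    END-IF.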
Exercise 3.3 — Create Regression Test Data
Design a comprehensive set of test data files for a COBOL batch program that processes daily account transactions. The input file has the following layout:
01 TXN-RECORD.
05 TXN-ACCT-NO PIC X(10).
05 TXN-TYPE PIC X(1).
88 TXN-DEPOSIT VALUE 'D'.
88 TXN-WITHDRAWAL VALUE 'W'.
88 TXN-TRANSFER VALUE 'T'.
05 TXN-AMOUNT PIC 9(7)V99.
05 TXN-DATE PIC 9(8).
05 TXN-DESCRIPTION PIC X(30).
Create:
- A normal processing file (20 records covering all transaction types).
- An error file (10 records with various data errors).
- A boundary condition file (10 records testing limits).
- A volume test file specification (describe the generation approach for 100,000 records).
Expected outcome: Four detailed test data specifications with actual record values for items 1-3 and a generation strategy for item 4.
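To make the fixed-layout encoding concrete, here are two hypothetical boundary records (58 bytes each: account in columns 1-10, type in 11, amount in 12-20 with an implied decimal point, date in 21-28, description from 29 on), so 000000001 represents 0.01 and 999999999 represents 9999999.99; the account number and date are illustrative:
1234567890D00000000120250115MINIMUM AMOUNT DEPOSIT
1234567890W99999999920250115MAXIMUM AMOUNT WITHDRAWAL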
Exercise 3.4 — Write a JCL Deployment Job
Write the JCL to deploy a COBOL-DB2 program named ACCTINQ to a production environment. The job must:
- Copy the load module from the QA load library to the production load library.
- Bind the DB2 package from the QA DBRM library.
- Run a smoke test that executes the program with a test input file.
- Check the return code of each step and stop if any step fails.
Expected outcome: A complete JCL job stream with IEBCOPY, BIND PACKAGE, and program execution steps, using COND parameters for step-level condition checking.
//DEPLOY JOB (ACCTG),'DEPLOY ACCTINQ',CLASS=A,
// MSGCLASS=X,NOTIFY=&SYSUID
//*
//* Expected: Complete JCL with all four steps
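//* A hedged sketch of one possible step structure; the step names
//* below are hypothetical and COND values follow shop standards:
//*   COPYLOAD - IEBCOPY of ACCTINQ from the QA load library to
//*              the production load library
//*   BINDPKG  - IKJEFT01 issuing a DSN BIND PACKAGE command using
//*              the DBRM from the QA DBRM library,
//*              COND=(0,NE,COPYLOAD)
//*   SMOKTEST - IKJEFT01 DSN RUN of ACCTINQ against the test input
//*              file, executed only if every prior step ended with
//*              return code zero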
Exercise 3.5 — Integration Test Scenario
Design an integration test for a COBOL system consisting of three programs:
- TXNVAL — Validates transactions and writes valid records to an output file.
- TXNPOST — Reads validated transactions and updates DB2 account balances.
- TXNRPT — Reads the DB2 tables and produces a daily transaction report.
Write the complete integration test plan including:
- Test environment setup (tables, files, initial data).
- The execution sequence and JCL.
- Verification checkpoints after each program.
- Expected final state of all tables and files.
- Cleanup procedures.
Expected outcome: A detailed, step-by-step integration test plan that could be executed by a tester unfamiliar with the system.
Exercise 3.6 — Automated Test Script
Write a REXX exec (or describe the logic for a shell script) that automates the following test cycle for a COBOL program:
- Submit the compile JCL and wait for completion.
- Check the compile return code (must be 0 or 4).
- Submit the test execution JCL.
- Compare the actual output file to an expected output file using SUPERC or a similar utility.
- Report PASS or FAIL based on the comparison.
- Log the result with a timestamp.
Expected outcome: A complete REXX script (or detailed pseudocode) of approximately 40-60 lines implementing the automated test cycle.
Exercise 3.7 — Performance Test Plan
Design a performance test plan for a COBOL batch program that processes end-of-day account reconciliation. The program reads 5 million account records from DB2 and writes a reconciliation report.
Include:
- Performance acceptance criteria (elapsed time, CPU time, I/O counts).
- Test data volume specifications.
- DB2 tuning parameters to monitor (buffer pool hit ratios, lock escalation).
- JCL parameters that affect performance (REGION, BUFNO, BLKSIZE).
- How to capture and analyze SMF/RMF data.
Expected outcome: A structured performance test plan with measurable criteria and specific monitoring techniques.
Exercise 3.8 — Defect Report Template
Create a defect report template for COBOL application defects and fill it in for the following scenario:
A COBOL program calculates overdraft fees but charges the fee even when the account has overdraft protection enabled. The fee is $35.00 and was incorrectly charged to 247 accounts in yesterday's batch run.
Expected outcome: A complete defect report including: defect ID, severity, priority, description, steps to reproduce, expected vs. actual behavior, root cause analysis, affected accounts, financial impact ($8,645.00), corrective action (code fix and data correction), and preventive action (additional test cases).
Tier 4 — Analysis
Objective: Evaluate testing strategies, diagnose deployment problems, and optimize QA processes.
Exercise 4.1 — Root Cause Analysis
A COBOL program that calculates year-end tax withholding ran successfully in testing but produced incorrect results in production. Investigation reveals:
- The program was tested with 2024 tax rates.
- Production ran with 2025 data, but the tax rate table was not updated.
- The program reads tax rates from a VSAM file, not from embedded constants.
- The test environment used a copy of the production VSAM file from 6 months ago.
Perform a root cause analysis using the "5 Whys" technique and propose corrective and preventive actions.
Expected outcome: A 5-Whys chain leading to the root cause (test data management process does not ensure current reference data in test environments). Corrective action: update the VSAM file and rerun. Preventive action: implement a test data refresh procedure and add a validation step that compares test reference data dates to the processing date.
Exercise 4.2 — Test Coverage Gap Analysis
A COBOL program has the following paragraph with four execution paths:
PROCESS-WITHDRAWAL.
IF WS-AMOUNT > WS-BALANCE
IF WS-OVERDRAFT-PROTECTION = 'Y'
PERFORM TRANSFER-FROM-SAVINGS
IF WS-TRANSFER-STATUS = 'SUCCESS'
PERFORM POST-WITHDRAWAL
ELSE
MOVE 'INSUFFICIENT FUNDS' TO WS-MSG
END-IF
ELSE
PERFORM CHARGE-OVERDRAFT-FEE
MOVE 'OVERDRAFT' TO WS-MSG
END-IF
ELSE
PERFORM POST-WITHDRAWAL
END-IF.
The current test suite has three test cases:
- Withdrawal with sufficient balance.
- Withdrawal with insufficient balance and no overdraft protection.
- Withdrawal with insufficient balance, overdraft protection, and successful transfer.
Complete the following tasks:
- Draw the control flow paths through this paragraph.
- Identify which paths are untested.
- Write test cases for the missing paths.
- Calculate the branch coverage before and after adding the new test cases.
Expected outcome: Path diagram, identification of the untested path (overdraft protection enabled but the transfer fails), a new test case for that path, and the coverage calculation (before: three of the four paths and 5 of 6 branch outcomes exercised, roughly 83% branch coverage; after: 100%).
Exercise 4.3 — Deployment Failure Analysis
A production deployment fails at 2:00 AM with the following symptoms:
- The new load module was copied successfully.
- The DB2 BIND step completed with return code 8.
- The smoke test abended with SQLCODE -818.
Diagnose the problem, explain the chain of events, and propose a resolution.
Expected outcome: A BIND return code of 8 means the bind failed, so the existing package was left unchanged in the DB2 catalog. SQLCODE -818 indicates a consistency-token (timestamp) mismatch between the package and the load module: the newly copied load module carries a token from a precompile that was never successfully bound. The most likely cause: the DBRM in the bind library does not match the load module that was promoted. Resolution: ensure the DBRM was copied to the production DBRM library before binding, then rebind and retest. Also review the deployment checklist to add DBRM promotion as a required step.
Exercise 4.4 — Evaluating a Testing Framework
Your organization is evaluating two approaches for COBOL unit testing:
Option A: Custom test harness programs (as shown in Exercise 2.2) built by each developer.
Option B: A commercial COBOL testing framework (such as IBM zUnit or Micro Focus Unit Testing Framework) with standardized test case definitions, assertions, and reporting.
Compare the two approaches across the following dimensions:
- Development effort per test case
- Consistency and standardization
- Integration with CI/CD pipelines
- Learning curve
- Maintenance cost over five years
- Reporting and metrics capabilities
Recommend one approach and justify your recommendation.
Expected outcome: A balanced comparison leading to a justified recommendation (likely Option B for large organizations), with acknowledgment that Option A may be appropriate for smaller teams or specific situations.
Exercise 4.5 — Security Testing for Financial Programs
A COBOL program processes wire transfer requests. Propose a security testing plan that covers:
- Input validation attacks (SQL injection via embedded SQL, buffer overflow via oversized fields).
- Authorization checks (verifying that the program enforces approval limits).
- Audit trail integrity (ensuring all actions are logged).
- Data encryption validation (sensitive fields in transit and at rest).
- Error message information leakage (ensuring error messages do not expose system internals).
For each category, write at least two specific test cases.
Expected outcome: Ten or more security-focused test cases with clear descriptions, inputs, and expected behaviors.
Exercise 4.6 — Continuous Integration for COBOL
An organization wants to implement continuous integration (CI) for its COBOL mainframe applications. Currently, developers compile and test manually using TSO/ISPF.
- Describe the target CI architecture, including source control, build automation, test automation, and feedback mechanisms.
- Identify three technical challenges specific to mainframe CI and propose solutions for each.
- Design a CI pipeline for a COBOL-DB2 program (list all stages and tools).
Expected outcome: A comprehensive CI architecture description with mainframe-specific considerations (e.g., LPAR resource contention, dataset naming, cross-platform tooling integration).
Exercise 4.7 — Post-Production Defect Trend Analysis
Over the past 12 months, a COBOL banking application has experienced the following production defects:
| Month | Defects | Category |
|---|---|---|
| Jan | 3 | Data validation (2), Logic (1) |
| Feb | 5 | Data validation (3), Logic (2) |
| Mar | 2 | Performance (1), Logic (1) |
| Apr | 7 | Data validation (4), DB2 (3) |
| May | 4 | Data validation (2), DB2 (2) |
| Jun | 3 | Logic (2), File handling (1) |
| Jul | 6 | Data validation (3), Logic (2), DB2 (1) |
| Aug | 2 | Logic (1), Performance (1) |
| Sep | 4 | Data validation (2), DB2 (2) |
| Oct | 3 | Data validation (2), Logic (1) |
| Nov | 5 | Data validation (3), DB2 (1), Logic (1) |
| Dec | 2 | Logic (1), File handling (1) |
Analyze this data to:
- Identify the primary defect category and its trend.
- Identify the month with the highest defect count and hypothesize a cause.
- Propose three targeted quality improvement actions.
- Define metrics to measure the effectiveness of those actions.
Expected outcome: Data-driven analysis showing data validation as the leading category (21 of 46 defects, roughly 46%), April as the peak month (possibly due to a major release or year-end processing), and improvement actions targeting input validation code reviews, automated boundary testing, and DB2 query review processes.
Exercise 4.8 — Rollback Planning
Design a rollback plan for a production deployment that includes:
- A new COBOL load module replacing an existing one.
- A DB2 schema change (adding a new column with a default value).
- A new VSAM file for reference data.
- Updated JCL for the nightly batch cycle.
For each component, describe:
- How the rollback would be executed.
- The order of rollback operations.
- Data integrity considerations.
- The decision criteria for triggering a rollback.
Expected outcome: A detailed rollback plan covering all four components with attention to the irreversibility of the DB2 schema change and the need for data backout if transactions have been posted using the new column.
Tier 5 — Synthesis
Objective: Design comprehensive testing and deployment strategies for enterprise COBOL systems.
Exercise 5.1 — Enterprise Test Strategy
Design a comprehensive testing strategy for a major COBOL banking system upgrade that involves changes to 45 COBOL programs, 12 DB2 tables, 8 VSAM files, and 15 JCL streams. The system processes $2 billion in daily transactions.
The strategy must include:
- Test environment architecture (number of environments, data strategy, DB2 subsystem allocation).
- Test phases and entry/exit criteria for each phase.
- Test automation approach (tools, frameworks, scripts).
- Test data management (creation, masking of PII, refresh strategy).
- Performance and stress testing plan.
- Defect management process (severity classification, escalation, resolution SLAs).
- Sign-off criteria for production deployment.
- Resource estimates (testers, environments, duration).
Expected outcome: A multi-page test strategy document suitable for review by a project steering committee, with specific, actionable content for each section.
Exercise 5.2 — CI/CD Pipeline Implementation
Design and document a complete CI/CD pipeline for a COBOL-DB2 banking application. The pipeline should handle the full lifecycle from code commit to production deployment.
Provide:
- Pipeline architecture diagram (in text form) showing all stages.
- Tool selection for each stage with justification (e.g., Git for source control, Jenkins/UrbanCode Deploy for orchestration, IBM Dependency Based Build for compilation).
- Automated quality gates between stages (compile success, test pass rate, code coverage threshold, security scan results).
- Database schema migration strategy (how DB2 changes are versioned and applied).
- Rollback automation for each stage.
- Monitoring and alerting for pipeline health.
- Sample pipeline definition (Jenkinsfile or equivalent pseudocode).
Expected outcome: A complete CI/CD implementation plan with enough detail for an engineering team to begin building the pipeline.
Exercise 5.3 — Test Data Management System
Design a test data management system for a COBOL mainframe environment that handles:
- Subsetting: Extracting a representative subset of production data for testing, maintaining referential integrity across DB2 tables and VSAM files.
- Masking: De-identifying personally identifiable information (PII) such as Social Security Numbers, account numbers, and customer names while preserving data format and referential integrity.
- Synthetic generation: Creating realistic test data for scenarios not present in production (new product types, extreme volumes, error conditions).
- Refresh automation: Scheduled refresh of test environments with current data.
- Version control: Tracking test data versions alongside code changes.
For each capability, provide:
- The design approach.
- COBOL program fragments or JCL for key operations.
- Data integrity validation checks.
Expected outcome: A comprehensive test data management design with practical COBOL implementations for masking and subsetting operations.
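Since the expected outcome calls for practical COBOL masking fragments, here is a minimal, hedged sketch of deterministic SSN masking. Every name is hypothetical, the multiplicative transform is a stand-in for a vetted format-preserving algorithm, and collisions are possible because the modulus is smaller than the nine-digit input domain; the point it illustrates is that a deterministic transform (same input always yields the same surrogate) is what preserves referential integrity across tables and files:
01 WS-SSN-IN PIC X(9).
01 WS-SSN-OUT PIC 9(9).
01 WS-SEED PIC 9(9).
MASK-SSN.
*>  deterministic keyed transform: the same real SSN always maps to
*>  the same nine-digit surrogate, so joins still line up after masking
    MOVE FUNCTION NUMVAL (WS-SSN-IN) TO WS-SEED
    COMPUTE WS-SSN-OUT =
        FUNCTION MOD (WS-SEED * 48271, 999999937).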
Exercise 5.4 — Production Incident Response
Design a production incident response plan for a critical COBOL batch failure during the nightly processing cycle at a bank. The plan must cover:
- Detection: How the failure is detected (job monitoring, alerts, SMF records).
- Triage: Severity classification and escalation matrix (who is called at 3:00 AM for a severity-1 issue).
- Diagnosis: Step-by-step diagnostic procedure (abend code lookup, dump analysis, log review, data inspection).
- Resolution: Decision tree for resolution options (restart, rerun with fixes, skip and process manually, emergency code change).
- Recovery: How to recover the batch cycle if the window is closing (parallel job execution, dependent job rescheduling).
- Communication: Notification templates for management, downstream systems, and regulators (if required by banking regulations).
- Post-incident review: Template for the post-mortem analysis including root cause, timeline, impact assessment, and preventive actions.
Expected outcome: A complete incident response plan with decision trees, templates, and procedures that could be used as a real operations runbook.
Exercise 5.5 — Quality Assurance Program Design
Design a comprehensive QA program for a COBOL development team of 20 developers working on a core banking system. The program must include:
- Coding standards — Define COBOL coding standards with at least 15 specific rules covering naming conventions, paragraph structure, error handling, data definitions, and documentation.
- Peer review process — Design a code review workflow including review checklists, minimum review criteria, and turnaround time SLAs.
- Static analysis — Describe how static analysis tools can detect common COBOL defects (dead code, uninitialized variables, unreachable paragraphs, PERFORM THRU mismatches).
- Testing standards — Define minimum testing requirements per change type (bug fix, enhancement, new program, copybook change).
- Metrics and reporting — Design a QA dashboard with at least eight metrics tracked over time (defect density, code coverage, review turnaround, test execution rate, etc.).
- Continuous improvement — Propose a quarterly review process for updating standards and addressing quality trends.
Expected outcome: A QA program document comprehensive enough to be adopted by a real COBOL development team, with specific rules, processes, and metrics rather than generic guidelines.