Case Study 1: GlobalBank Real-Time Balance API

The Business Need

GlobalBank's mobile banking app was growing rapidly — from 200,000 active users to 1.2 million in 18 months. The original architecture routed mobile requests through a middleware layer that screen-scraped 3270 transactions, a technique that was slow (800ms average response time), fragile (broke whenever BMS screen layouts changed), and limited to 50 concurrent users.

Priya Kapoor was tasked with building a proper API layer.

The Design Process

Requirements

  • Balance inquiry response time under 200ms (95th percentile)
  • Support 500 concurrent requests
  • JSON format matching the mobile app's existing data model
  • No changes to the COBOL business logic (ACCTINQ had been in production 12 years with zero logic defects)

Architecture Decision

Priya considered three approaches:

  1. Rewrite ACCTINQ in Java — Estimated 4 months, risk of introducing logic bugs in a critical program
  2. CICS Transaction Gateway (CTG) — Direct CICS access from Java middleware, 2 months
  3. z/OS Connect — Direct REST-to-CICS mapping, 3 weeks

She chose z/OS Connect because it required zero changes to the COBOL program and minimal Java middleware.

Implementation

Step 1: Service Definition (Week 1)

The z/OS Connect service definition mapped the HTTP request to the ACCTINQ COMMAREA:

service:
  name: accountBalance
  description: "Real-time account balance inquiry"
  program: ACCTINQ
  transaction: AINQ
  mapping:
    input:
      COMM-FUNCTION: "INQUIRY"
      COMM-ACCT-NO: path.accountNumber
    output:
      success:
        condition: COMM-RETURN-CODE == "00"
        mapping:
          accountName: COMM-ACCT-NAME
          accountType: COMM-ACCT-TYPE
          balance: COMM-BALANCE
          status: COMM-STATUS
          openDate: COMM-OPEN-DATE
      notFound:
        condition: COMM-RETURN-CODE == "01"
        httpStatus: 404
        mapping:
          error: COMM-ERROR-MSG
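
The effect of this mapping can be sketched in a few lines. The dispatch function below is a hypothetical stand-in for what z/OS Connect does internally; the COMMAREA field names come from the service definition above, and the sample values are illustrative only:

```python
# Sketch of the COMMAREA-to-JSON translation the service definition describes.
# ACCTINQ itself is untouched; z/OS Connect performs this mapping on its behalf.

def map_response(commarea: dict) -> tuple[int, dict]:
    """Translate ACCTINQ's output COMMAREA into an HTTP status and JSON body."""
    if commarea["COMM-RETURN-CODE"] == "00":
        return 200, {
            "accountName": commarea["COMM-ACCT-NAME"].strip(),
            "accountType": commarea["COMM-ACCT-TYPE"],
            "balance": commarea["COMM-BALANCE"],
            "status": commarea["COMM-STATUS"],
            "openDate": commarea["COMM-OPEN-DATE"],
        }
    if commarea["COMM-RETURN-CODE"] == "01":
        return 404, {"error": commarea["COMM-ERROR-MSG"].strip()}
    return 500, {"error": "unmapped return code"}  # not in the service definition
```

Note that fixed-length COBOL fields arrive space-padded, which is why the string fields are stripped before they reach the JSON body.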

Step 2: API Gateway Configuration (Week 2)

The Kong API gateway was configured with:

  • JWT authentication (tokens issued by the mobile app's auth service)
  • Rate limiting: 100 requests per minute per user
  • Response caching: 5-second TTL for balance inquiries
  • Health check endpoint pinging the CICS region every 10 seconds
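
A Kong declarative configuration implementing those policies might look roughly like the sketch below. The service, route, upstream, and health-check path names are illustrative, not taken from GlobalBank's actual setup:

```yaml
_format_version: "2.1"
services:
  - name: account-balance            # illustrative name
    host: zosconnect-upstream        # resolved by the upstream block below
    port: 9080
    protocol: http
    routes:
      - name: balance-route
        paths:
          - /api/accounts
    plugins:
      - name: jwt                    # tokens issued by the mobile auth service
      - name: rate-limiting
        config:
          minute: 100                # 100 requests/minute, per consumer
          policy: local
      - name: proxy-cache
        config:
          cache_ttl: 5               # 5-second TTL for balance inquiries
          strategy: memory
upstreams:
  - name: zosconnect-upstream
    targets:
      - target: zosconnect.internal:9080   # placeholder hostname
    healthchecks:
      active:
        http_path: /healthz          # illustrative health endpoint
        healthy:
          interval: 10               # probe the region every 10 seconds
```

With JWT authentication in front, Kong's rate-limiting plugin counts requests per authenticated consumer by default, which is what "per user" requires here.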

Step 3: Performance Testing (Week 3)

Load testing showed the architecture comfortably met the requirements:

Metric                      Target      Actual
Average response time       < 200ms     87ms
95th percentile             < 200ms     142ms
99th percentile             < 500ms     198ms
Concurrent users            500         1,200 (before degradation)
COBOL program execution     n/a         6ms average
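
Percentile targets like these are checked against raw per-request latency samples from the load test. A minimal sketch, using the nearest-rank method (the helper and the sample values are illustrative, not GlobalBank's actual measurements):

```python
import math

def percentile(samples: list[float], pct: float) -> float:
    """Nearest-rank percentile: the value at position ceil(pct/100 * n) in sorted order."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

# Illustrative latency samples in milliseconds -- not real load-test data.
latencies = [80, 85, 87, 90, 95, 100, 110, 120, 142, 198]
p95 = percentile(latencies, 95)
p99 = percentile(latencies, 99)
```

In practice a tool such as JMeter or k6 reports these percentiles directly; the point of the sketch is that a percentile target is a claim about the sorted sample distribution, not about the average.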

The Result

The mobile app's balance inquiry became nearly instant. Customer satisfaction scores for the mobile app increased from 3.2 to 4.6 out of 5. The screen-scraping middleware was decommissioned.

Maria Chen's ACCTINQ program — unchanged since 2014 — now served mobile users alongside 3270 terminal users, processing 200,000 API calls per day on top of its existing workload.

Discussion Questions

  1. Why was z/OS Connect preferred over rewriting the COBOL program in Java?
  2. What are the risks of the 5-second response cache? In what scenarios could it cause problems?
  3. The COBOL program executes in 6ms, but the total response time is 87ms. Where does the other 81ms go?
  4. How would you handle the scenario where the mobile app needs data that ACCTINQ does not currently return (e.g., pending transactions)?