
API Contract Testing Guide for AI-Built Apps

Validate AI-generated API code against your spec. This guide covers contract testing with OpenAPI, automated validation, and CI integration.

Context Ark Team
44 min read

TL;DR: AI coding tools generate endpoints, but do they match your spec? Contract tests validate that implementation matches definition. No drift allowed.

Table of Contents

  1. What is Contract Testing?
  2. Contract Tests vs Other Tests
  3. OpenAPI-Based Testing
  4. Setting Up Tests
  5. Test Examples
  6. CI Integration
  7. Common Issues
  8. Free Resources

What is Contract Testing?

Contract testing verifies that your API implementation matches its specification (the "contract").

The Contract

Your API spec (OpenAPI/Swagger) defines:

  • Endpoint paths and methods
  • Request body schemas
  • Response schemas
  • Error formats
  • Authentication requirements

The Test

Contract tests validate that your actual API:

  • Returns the correct response shape
  • Enforces required fields
  • Handles errors as specified
  • Requires auth where specified

Why It Matters for AI Code

AI tools generate endpoints based on your spec. But they can:

  • Miss required fields
  • Return wrong shapes
  • Skip validation
  • Ignore error handling

Contract tests catch these before production.
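For example (a hypothetical drift, not taken from a real project), the spec might promise one shape while the generated handler returns another:

// What the spec promises for GET /projects (simplified):
interface ProjectsResponse {
  data: Array<{ id: string; name: string; created_at: string }>;
}

// What an AI-generated handler might actually return:
const actual = {
  projects: [{ id: 1, title: "My app" }], // wrong wrapper key, numeric id, missing fields
};

A contract test compares the second against the first and fails loudly.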


Contract Tests vs Other Tests

Test Type   | What It Tests        | Catches
Unit        | Individual functions | Logic bugs
Integration | Components together  | Wiring issues
Contract    | API matches spec     | Schema drift, missing validation
E2E         | Full user flows      | UX issues

Where Contract Tests Fit

Unit → Integration → CONTRACT → E2E → Production
                        ↑
                   Catches AI drift

Contract tests run after integration tests (which confirm the API works at all) and before E2E tests (which assume the response shape is already correct).


OpenAPI-Based Testing

The Approach

  1. Write your API spec (OpenAPI 3.x)
  2. Generate type definitions from spec
  3. Write tests that validate responses

Tools

Tool              | Purpose
Prism             | Mock + validate against OpenAPI
OpenAPI Validator | Response validation
Zod               | Runtime schema validation
TypeScript        | Compile-time type checking
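Prism, for example, can serve a working mock straight from the spec, which lets you run contract tests before any implementation exists. A minimal invocation, assuming the spec path used later in this guide:

npx @stoplight/prism-cli mock docs/api-spec.yaml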

Basic Flow

API Spec (YAML) → Generate Types → Write Tests → Validate Responses
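For the "Generate Types" step, one option (an assumption, not the only tool) is openapi-typescript, e.g. npx openapi-typescript docs/api-spec.yaml -o src/types/api.d.ts. The generated paths interface then gives you compile-time checks alongside the runtime tests; a minimal sketch, assuming that generator and output path:

// src/types/usage.ts (sketch; exact generated shape depends on the generator version)
import type { paths } from "./api";

// Compile-time view of the 200 response for GET /projects
type ProjectsListResponse =
  paths["/projects"]["get"]["responses"]["200"]["content"]["application/json"];

// If a handler's return type stops matching this, tsc fails the build
// before the contract tests even run.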

Setting Up Tests

Step 1: Install Dependencies

npm install -D @apidevtools/swagger-parser ajv ajv-formats zod vitest

Step 2: Parse Your Spec

// tests/setup.ts
import SwaggerParser from "@apidevtools/swagger-parser";

export async function loadSpec() {
  // dereference() resolves $ref pointers, so schemas can be passed to Ajv directly
  const spec = await SwaggerParser.dereference("./docs/api-spec.yaml");
  return spec;
}

Step 3: Create Validator

// tests/validate.ts
import Ajv from "ajv";
import addFormats from "ajv-formats";

const ajv = new Ajv({ allErrors: true });
addFormats(ajv);

export function createValidator(schema: object) {
  return ajv.compile(schema);
}

Step 4: Write Contract Tests

See examples below.


Test Examples

Example 1: Response Shape Validation

// tests/api/projects.contract.test.ts
import { describe, it, expect, beforeAll } from "vitest";
import type { ValidateFunction } from "ajv";
import { loadSpec } from "../setup";
import { createValidator } from "../validate";

describe("GET /api/projects", () => {
  let validateResponse: ValidateFunction;

  beforeAll(async () => {
    const spec = await loadSpec();
    const schema =
      spec.paths?.["/projects"]?.get?.responses?.["200"]?.content?.[
        "application/json"
      ]?.schema;
    validateResponse = createValidator(schema);
  });

  it("returns correct response shape", async () => {
    const response = await fetch("http://localhost:3000/api/projects");
    const data = await response.json();

    const isValid = validateResponse(data);

    if (!isValid) {
      console.error("Validation errors:", validateResponse.errors);
    }

    expect(isValid).toBe(true);
  });

  it("returns array of projects with required fields", async () => {
    const response = await fetch("http://localhost:3000/api/projects");
    const data = await response.json();

    expect(data).toHaveProperty("data");
    expect(Array.isArray(data.data)).toBe(true);

    if (data.data.length > 0) {
      const project = data.data[0];
      expect(project).toHaveProperty("id");
      expect(project).toHaveProperty("name");
      expect(project).toHaveProperty("created_at");
    }
  });
});

Example 2: Error Response Validation

describe("POST /api/projects (validation)", () => {
  it("returns 400 with correct error shape for invalid request", async () => {
    const response = await fetch("http://localhost:3000/api/projects", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ name: "" }), // Empty name
    });

    expect(response.status).toBe(400);

    const data = await response.json();

    // Per our error schema
    expect(data).toHaveProperty("error");
    expect(data).toHaveProperty("message");
    expect(typeof data.error).toBe("string");
    expect(typeof data.message).toBe("string");
  });
});
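To avoid hard-coding the error fields, you can also validate the 400 body against the spec the same way as Example 1. A sketch that assumes your spec documents a 400 response for POST /projects (it slots into the same test file and reuses its imports):

it("400 body matches the spec's error schema", async () => {
  const spec = await loadSpec();
  const schema =
    spec.paths?.["/projects"]?.post?.responses?.["400"]?.content?.[
      "application/json"
    ]?.schema;
  const validateError = createValidator(schema);

  const response = await fetch("http://localhost:3000/api/projects", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ name: "" }),
  });

  expect(response.status).toBe(400);
  expect(validateError(await response.json())).toBe(true);
});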

Example 3: Auth Requirement

describe("Protected endpoints", () => {
  it("GET /api/projects requires auth", async () => {
    const response = await fetch("http://localhost:3000/api/projects", {
      headers: {
        // No Authorization header
      },
    });

    expect(response.status).toBe(401);

    const data = await response.json();
    expect(data.error).toBe("unauthorized");
  });

  it("GET /api/projects succeeds with auth", async () => {
    const response = await fetch("http://localhost:3000/api/projects", {
      headers: {
        Authorization: `Bearer ${testToken}`,
      },
    });

    expect(response.status).toBe(200);
  });
});
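Hand-writing one file per endpoint doesn't scale, so you can also drive shape checks from the spec itself. A minimal sketch that validates every documented GET 200 response (assumptions: the server runs on localhost:3000 under /api, $refs are resolved, and auth and parameterized paths are skipped for simplicity):

// tests/api/all-endpoints.contract.test.ts (sketch)
import { describe, it, expect } from "vitest";
import { loadSpec } from "../setup";
import { createValidator } from "../validate";

describe("All documented GET endpoints", () => {
  it("return bodies matching their 200 schemas", async () => {
    const spec = await loadSpec();

    for (const [path, item] of Object.entries(spec.paths ?? {})) {
      const schema = (item as any)?.get?.responses?.["200"]?.content?.[
        "application/json"
      ]?.schema;
      if (!schema) continue;            // no documented JSON 200 response
      if (path.includes("{")) continue; // skip parameterized paths in this sketch

      const validate = createValidator(schema);
      const response = await fetch(`http://localhost:3000/api${path}`);
      const body = await response.json();

      expect(validate(body), `Schema mismatch for GET ${path}`).toBe(true);
    }
  });
});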

CI Integration

GitHub Actions

# .github/workflows/contract-tests.yml
name: Contract Tests

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

jobs:
  contract-tests:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - name: Setup Node
        uses: actions/setup-node@v4
        with:
          node-version: "20"
          cache: "npm"

      - name: Install dependencies
        run: npm ci

      - name: Start server
        run: npm run dev &
        env:
          DATABASE_URL: ${{ secrets.TEST_DATABASE_URL }}

      - name: Wait for server
        run: npx wait-on http://localhost:3000/api/health

      - name: Run contract tests
        run: npm run test:contract

Package.json Script

{
  "scripts": {
    "test:contract": "vitest run --config vitest.contract.config.ts"
  }
}
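That script points at a separate config so contract tests can be run (and gated) on their own. A minimal sketch of what vitest.contract.config.ts might contain (the file-naming convention here is an assumption):

// vitest.contract.config.ts (sketch)
import { defineConfig } from "vitest/config";

export default defineConfig({
  test: {
    include: ["tests/**/*.contract.test.ts"],
    // These tests hit a running server, so allow more time than unit tests
    testTimeout: 15_000,
  },
});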

Common Issues

Issue 1: Schema Drift

Symptom: Tests pass but production breaks.

Cause: Spec and implementation diverged.

Fix:

  • Generate types from spec
  • Run contract tests on every PR
  • Block merge if tests fail

Issue 2: Missing Nullable Fields

Symptom: null values cause validation failures.

Cause: Spec says type: string but implementation returns null.

Fix:

# OpenAPI 3.0
description:
  type: string
  nullable: true

# OpenAPI 3.1 (nullable was removed; use a type array instead)
description:
  type: ["string", "null"]

Note that nullable is an OpenAPI 3.0 keyword, not standard JSON Schema, so a plain Ajv setup may not honor it; the type-array form (or preprocessing the schema before compiling) is the more reliable fix for validation.

Issue 3: Date Format Mismatch

Symptom: Dates fail validation.

Cause: Spec says format: date-time but implementation returns different format.

Fix:

  • Standardize on ISO 8601
  • Use addFormats with Ajv

Issue 4: Extra Fields

Symptom: Response has fields not in spec.

Decision:

  • Strict: Fail on extra fields (additionalProperties: false)
  • Loose: Allow extra fields (default)

For validating AI-generated code, strict is better: it catches hallucinated fields.
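If you want strict checks without hand-editing every schema, one option is to tighten schemas at test time. A sketch of a local helper (not an Ajv feature) that recursively sets additionalProperties: false before compiling:

// tests/strict.ts (sketch)
// Force additionalProperties: false on object schemas so extra fields fail validation.
export function strict(schema: any): any {
  if (schema === null || typeof schema !== "object") return schema;
  if (Array.isArray(schema)) return schema.map(strict);

  const copy: any = { ...schema };
  if (copy.type === "object" && copy.additionalProperties === undefined) {
    copy.additionalProperties = false;
  }
  if (copy.properties) {
    copy.properties = Object.fromEntries(
      Object.entries(copy.properties).map(([key, value]) => [key, strict(value)])
    );
  }
  if (copy.items) copy.items = strict(copy.items);
  for (const combinator of ["allOf", "anyOf", "oneOf"]) {
    if (Array.isArray(copy[combinator])) copy[combinator] = copy[combinator].map(strict);
  }
  return copy;
}

// Usage: createValidator(strict(schema))
// Caveat: additionalProperties: false inside allOf can over-reject; adjust for your spec.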


Free Resources

Templates

Related Guides


Start with the spec, test against the spec. Generate your API spec →


Last updated: January 2026

Tags: testing, API, contracts, automation