AutoMock πŸ§ͺ⚑

License: MIT · Go · AWS

AutoMock is a cloud-native CLI tool that generates and deploys production-ready mock API servers from natural language descriptions, API collections, or an interactive builder. Spin up fully managed mock servers on AWS with auto-scaling, monitoring, and built-in load testing support.

Cloud provider support is currently AWS-only. GCP and Azure are planned but not yet available.


🌟 Highlights

  • πŸ€– AI-Generated Mocks β€” Describe your API in natural language, get complete MockServer configurations
  • ☁️ Cloud-Native Deployment β€” One command deploys ECS Fargate + ALB + Auto-scaling
  • πŸ“¦ Multi-Format Import β€” Postman, Bruno, Insomnia collections β†’ MockServer expectations
  • πŸ”§ Interactive Builder β€” 7-step guided builder for precise control
  • ⚑ Auto-Scaling β€” CPU/Memory/Request-based (configurable; defaults: 10–200 tasks)
  • πŸ’Ύ Cloud Storage β€” S3-backed, versioned, team-accessible expectations
  • 🎭 Advanced Features β€” Progressive delays, GraphQL, response templates, response limits
  • πŸ§ͺ Load Testing β€” Locust bundle generation and managed AWS deployment
  • πŸ” Production-Ready β€” ALB health checks, CloudWatch monitoring, IAM best practices

πŸ—ΊοΈ Installation

Option A β€” Homebrew (macOS / Linux)

brew tap hemantobora/tap
brew install automock
automock --version

Or directly: brew install hemantobora/tap/automock

Option B β€” Scoop (Windows)

scoop bucket add hemantobora https://github.com/hemantobora/scoop-bucket
scoop install automock
automock --version

Option C β€” Download release binary

  1. Go to the Releases page
  2. Download the archive for your OS/arch (e.g., automock_darwin_arm64.tar.gz)
  3. Extract and place the binary on your PATH:
tar -xzf automock_*.tar.gz
sudo mv automock /usr/local/bin/
automock --version

Option D β€” Build from source

Requires Go 1.22+:

git clone https://github.com/hemantobora/auto-mock.git
cd auto-mock
go build -o automock ./cmd/auto-mock

Upgrading

  • Homebrew: brew upgrade automock
  • Scoop: scoop update automock
  • Binary: download the newer version and replace the existing binary

πŸš€ Quick Start

# Set your AI provider key (choose one)
export ANTHROPIC_API_KEY="sk-ant-..."
export OPENAI_API_KEY="sk-..."

# Generate mock expectations
automock init --project user-api --provider anthropic

# Deploy to AWS
automock deploy --project user-api

Your mock API is now live at the ALB endpoint shown after deploy.


πŸ“– Documentation

Document            Description
GETTING_STARTED.md  Complete setup guide, tutorials, examples
automock help       CLI reference

✨ Key Features

πŸ€– AI-Powered Generation

Generate complete MockServer configurations from natural language:

automock init --project my-api --provider anthropic

Prompt: "User management API with registration, login, profile CRUD, password reset, and admin functions"

AI generates:

  • All CRUD endpoints (GET /users, POST /users, PUT /users/{id}, etc.)
  • Authentication flows (login, logout, token refresh)
  • Admin-only endpoints with authorization
  • Error responses (400, 401, 403, 404, 500)
  • Realistic test data with proper types
  • Request validation rules
  • Multiple scenarios per endpoint
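Generated expectations follow MockServer's standard JSON expectation format. A simplified, illustrative example of what one generated expectation might look like (field values are hypothetical):

```json
{
  "httpRequest": {
    "method": "POST",
    "path": "/users",
    "body": {
      "type": "JSON",
      "json": {"email": "jane@example.com", "name": "Jane Doe"},
      "matchType": "ONLY_MATCHING_FIELDS"
    }
  },
  "httpResponse": {
    "statusCode": 201,
    "headers": {"Content-Type": ["application/json"]},
    "body": {"id": "a1b2c3", "email": "jane@example.com", "name": "Jane Doe"}
  }
}
```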

Supported AI Providers:

  • Anthropic (Claude Sonnet 4.5)
  • OpenAI (GPT-4)
  • Template (no AI, fallback mode)

πŸ“¦ Collection Import

Import existing API definitions from popular tools:

automock init \
  --project api-mock \
  --collection-file api.postman_collection.json \
  --collection-type postman

Supported Formats:

  • Postman Collection v2.1 (.json)
  • Bruno Collection (.json)
  • Insomnia Workspace (.json)

Features:

  • Sequential API execution with variable resolution
  • Interactive matching configuration (guided; no automatic scenario inference)
  • Auto-incremented priorities to avoid collisions
  • Pre/post-script processing (Postman-like JS via embedded engine)
  • Auth mapping to headers when provided in the collection

Example: Multi-Scenario Configuration

GET /api/users/123:
  Priority 100: Anonymous β†’ 401 Unauthorized
  Priority 200: Authenticated β†’ 200 OK (user data)
  Priority 300: Admin β†’ 200 OK (admin view)
  Priority 400: Rate limited β†’ 429 Too Many Requests
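In MockServer, higher-priority expectations are evaluated first, so layered scenarios like the above become separate expectations that differ in priority and matching constraints. An illustrative sketch of the first two tiers:

```json
[
  {
    "priority": 100,
    "httpRequest": {"method": "GET", "path": "/api/users/123"},
    "httpResponse": {"statusCode": 401, "body": {"error": "Unauthorized"}}
  },
  {
    "priority": 200,
    "httpRequest": {
      "method": "GET",
      "path": "/api/users/123",
      "headers": {"Authorization": ["Bearer .*"]}
    },
    "httpResponse": {"statusCode": 200, "body": {"id": "123", "name": "Jane Doe"}}
  }
]
```

A request without an Authorization header only matches the priority-100 expectation; an authenticated request matches the priority-200 one first.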

πŸ”§ Interactive Builder

Precision-controlled, step-by-step expectation creation:

automock init --project my-api
# Select: interactive

7-Step Process:

  1. Basic Info β€” Description, priority, tags
  2. Request Matching β€” Method, path, query params, headers
  3. Response Configuration β€” Status code, headers, body templates
  4. Advanced Features β€” Delays, caching, compression
  5. Connection Options β€” Socket config, keep-alive
  6. Response Limits β€” Serve unlimited times or expire after N requests
  7. Review & Confirm β€” Validate before saving

Advanced Request Matching:

  • Path parameters: /users/{id}/orders/{orderId}
  • Regex paths: /api/.*/status
  • Query string matching: ?status=active&limit=10
  • Header validation: Authorization: Bearer *
  • Body matching: exact, partial, regex, JSONPath
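These matchers map onto MockServer's request matcher schema, where string values in paths, headers, and query parameters are treated as regular expressions. An illustrative example combining several of them:

```json
{
  "httpRequest": {
    "method": "GET",
    "path": "/api/.*/status",
    "queryStringParameters": {"status": ["active"], "limit": ["10"]},
    "headers": {"Authorization": ["Bearer .*"]}
  },
  "httpResponse": {
    "statusCode": 200,
    "body": {"status": "active"}
  }
}
```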

Response Features:

  • Template variables: $!uuid, $!now_epoch, $!request.headers['X-Request-ID'][0]
  • Progressive delays: 100ms β†’ 150ms β†’ 200ms…
  • Multiple response bodies per expectation

☁️ Cloud Deployment

Deploy production-ready infrastructure with one command:

automock deploy --project my-api

What Gets Deployed:

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚  Application Load Balancer (Public)     β”‚
β”‚  https://automock-{project}-{id}.elb…  β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
              β”‚
    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
    β”‚  Target Groups      β”‚
    β”‚  β€’ API (/)          β”‚
    β”‚  β€’ Dashboard        β”‚
    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
              β”‚
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚  ECS Fargate Cluster                     β”‚
β”‚  β€’ MockServer (port 1080)                β”‚
β”‚  β€’ Config Loader (sidecar)               β”‚
β”‚  β€’ Auto-scaling: configurable            β”‚
β”‚    (defaults: 10–200 tasks)              β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
              β”‚
    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
    β”‚  S3 Bucket          β”‚
    β”‚  expectations.json  β”‚
    β”‚  (versioned)        β”‚
    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜

Infrastructure Features:

  • Auto-Scaling β€” CPU/Memory/Request-based (10–200 tasks by default)
  • Monitoring β€” CloudWatch metrics, logs, alarms
  • Health Checks β€” ALB target health, /mockserver/status
  • Security β€” IAM roles, security groups, private subnets
  • Custom Domain β€” ACM certificate + Route53 (optional, prompted at deploy time)
  • Private ALB β€” Optional internal ALB for VPC-internal clients (e.g., Locust)
  • BYO Networking β€” Use existing VPC, subnets, IGW, NAT, IAM roles, and security groups

Accessing Your Mock:

# API endpoint
curl https://automock-my-api-123.us-east-1.elb.amazonaws.com/api/users

# MockServer Dashboard
open https://automock-my-api-123.us-east-1.elb.amazonaws.com/mockserver/dashboard

πŸ“Š Project Management

Manage expectations throughout their lifecycle:

# View / add / edit / remove / replace / download / delete
automock init --project my-api
# β†’ Select the desired action from the menu
Action     Description
view       List all expectations
add        Add new expectations (any generation mode)
edit       Edit specific expectations
remove     Remove selected expectations
replace    Replace all expectations
download   Save to {project}-expectations.json
delete     Tear down project and all infrastructure

πŸ§ͺ Load Testing β€” Local

Generate Locust load testing bundles from a collection:

automock load \
  --collection-file api.json \
  --collection-type postman \
  --dir ./load-tests \
  --distributed

cd load-tests
./run_locust_ui.sh
# Browser opens at http://localhost:8089

Generated Files:

  • locustfile.py β€” Test scenarios
  • requirements.txt β€” Python dependencies
  • run_locust_ui.sh / .ps1 β€” Start with web UI
  • run_locust_headless.sh / .ps1 β€” Run without UI
  • run_locust_master.sh / .ps1 β€” Distributed master
  • run_locust_worker.sh / .ps1 β€” Distributed worker

Variable substitution:

  • ${env.VAR} β€” Expanded at load-time across the spec
  • ${data.<field>} and ${user.id|index} β€” Expanded at runtime in path, headers, params, and body
  • In auth.mode: shared, only ${env.*} expands; in auth.mode: per_user, ${data.*} and ${user.*} also expand in the login path/headers/body
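The load-time ${env.*} expansion behaves like a simple placeholder substitution over the spec text. A minimal Python sketch of that semantics (an illustration, not AutoMock's actual implementation):

```python
import os
import re

_ENV_PATTERN = re.compile(r"\$\{env\.([A-Za-z_][A-Za-z0-9_]*)\}")

def expand_env(text: str) -> str:
    """Replace each ${env.VAR} with its environment value.

    Unknown variables are left untouched rather than expanded to empty strings.
    """
    return _ENV_PATTERN.sub(
        lambda m: os.environ.get(m.group(1), m.group(0)), text
    )
```

Runtime placeholders like ${data.<field>} and ${user.id|index} are resolved later, per virtual user, against the test's data set.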

☁️ Managed Locust on AWS

Deploy a Locust cluster to AWS via the same deploy command. The stack provisions an ECS Fargate cluster (master + configurable workers), an ALB for the Locust UI, and Cloud Map service discovery between master and workers.

Deploy:

# Upload your load test bundle first
automock load --project my-api --upload --dir ./load-tests

# Deploy infrastructure (prompts for sizing and optional BYO networking)
automock deploy --project my-api

Scale workers:

automock deploy --project my-api
# β†’ When already deployed, prompts for new worker count

Destroy:

automock destroy --project my-api
# β†’ Select: mocks, loadtest, or both

What you get:

  • Public ALB with HTTP and HTTPS access to the Locust master UI (self-signed cert)
  • ECS task definitions for master and workers
  • Cloud Map private namespace for master–worker discovery
  • CloudWatch log groups with configurable retention
  • Security groups with least-privilege rules

Custom Locust image (optional):

By default the module uses public images (locustio/locust:2.31.2 + python:3.11-slim init sidecar). If you prefer faster cold-starts without runtime installs, build a derived image with Locust and boto3 pre-installed and set the locust_container_image Terraform variable before deploying.

Troubleshooting β€” locustfile.py not found:

If Locust logs show Could not find '/workspace/locustfile.py', check:

  1. No bundle has been uploaded yet β€” run automock load --project <name> --upload --dir ./load-tests
  2. The pointer file current.json is missing β€” re-uploading a bundle recreates it
  3. IAM permissions missing: the task role needs s3:GetObject and s3:ListBucket on the project bucket

🎯 Use Cases

Frontend Development

Mock backend APIs before they exist:

automock init --project frontend-mock --provider anthropic
# Describe: "REST API for blog app: posts, comments, users"
automock deploy --project frontend-mock

Integration Testing

Consistent, controlled test environments in CI/CD:

automock deploy --project test-api --skip-confirmation
npm run test:integration -- --api-url https://automock-test-api-123.elb.amazonaws.com
automock destroy --project test-api --force

Third-Party API Simulation

Test against external APIs without rate limits or costs:

automock init \
  --project stripe-mock \
  --collection-file stripe-api.postman_collection.json \
  --collection-type postman

Performance Testing

Generate and deploy load tests against your mock:

automock load \
  --collection-file prod-api.json \
  --collection-type postman \
  --dir ./load-tests
automock load --project prod-api --upload --dir ./load-tests
automock deploy --project prod-api

πŸ’° Cost Estimates

Default AWS Infrastructure (10 tasks, 24/7)

min_tasks and max_tasks are both configurable at deploy time. The table below reflects the defaults. Using BYO networking (existing VPC, subnets, and NAT) eliminates the NAT Gateway cost.

Component                          Monthly Cost
ECS Fargate (0.25 vCPU, 0.5 GB)    ~$35
Application Load Balancer          ~$16
NAT Gateway                        ~$32
Data Transfer                      ~$9
CloudWatch Logs                    ~$0.50
S3 Storage                         ~$0.30
Total                              ~$93

Rough estimates; varies by region, traffic, and log volume. Validate with the AWS Pricing Calculator.

Hourly rate (10 tasks): ~$0.13/hour
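The figures are internally consistent: summing the table's line items gives the stated total, and dividing by roughly 730 hours per month gives the stated hourly rate. A quick check in Python:

```python
# Monthly line items from the cost table above (USD).
costs = {
    "ECS Fargate (10 tasks)": 35.0,
    "Application Load Balancer": 16.0,
    "NAT Gateway": 32.0,
    "Data Transfer": 9.0,
    "CloudWatch Logs": 0.50,
    "S3 Storage": 0.30,
}

total = sum(costs.values())   # 92.8 -> rounds to ~$93/month
hourly = total / 730          # ~730 hours in a month -> ~$0.13/hour
print(f"~${round(total)}/month, ~${hourly:.2f}/hour")
# → ~$93/month, ~$0.13/hour
```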

AI Generation Costs

Provider           Cost per Generation
Claude Sonnet 4.5  $0.05 – $0.20
GPT-4              $0.10 – $0.30

Cost tips: Destroy when not in use (automock destroy --project <name>), reduce task count for smaller APIs, or use BYO networking to share existing NAT Gateways.


πŸ—οΈ Infrastructure Details

Auto-Scaling Policies

Scale Up (Aggressive):

  • CPU 70–80% β†’ +50% tasks
  • CPU 80–90% β†’ +100% tasks
  • CPU 90%+ β†’ +200% tasks
  • Memory thresholds follow the same pattern
  • Requests/min: 500–1000 β†’ +50%, 1000+ β†’ +100%

Scale Down (Conservative):

  • CPU < 40% for 5 minutes β†’ βˆ’25% tasks
  • Cooldown: 5 minutes between scale events

Limits: minimum 10 tasks, maximum 200 tasks (both configurable)
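The scale-up thresholds above amount to a step function over CPU utilization. A small illustrative sketch of that mapping (not the actual CloudWatch policy definitions):

```python
def cpu_scale_up_pct(cpu_pct: float) -> int:
    """Percentage of additional tasks to add for a given CPU utilization.

    Mirrors the step thresholds documented above: 70-80% -> +50%,
    80-90% -> +100%, 90%+ -> +200%.
    """
    if cpu_pct >= 90:
        return 200
    if cpu_pct >= 80:
        return 100
    if cpu_pct >= 70:
        return 50
    return 0  # below 70% CPU, no scale-up action fires
```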

Monitoring & Alerts

CloudWatch Metrics: ECS CPU/memory/task count, ALB request count/response time/4xx/5xx errors

Alarms: unhealthy host count > 0, 5XX errors > 10/min, CPU > 70% for 10 min, Memory > 80% for 10 min

Security

  • IAM β€” Least-privilege, separate task execution and task roles, no hardcoded credentials; optional permissions boundary and custom role path
  • Networking β€” ECS tasks in private subnets, NAT for outbound only, security groups restrict traffic to ALB
  • Data β€” S3 server-side encryption (AES-256), versioning enabled, CloudWatch Logs retention: 30 days

πŸ”§ Advanced Features

Progressive Response Delays

Simulate degrading performance:

{
  "progressive": {
    "base": 100,
    "step": 50,
    "cap": 500
  }
}

Request 1: 100ms β†’ Request 2: 150ms β†’ … β†’ capped at 500ms
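The delay for the n-th request is base + step × (n − 1), clamped at cap. A one-line Python equivalent of that formula:

```python
def progressive_delay_ms(n: int, base: int = 100, step: int = 50, cap: int = 500) -> int:
    """Delay (ms) applied to the n-th request under a progressive-delay config."""
    return min(base + step * (n - 1), cap)
```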

Response Templates

Dynamic values in responses:

{
  "id": "$!uuid",
  "timestamp": "$!now_epoch",
  "requestId": "$!request.headers['x-request-id'][0]",
  "userId": "$!request.pathParameters['userId'][0]",
  "randomScore": "$!rand_int_100"
}

Available variables: $!uuid, $!now_epoch, $!rand_int_100, $!rand_bytes_64, $!request.path, $!request.method, $!request.headers['X-Header'][0], $!request.pathParameters['param'][0], $!request.queryStringParameters['query'][0]

GraphQL Support

Basic GraphQL request matching (no schema validation):

{
  "httpRequest": {
    "method": "POST",
    "path": "/graphql",
    "body": {
      "query": {"contains": "query GetUser"},
      "variables": {"userId": "123"}
    }
  },
  "httpResponse": {
    "body": {"data": {"user": {"id": "123", "name": "John"}}}
  }
}

Supported: query string contains, operation name matching, optional variables matching (exact).


πŸ“‚ Project Structure

auto-mock/
β”œβ”€β”€ cmd/auto-mock/           # CLI entrypoint
β”œβ”€β”€ internal/
β”‚   β”œβ”€β”€ cloud/               # Cloud provider abstraction
β”‚   β”‚   β”œβ”€β”€ aws/             # AWS implementation (S3, ECS, IAM)
β”‚   β”‚   β”œβ”€β”€ factory.go       # Provider detection & initialization
β”‚   β”‚   └── manager.go       # Orchestration & workflows
β”‚   β”œβ”€β”€ mcp/                 # AI provider integration (Anthropic, OpenAI)
β”‚   β”œβ”€β”€ builders/            # Interactive expectation builders
β”‚   β”œβ”€β”€ collections/         # Collection parsers (Postman, Bruno, Insomnia)
β”‚   β”œβ”€β”€ expectations/        # Expectation CRUD operations
β”‚   β”œβ”€β”€ repl/                # Interactive CLI flows
β”‚   β”œβ”€β”€ terraform/           # Embedded infrastructure modules
β”‚   └── models/              # Data structures
β”œβ”€β”€ go.mod
β”œβ”€β”€ build.sh
β”œβ”€β”€ README.md
└── LICENSE

πŸ“Š Roadmap

  • AWS support (S3, ECS, ALB)
  • AI-powered mock generation (Claude, GPT-4)
  • Collection import (Postman, Bruno, Insomnia)
  • Interactive builder
  • Auto-scaling infrastructure
  • CloudWatch monitoring
  • Locust load testing (local + managed AWS)
  • Custom domain support (ACM + Route53)
  • Private ALB for VPC-internal clients
  • BYO networking (VPC, subnets, IAM, security groups)
  • Azure provider support
  • GCP provider support
  • Swagger/OpenAPI import
  • Bruno .bru file format
  • Web UI for expectation management
  • Prometheus metrics export
  • Multiple region deployment
  • Docker Compose local deployment

πŸ› οΈ Development

git clone https://github.com/hemantobora/auto-mock.git
cd auto-mock
go mod download
go build -o automock ./cmd/auto-mock

# Run tests
go test ./...

🀝 Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

Areas we'd love help with: Azure/GCP provider support, Swagger/OpenAPI import, Bruno .bru format, Web UI for expectation management, Prometheus metrics export.


πŸ“„ License

This project is licensed under the MIT License β€” see the LICENSE file for details.


πŸ™ Acknowledgments

  • MockServer β€” Powerful HTTP mocking server
  • Anthropic β€” Claude AI for intelligent mock generation
  • OpenAI β€” GPT-4 for intelligent mock generation
  • AWS β€” Cloud infrastructure platform
  • Terraform β€” Infrastructure as Code

πŸ“ž Support


Built with ❀️ by Hemanto Bora

GitHub β€’ Issues
