OpenAPI-MCP

Dockerized MCP server that transforms OpenAPI/Swagger specs into MCP-compatible tools.

150 Stars · 31 Forks · 150 Watchers · 2 Issues
OpenAPI-MCP is a dockerized server that reads Swagger/OpenAPI specification files and generates corresponding Model Context Protocol (MCP) toolsets. It enables MCP-compatible clients to interact dynamically with APIs described by OpenAPI specs, automatically generating the necessary tool schemas and facilitating secure API key handling. The solution supports both local and remote API specs, offers filtering by tags and operations, and can be easily deployed using Docker.
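
A minimal quick-start sketch using the pre-built Docker Hub image and a public spec URL as a placeholder (add the --api-key-* flags described in the README below if the target API requires a key):

bash
docker pull ckanthony/openapi-mcp:latest
docker run -p 8080:8080 --rm \
    ckanthony/openapi-mcp:latest \
    --spec https://petstore.swagger.io/v2/swagger.json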

Key Features

Automatic generation of MCP tools from OpenAPI/Swagger specifications
Supports both OpenAPI v2 and v3 formats
Secure API key management and injection
Dockerized for straightforward containerized deployment
Filters for including or excluding specific operations or tags
Supports both local and remote API specs
Custom header injection via environment variables
Automatic server URL detection and override
No manual endpoint configuration needed
MCP-compliant for integration with AI assistants

Use Cases

Enabling AI agents to access any API with OpenAPI or Swagger documentation
Rapid onboarding of third-party APIs into MCP-compatible environments
Securing API key usage without exposing credentials to clients or agents
Generating standardized tool interfaces for AI model integrations
Creating custom toolsets for LLM-based agents using existing API specs
Prototyping and testing new AI-powered tools leveraging public APIs
Providing a bridge from open API specifications to model-based context protocols
Facilitating context-aware API tool generation for multiplatform agents
Deploying self-serve API transformation services within enterprise environments
Streamlining access control for APIs without manual configuration

README

OpenAPI-MCP: Dockerized MCP Server to allow your AI agent to access any API with existing API docs

Generate MCP tool definitions directly from a Swagger/OpenAPI specification file.

OpenAPI-MCP is a dockerized MCP server that reads a swagger.json or openapi.yaml file and generates a corresponding Model Context Protocol (MCP) toolset. This allows MCP-compatible clients like Cursor to interact with APIs described by standard OpenAPI specifications. Now you can enable your AI agent to access any API by simply providing its OpenAPI/Swagger specification - no additional coding required.

Demo

Run the demo yourself: Running the Weatherbit Example (Step-by-Step)

Why OpenAPI-MCP?

  • Standard Compliance: Leverage your existing OpenAPI/Swagger documentation.
  • Automatic Tool Generation: Create MCP tools without manual configuration for each endpoint.
  • Flexible API Key Handling: Securely manage API key authentication for the proxied API without exposing keys to the MCP client.
  • Local & Remote Specs: Works with local specification files or remote URLs.
  • Dockerized Tool: Easily deploy and run as a containerized service with Docker.

Features

  • OpenAPI v2 (Swagger) & v3 Support: Parses standard specification formats.
  • Schema Generation: Creates MCP tool schemas from OpenAPI operation parameters and request/response definitions.
  • Secure API Key Management:
    • Injects API keys into requests (header, query, path, cookie) based on command-line configuration.
    • Loads API keys directly from flags (--api-key), environment variables (--api-key-env), or .env files located alongside local specs.
    • Keeps API keys hidden from the end MCP client (e.g., the AI assistant).
  • Server URL Detection: Uses server URLs from the spec as the base for tool interactions (can be overridden).
  • Filtering: Options to include/exclude specific operations or tags (--include-tag, --exclude-tag, --include-op, --exclude-op).
  • Request Header Injection: Pass custom headers (e.g., for additional auth, tracing) via the REQUEST_HEADERS environment variable.
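
As an illustration of how the filtering and key-injection flags combine, here is a hedged sketch that exposes only operations tagged "weather", excludes one operation ID, and injects a key read from the API_KEY environment variable (the spec path, tag, and operation ID are placeholders):

bash
docker run -p 8080:8080 --rm \
    -v $(pwd)/my-api:/app/spec \
    --env-file $(pwd)/my-api/.env \
    ckanthony/openapi-mcp:latest \
    --spec /app/spec/openapi.json \
    --include-tag weather \
    --exclude-op deleteStation \
    --api-key-env API_KEY \
    --api-key-name X-API-Key \
    --api-key-loc header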

Installation

Docker

The recommended way to run this tool is via Docker.

Using the Pre-built Docker Hub Image (Recommended)

You can use the pre-built image available on Docker Hub instead of building it yourself.

  1. Pull the Image:
    bash
    docker pull ckanthony/openapi-mcp:latest
    
  2. Run the Container: Follow the docker run examples below (under Building Locally), but replace openapi-mcp:latest with ckanthony/openapi-mcp:latest.

Building Locally (Optional)

  1. Build the Docker Image Locally:

    bash
    # Navigate to the repository root
    cd openapi-mcp
    # Build the Docker image (tag it as you like, e.g., openapi-mcp:latest)
    docker build -t openapi-mcp:latest .
    
  2. Run the Container: You need to provide the OpenAPI specification and any necessary API key configuration when running the container.

    • Example 1: Using a local spec file and .env file:

      • Create a directory (e.g., ./my-api) containing your openapi.json or swagger.yaml.
      • If the API requires a key, create a .env file in the same directory (e.g., ./my-api/.env) with API_KEY=your_actual_key (replace API_KEY if your --api-key-env flag is different).
      bash
      docker run -p 8080:8080 --rm \
          -v $(pwd)/my-api:/app/spec \
          --env-file $(pwd)/my-api/.env \
          openapi-mcp:latest \
          --spec /app/spec/openapi.json \
          --api-key-env API_KEY \
          --api-key-name X-API-Key \
          --api-key-loc header
      

      (Adjust --spec, --api-key-env, --api-key-name, --api-key-loc, and -p as needed.)

    • Example 2: Using a remote spec URL and direct environment variable:

      bash
      docker run -p 8080:8080 --rm \
          -e SOME_API_KEY="your_actual_key" \
          openapi-mcp:latest \
          --spec https://petstore.swagger.io/v2/swagger.json \
          --api-key-env SOME_API_KEY \
          --api-key-name api_key \
          --api-key-loc header
      
    • Key Docker Run Options:

      • -p <host_port>:8080: Map a port on your host to the container's default port 8080.
      • --rm: Automatically remove the container when it exits.
      • -v <host_path>:<container_path>: Mount a local directory containing your spec into the container. Use absolute paths or $(pwd)/.... Common container path: /app/spec.
      • --env-file <path_to_host_env_file>: Load environment variables from a local file (for API keys, etc.). Path is on the host.
      • -e <VAR_NAME>="<value>": Pass a single environment variable directly.
      • openapi-mcp:latest: The name of the image you built locally.
      • --spec ...: Required. Path to the spec file inside the container (e.g., /app/spec/openapi.json) or a public URL.
      • --port 8080: (Optional) Change the internal port the server listens on (must match the container port in -p).
      • --api-key-env, --api-key-name, --api-key-loc: Required if the target API needs an API key.
      • (See --help for all command-line options by running docker run --rm openapi-mcp:latest --help)

Running the Weatherbit Example (Step-by-Step)

This repository includes an example using the Weatherbit API. Here's how to run it using the public Docker image:

  1. Find OpenAPI Specs (Optional Knowledge): Many public APIs have their OpenAPI/Swagger specifications available online. A great resource for discovering them is APIs.guru. The Weatherbit specification used in this example (weatherbitio-swagger.json) was sourced from there.

  2. Get a Weatherbit API Key:

    • Go to Weatherbit.io and sign up for an account (they offer a free tier).
    • Find your API key in your Weatherbit account dashboard.
  3. Clone this Repository: You need the example files from this repository.

    bash
    git clone https://github.com/ckanthony/openapi-mcp.git
    cd openapi-mcp
    
  4. Prepare Environment File:

    • Navigate to the example directory: cd example/weather
    • Copy the example environment file: cp .env.example .env
    • Edit the new .env file and replace YOUR_WEATHERBIT_API_KEY_HERE with the actual API key you obtained from Weatherbit.
  5. Run the Docker Container: From the openapi-mcp root directory (the one containing the example folder), run the following command:

    bash
    docker run -p 8080:8080 --rm \
        -v $(pwd)/example/weather:/app/spec \
        --env-file $(pwd)/example/weather/.env \
        ckanthony/openapi-mcp:latest \
        --spec /app/spec/weatherbitio-swagger.json \
        --api-key-env API_KEY \
        --api-key-name key \
        --api-key-loc query
    
    • -v $(pwd)/example/weather:/app/spec: Mounts the local example/weather directory (containing the spec and .env file) to /app/spec inside the container.
    • --env-file $(pwd)/example/weather/.env: Tells Docker to load environment variables (specifically API_KEY) from your .env file.
    • ckanthony/openapi-mcp:latest: Uses the public Docker image.
    • --spec /app/spec/weatherbitio-swagger.json: Points to the spec file inside the container.
    • The --api-key-* flags configure how the tool should inject the API key (read from the API_KEY env var, named key, placed in the query string).
  6. Access the MCP Server: The MCP server should now be running and accessible at http://localhost:8080 for compatible clients.
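
As a quick sanity check (not a full MCP handshake), you can confirm the container is listening on the mapped port; the exact HTTP response depends on the server, but a "connection refused" error means the container is not running:

bash
# Any HTTP response (even an error status) confirms the server is listening.
curl -i http://localhost:8080/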

Using Docker Compose (Example):

A docker-compose.yml file is provided in the example/ directory to demonstrate running the Weatherbit API example using the locally built image.

  1. Prepare Environment File: Copy example/weather/.env.example to example/weather/.env and add your actual Weatherbit API key:

    dotenv
    # example/weather/.env
    API_KEY=YOUR_ACTUAL_WEATHERBIT_KEY
    
  2. Run with Docker Compose: Navigate to the example directory and run:

    bash
    cd example
    # This builds the image locally based on ../Dockerfile
    # It does NOT use the public Docker Hub image
    docker-compose up --build
    
    • --build: Forces Docker Compose to build the image using the Dockerfile in the project root before starting the service.
    • Compose will read example/docker-compose.yml, build the image, mount ./weather, read ./weather/.env, and start the openapi-mcp container with the specified command-line arguments.
    • The MCP server will be available at http://localhost:8080.
  3. Stop the service: Press Ctrl+C in the terminal where Compose is running, or run docker-compose down from the example directory in another terminal.
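
If you prefer to keep the service running in the background, the standard Compose workflow applies (a sketch; the service itself is defined in example/docker-compose.yml):

bash
cd example
# Build the image and start the service detached
docker-compose up --build -d
# Follow the container logs
docker-compose logs -f
# Stop and remove the container when finished
docker-compose down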

Command-Line Options

The openapi-mcp command accepts the following flags:

| Flag | Description | Type | Default |
|------|-------------|------|---------|
| --spec | Required. Path or URL to the OpenAPI specification file. | string | (none) |
| --port | Port to run the MCP server on. | int | 8080 |
| --api-key | Direct API key value (use --api-key-env or a .env file instead for security). | string | (none) |
| --api-key-env | Environment variable name containing the API key. If the spec is local, a .env file in the spec's directory is also checked. | string | (none) |
| --api-key-name | Required if a key is used. Name of the API key parameter (header, query, path, or cookie name). | string | (none) |
| --api-key-loc | Required if a key is used. Location of the API key: header, query, path, or cookie. | string | (none) |
| --include-tag | Tag to include (can be repeated). If include flags are used, only included items are exposed. | string slice | (none) |
| --exclude-tag | Tag to exclude (can be repeated). Exclusions apply after inclusions. | string slice | (none) |
| --include-op | Operation ID to include (can be repeated). | string slice | (none) |
| --exclude-op | Operation ID to exclude (can be repeated). | string slice | (none) |
| --base-url | Manually override the target API server base URL detected from the spec. | string | (none) |
| --name | Default name for the generated MCP toolset (used if the spec has no title). | string | "OpenAPI-MCP Tools" |
| --desc | Default description for the generated MCP toolset (used if the spec has no description). | string | "Tools generated from OpenAPI spec" |

Note: You can get this list by running the tool with the --help flag (e.g., docker run --rm ckanthony/openapi-mcp:latest --help).
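
A hedged sketch showing the override flags from the table above (the base URL, name, and description values are placeholders):

bash
docker run -p 8080:8080 --rm \
    ckanthony/openapi-mcp:latest \
    --spec https://petstore.swagger.io/v2/swagger.json \
    --base-url https://staging.example.com/v2 \
    --name "Petstore (staging)" \
    --desc "Petstore tools pointed at the staging server"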

Environment Variables

  • REQUEST_HEADERS: Set this environment variable to a JSON string (e.g., '{"X-Custom": "Value"}') to add custom headers to all outgoing requests to the target API.
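
For example, custom headers can be passed when starting the container; a sketch with a placeholder header name and value:

bash
docker run -p 8080:8080 --rm \
    -e REQUEST_HEADERS='{"X-Request-Source": "openapi-mcp"}' \
    ckanthony/openapi-mcp:latest \
    --spec https://petstore.swagger.io/v2/swagger.json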

Repository Owner

ckanthony (User)

Repository Details

Language: Go
Default Branch: main
Size: 524 KB
Contributors: 4
MCP Verified: Nov 11, 2025

Programming Languages

Go: 99.21%
Dockerfile: 0.79%

Related MCPs

Discover similar Model Context Protocol servers

  • MCP Link

    Convert Any OpenAPI V3 API to an MCP Server for seamless AI Agent integration.

    MCP Link enables automatic conversion of any OpenAPI v3-compliant RESTful API into a Model Context Protocol (MCP) server, allowing instant compatibility with AI-driven agent frameworks. It eliminates the need for manual interface creation and code modification by translating OpenAPI schemas into MCP endpoints. MCP Link supports robust feature mapping and authentication, making it easy to expose existing APIs to AI ecosystems using a standardized protocol. The tool is designed for both developers and organizations seeking to streamline API integration with AI agents.

    • 572
    • MCP
    • automation-ai-labs/mcp-link
  • MCP Swagger Server (mss)

    Seamlessly convert OpenAPI/Swagger specs into Model Context Protocol tools for AI integration.

    MCP Swagger Server converts OpenAPI/Swagger API specifications into Model Context Protocol (MCP) compatible tools, enabling REST APIs to become directly callable by AI systems. It supports zero-configuration conversion, multiple transport protocols (SSE, Streamable, Stdio), and secure API access through Bearer Token authentication. The tool offers an interactive command-line interface and configuration options to filter operations, customize transports, and manage API security. Its modular structure includes OpenAPI parsing, web UI, and backend services.

    • 38
    • MCP
    • zaizaizhao/mcp-swagger-server
  • MCP OpenAPI Schema Explorer

    Token-efficient OpenAPI exploration via MCP Resources.

    MCP OpenAPI Schema Explorer is an MCP-compatible server that provides token-efficient, read-only access to OpenAPI (v3.0) and Swagger (v2.0) specifications through MCP Resources. It enables MCP clients such as Claude Desktop and others to interactively explore large API schemas without loading the entire specification into an LLM’s context. The tool automatically converts Swagger specs to OpenAPI v3.0, supports both local files and remote URLs, and is optimized for integration with a broad range of MCP clients.

    • 57
    • MCP
    • kadykov/mcp-openapi-schema-explorer
  • Outsource MCP

    Unified MCP server for multi-provider AI text and image generation

    Outsource MCP is a Model Context Protocol server that bridges AI applications with multiple model providers via a single unified interface. It enables AI tools and clients to access over 20 major providers for both text and image generation, streamlining model selection and API integration. Built on FastMCP and Agno agent frameworks, it supports flexible configuration and is compatible with MCP-enabled AI tools. Authentication is provider-specific, and all interactions use a simple standardized API format.

    • 26
    • MCP
    • gwbischof/outsource-mcp
  • Rootly MCP Server

    Seamlessly integrate Rootly incident management into MCP-compatible editors.

    Rootly MCP Server provides an MCP-compliant server to access and manage Rootly's incident management API from within editors like Cursor, Windsurf, and Claude. It enables context-rich workflows and tool generation based on Rootly’s OpenAPI specification, allowing users to resolve incidents directly within their development environment. The server supports flexible authentication and dynamic resource generation while ensuring smart pagination to optimize editor context windows.

    • 36
    • MCP
    • Rootly-AI-Labs/Rootly-MCP-server
  • Higress

    AI Native API Gateway with Built-in Model Context Protocol (MCP) Support

    Higress is a cloud-native API gateway built on Istio and Envoy, extensible with Wasm plugins in Go, Rust, or JS. It enables unified management and hosting of both LLM APIs and MCP Servers, allowing AI agents to easily call tools and services via standard protocols. The platform supports seamless conversion of OpenAPI specs to remote MCP servers and provides robust AI gateway features for enterprise and mainstream model providers. Higress is widely adopted in production environments, notably within Alibaba Cloud's core AI applications.

    • 6,814
    • MCP
    • alibaba/higress