MCP OpenAPI Schema Explorer

Token-efficient OpenAPI exploration via MCP Resources.

MCP OpenAPI Schema Explorer is an MCP-compatible server that provides token-efficient, read-only access to OpenAPI (v3.0) and Swagger (v2.0) specifications through MCP Resources. It lets MCP clients such as Claude Desktop interactively explore large API schemas without loading the entire specification into an LLM's context. The tool automatically converts Swagger specs to OpenAPI v3.0, supports both local files and remote URLs, and integrates with a broad range of MCP clients.

Key Features

MCP server implementation for OpenAPI and Swagger specs
Token-efficient API schema exploration
Supports OpenAPI v3.0 and Swagger v2.0 (with automatic conversion)
Read-only access via MCP Resources
Integration with MCP clients (Claude Desktop, Cline, Windsurf, etc.)
Ability to load specifications from both local files and remote URLs
No installation required for npx or Docker usage
Global or local installation options via npm
Facilitates API browsing within LLM context windows
Optimized for large API specification handling

Use Cases

Exploring the structure of large OpenAPI or Swagger specifications
Enabling AI model clients to interactively browse API schemas
Reducing token usage when querying API documents with LLMs
Providing context-aware, read-only access to API data for LLMs
Integrating with Model Context Protocol clients to enhance developer tools
Automating API specification conversion from Swagger v2.0 to OpenAPI v3.0
Supporting API documentation review without full file import
Facilitating rapid discovery of API endpoints and components
Empowering context-aware code generation or analysis workflows
Streamlining API integration for chat-driven or LLM-assisted applications

README

MCP OpenAPI Schema Explorer


An MCP (Model Context Protocol) server that provides token-efficient access to OpenAPI (v3.0) and Swagger (v2.0) specifications via MCP Resources.

Project Goal

The primary goal of this project is to allow MCP clients (like Cline or Claude Desktop) to explore the structure and details of large OpenAPI specifications without needing to load the entire file into an LLM's context window. It achieves this by exposing parts of the specification through MCP Resources, which are well-suited for read-only data exploration.

This server supports loading specifications from both local file paths and remote HTTP/HTTPS URLs. Swagger v2.0 specifications are automatically converted to OpenAPI v3.0 upon loading.

Why MCP Resources?

The Model Context Protocol defines both Resources and Tools.

  • Resources: Represent data sources (like files, API responses). They are ideal for read-only access and exploration by MCP clients (e.g., browsing API paths in Claude Desktop).
  • Tools: Represent executable actions or functions, often used by LLMs to perform tasks or interact with external systems.

While other MCP servers exist that provide access to OpenAPI specs via Tools, this project specifically focuses on providing access via Resources. This makes it particularly useful for direct exploration within MCP client applications.

For more details on MCP clients and their capabilities, see the MCP Client Documentation.

Installation

For the recommended usage methods (npx and Docker, described below), no separate installation step is required. Your MCP client will download the package or pull the Docker image automatically based on the configuration you provide.

However, if you prefer or need to install the server explicitly, you have two options:

  1. Global Installation: You can install the package globally using npm:

    bash
    npm install -g mcp-openapi-schema-explorer
    

    See Method 3 below for how to configure your MCP client to use a globally installed server.

  2. Local Development/Installation: You can clone the repository and build it locally:

    bash
    git clone https://github.com/kadykov/mcp-openapi-schema-explorer.git
    cd mcp-openapi-schema-explorer
    npm install
    npm run build
    

    See Method 4 below for how to configure your MCP client to run the server from your local build using node.

Adding the Server to your MCP Client

This server is designed to be run by MCP clients (like Claude Desktop, Windsurf, Cline, etc.). To use it, you add a configuration entry to your client's settings file (often a JSON file). This entry tells the client how to execute the server process (e.g., using npx, docker, or node). The server itself doesn't require separate configuration beyond the command-line arguments specified in the client settings entry.

Below are the common methods for adding the server entry to your client's configuration.

Method 1: npx (Recommended)

Using npx is recommended as it avoids global/local installation and ensures the client uses the latest published version.

Example Client Configuration Entry (npx Method):

Add the following JSON object to the mcpServers section of your MCP client's configuration file. This entry instructs the client on how to run the server using npx:

json
{
  "mcpServers": {
    "My API Spec (npx)": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-openapi-schema-explorer@latest",
        "<path-or-url-to-spec>",
        "--output-format",
        "yaml"
      ],
      "env": {}
    }
  }
}

Configuration Notes:

  • Replace "My API Spec (npx)" with a unique name for this server instance in your client.
  • Replace <path-or-url-to-spec> with the absolute local file path or full remote URL of your specification.
  • The --output-format flag is optional (json, yaml, json-minified) and defaults to json.
  • To explore multiple specifications, add separate entries in mcpServers, each with a unique name and pointing to a different spec.
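For example, a configuration exploring two specifications side by side could look like the following sketch (the server names, the Petstore URL, and the local path are placeholders to adapt to your own specs):

```json
{
  "mcpServers": {
    "Petstore API": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-openapi-schema-explorer@latest",
        "https://petstore3.swagger.io/api/v3/openapi.json"
      ],
      "env": {}
    },
    "Internal API (YAML output)": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-openapi-schema-explorer@latest",
        "/absolute/path/to/internal-api.yaml",
        "--output-format",
        "yaml"
      ],
      "env": {}
    }
  }
}
```

Each entry runs an independent server process, so the two specs appear in the client as separately named resources.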

Method 2: Docker

You can instruct your MCP client to run the server using the official Docker image: kadykov/mcp-openapi-schema-explorer.

Example Client Configuration Entries (Docker Method):

Add one of the following JSON objects to the mcpServers section of your MCP client's configuration file. These entries instruct the client on how to run the server using docker run:

  • Using a Remote URL: Pass the URL directly to docker run.

    json
    {
      "mcpServers": {
        "My API Spec (Docker Remote)": {
          "command": "docker",
          "args": [
            "run",
            "--rm",
            "-i",
            "kadykov/mcp-openapi-schema-explorer:latest",
            "<remote-url-to-spec>"
          ],
          "env": {}
        }
      }
    }
    
  • Using a Local File: (Requires mounting the file into the container)

    json
    {
      "mcpServers": {
        "My API Spec (Docker Local)": {
          "command": "docker",
          "args": [
            "run",
            "--rm",
            "-i",
            "-v",
            "/full/host/path/to/spec.yaml:/spec/api.yaml",
            "kadykov/mcp-openapi-schema-explorer:latest",
            "/spec/api.yaml",
            "--output-format",
            "yaml"
          ],
          "env": {}
        }
      }
    }
    

    Important: Replace /full/host/path/to/spec.yaml with the correct absolute path on your host machine. The path /spec/api.yaml is the corresponding path inside the container.

Method 3: Global Installation (Less Common)

If you have installed the package globally using npm install -g, you can configure your client to run it directly.

bash
# Run this command once in your terminal
npm install -g mcp-openapi-schema-explorer

Example Client Configuration Entry (Global Install Method):

Add the following entry to your MCP client's configuration file. This assumes the mcp-openapi-schema-explorer command is accessible in the client's execution environment PATH.

json
{
  "mcpServers": {
    "My API Spec (Global)": {
      "command": "mcp-openapi-schema-explorer",
      "args": ["<path-or-url-to-spec>", "--output-format", "yaml"],
      "env": {}
    }
  }
}

Method 4: Local Development/Installation

This method is useful if you have cloned the repository locally for development or to run a modified version.

Setup Steps (Run once in your terminal):

  1. Clone the repository: git clone https://github.com/kadykov/mcp-openapi-schema-explorer.git
  2. Navigate into the directory: cd mcp-openapi-schema-explorer
  3. Install dependencies: npm install
  4. Build the project: npm run build (or just build, via the Just task runner)

Example Client Configuration Entry (Local Development Method):

Add the following entry to your MCP client's configuration file. This instructs the client to run the locally built server using node.

json
{
  "mcpServers": {
    "My API Spec (Local Dev)": {
      "command": "node",
      "args": [
        "/full/path/to/cloned/mcp-openapi-schema-explorer/dist/src/index.js",
        "<path-or-url-to-spec>",
        "--output-format",
        "yaml"
      ],
      "env": {}
    }
  }
}

Important: Replace /full/path/to/cloned/mcp-openapi-schema-explorer/dist/src/index.js with the correct absolute path to the built index.js file in your cloned repository.

Features

  • MCP Resource Access: Explore OpenAPI specs via intuitive URIs (openapi://info, openapi://paths/..., openapi://components/...).
  • OpenAPI v3.0 & Swagger v2.0 Support: Loads both formats, automatically converting v2.0 to v3.0.
  • Local & Remote Files: Load specs from local file paths or HTTP/HTTPS URLs.
  • Token-Efficient: Designed to minimize token usage for LLMs by providing structured access.
  • Multiple Output Formats: Get detailed views in JSON (default), YAML, or minified JSON (--output-format).
  • Dynamic Server Name: Server name in MCP clients reflects the info.title from the loaded spec.
  • Reference Transformation: Internal $refs (#/components/...) are transformed into clickable MCP URIs.

Available MCP Resources

This server exposes the following MCP resource templates for exploring the OpenAPI specification.

Understanding Multi-Value Parameters (*)

Some resource templates include parameters ending with an asterisk (*), like {method*} or {name*}. This indicates that the parameter accepts multiple comma-separated values. For example, to request details for both the GET and POST methods of a path, you would use a URI like openapi://paths/users/get,post. This allows fetching details for multiple items in a single request.
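As an illustration, the comma-joining described here (together with the URL-encoding required for the {path} parameter) can be sketched in a small client-side helper. Note that buildPathUri is hypothetical, not part of this server or the MCP SDK:

```javascript
// Hypothetical helper: build an openapi:// URI that fetches details for
// several operations of one API path in a single request.
function buildPathUri(apiPath, methods) {
  // Strip the leading slash, then URL-encode the rest so that "/", "{",
  // and "}" become %2F, %7B, and %7D respectively.
  const encoded = encodeURIComponent(apiPath.replace(/^\//, ''));
  // Multi-value parameters such as {method*} are joined with commas.
  return `openapi://paths/${encoded}/${methods.join(',')}`;
}

console.log(buildPathUri('/users/{id}', ['get', 'post']));
// → openapi://paths/users%2F%7Bid%7D/get,post
```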

Resource Templates:

  • openapi://{field}

    • Description: Accesses top-level fields of the OpenAPI document (e.g., info, servers, tags) or lists the contents of paths or components. The specific available fields depend on the loaded specification.
    • Example: openapi://info
    • Output: text/plain list for paths and components; configured format (JSON/YAML/minified JSON) for other fields.
    • Completions: Provides dynamic suggestions for {field} based on the actual top-level keys found in the loaded spec.
  • openapi://paths/{path}

    • Description: Lists the available HTTP methods (operations) for a specific API path.
    • Parameter: {path} - The API path string. Must be URL-encoded (e.g., /users/{id} becomes users%2F%7Bid%7D).
    • Example: openapi://paths/users%2F%7Bid%7D
    • Output: text/plain list of methods.
    • Completions: Provides dynamic suggestions for {path} based on the paths found in the loaded spec (URL-encoded).
  • openapi://paths/{path}/{method*}

    • Description: Gets the detailed specification for one or more operations (HTTP methods) on a specific API path.
    • Parameters:
      • {path} - The API path string. Must be URL-encoded.
      • {method*} - One or more HTTP methods (e.g., get, post, get,post). Case-insensitive.
    • Example (Single): openapi://paths/users%2F%7Bid%7D/get
    • Example (Multiple): openapi://paths/users%2F%7Bid%7D/get,post
    • Output: Configured format (JSON/YAML/minified JSON).
    • Completions: Provides dynamic suggestions for {path}. Provides static suggestions for {method*} (common HTTP verbs like GET, POST, PUT, DELETE, etc.).
  • openapi://components/{type}

    • Description: Lists the names of all defined components of a specific type (e.g., schemas, responses, parameters). The specific available types depend on the loaded specification. Also provides a short description for each listed type.
    • Example: openapi://components/schemas
    • Output: text/plain list of component names with descriptions.
    • Completions: Provides dynamic suggestions for {type} based on the component types found in the loaded spec.
  • openapi://components/{type}/{name*}

    • Description: Gets the detailed specification for one or more named components of a specific type.
    • Parameters:
      • {type} - The component type.
      • {name*} - One or more component names (e.g., User, Order, User,Order). Case-sensitive.
    • Example (Single): openapi://components/schemas/User
    • Example (Multiple): openapi://components/schemas/User,Order
    • Output: Configured format (JSON/YAML/minified JSON).
    • Completions: Provides dynamic suggestions for {type}. Provides dynamic suggestions for {name*} only if the loaded spec contains exactly one component type overall (e.g., only schemas). This limitation exists because the MCP SDK currently doesn't support providing completions scoped to the selected {type}; providing all names across all types could be misleading.
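On the receiving side, a multi-value parameter such as {method*} boils down to splitting on commas and normalizing case. A simplified sketch of that idea (hypothetical code, not this server's actual implementation):

```javascript
// Hypothetical sketch: normalize a comma-separated {method*} value into a
// deduplicated, lowercase list, rejecting unknown HTTP verbs.
const KNOWN_METHODS = ['get', 'put', 'post', 'delete', 'options', 'head', 'patch', 'trace'];

function parseMethods(raw) {
  // Methods are case-insensitive, so compare in lowercase; a Set drops duplicates.
  const methods = [...new Set(raw.split(',').map(m => m.trim().toLowerCase()))];
  const unknown = methods.filter(m => !KNOWN_METHODS.includes(m));
  if (unknown.length > 0) {
    throw new Error(`Invalid method(s): ${unknown.join(', ')}`);
  }
  return methods;
}

console.log(parseMethods('GET,post')); // → [ 'get', 'post' ]
```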

Contributing

Contributions are welcome! Please see the CONTRIBUTING.md file for guidelines on setting up the development environment, running tests, and submitting changes.

Releases

This project uses semantic-release for automated version management and package publishing based on Conventional Commits.

Future Plans

(Future plans to be determined)


Repository Owner

kadykov (User)

Repository Details

Language TypeScript
Default Branch main
Size 1,926 KB
Contributors 5
License MIT License
MCP Verified Nov 12, 2025

Programming Languages

TypeScript: 92.81%
JavaScript: 6.21%
Just: 0.54%
Dockerfile: 0.38%
Shell: 0.06%

Topics

api-specification developer-tools development mcp mcp-server model-context-protocol node node-js nodejs openapi swagger typescript


Related MCPs

Discover similar Model Context Protocol servers

  • @reapi/mcp-openapi

    Serve multiple OpenAPI specs for LLM-powered IDE integrations via the Model Context Protocol.

    @reapi/mcp-openapi is a Model Context Protocol (MCP) server that loads and serves multiple OpenAPI specifications, making APIs available to LLM-powered IDEs and development tools. It enables Large Language Models to access, interpret, and work directly with OpenAPI docs within code editors such as Cursor. The server supports dereferenced schemas, maintains an API catalog, and offers project-specific or global configuration. Sponsored by ReAPI, it bridges the gap between API specifications and AI-powered developer environments.

    • 71
    • MCP
    • ReAPI-com/mcp-openapi
  • MCP Swagger Server (mss)

    Seamlessly convert OpenAPI/Swagger specs into Model Context Protocol tools for AI integration.

    MCP Swagger Server converts OpenAPI/Swagger API specifications into Model Context Protocol (MCP) compatible tools, enabling REST APIs to become directly callable by AI systems. It supports zero-configuration conversion, multiple transport protocols (SSE, Streamable, Stdio), and secure API access through Bearer Token authentication. The tool offers an interactive command-line interface and configuration options to filter operations, customize transports, and manage API security. Its modular structure includes OpenAPI parsing, web UI, and backend services.

    • 38
    • MCP
    • zaizaizhao/mcp-swagger-server
  • MCP-Typescribe

    An MCP server for serving TypeScript API context to language models.

    MCP-Typescribe is an open-source implementation of the Model Context Protocol (MCP) focused on providing LLMs with contextual, real-time access to TypeScript API documentation. It parses TypeScript (and other) definitions using TypeDoc-generated JSON and serves this information via a queryable server that supports tools used by AI coding assistants. The solution enables AI agents to dynamically explore, search, and understand unknown APIs, accelerating onboarding and supporting agentic behaviors in code generation.

    • 45
    • MCP
    • yWorks/mcp-typescribe
  • OpsLevel MCP Server

    Read-only MCP server for integrating OpsLevel data with AI tools.

    OpsLevel MCP Server implements the Model Context Protocol to provide AI tools with a secure way to access and interact with OpsLevel account data. It supports read-only operations for a wide range of OpsLevel resources such as actions, campaigns, checks, components, documentation, domains, and more. The tool is compatible with popular environments including Claude Desktop and VS Code, enabling easy integration via configuration and API tokens. Installation options include Homebrew, Docker, and standalone binaries.

    • 8
    • MCP
    • OpsLevel/opslevel-mcp
  • godoc-mcp

    Token-efficient Go documentation server for LLMs using Model Context Protocol.

    godoc-mcp is a Model Context Protocol (MCP) server that provides efficient, structured access to Go package documentation for large language models. It enables LLMs to understand Go projects without reading entire source files by supplying essential documentation and source code at varying levels of granularity. The tool supports project navigation, automatic module setup, caching, and works offline for both standard and third-party Go packages.

    • 88
    • MCP
    • mrjoshuak/godoc-mcp
  • Cross-LLM MCP Server

    Unified MCP server for accessing and combining multiple LLM APIs.

    Cross-LLM MCP Server is a Model Context Protocol (MCP) server enabling seamless access to a range of Large Language Model APIs including ChatGPT, Claude, DeepSeek, Gemini, Grok, Kimi, Perplexity, and Mistral. It provides a unified interface for invoking different LLMs from any MCP-compatible client, allowing users to call and aggregate responses across providers. The server implements eight specialized tools for interacting with these LLMs, each offering configurable options like model selection, temperature, and token limits. Output includes model context details as well as token usage statistics for each response.

    • 9
    • MCP
    • JamesANZ/cross-llm-mcp