@reapi/mcp-openapi

Serve multiple OpenAPI specs for LLM-powered IDE integrations via the Model Context Protocol.

71 Stars · 13 Forks · 71 Watchers · 7 Issues
@reapi/mcp-openapi is a Model Context Protocol (MCP) server that loads and serves multiple OpenAPI specifications, making APIs available to LLM-powered IDEs and development tools. It enables Large Language Models to access, interpret, and work directly with OpenAPI docs within code editors such as Cursor. The server supports dereferenced schemas, maintains an API catalog, and offers project-specific or global configuration. Sponsored by ReAPI, it bridges the gap between API specifications and AI-powered developer environments.

Key Features

Loads multiple OpenAPI specifications from a directory
Exposes API operations and schemas via MCP protocol
Supports dereferenced schemas for complete API context
Maintains a catalog of available APIs
Project-specific and global configuration options
Compatible with Cursor and other LLM-powered code editors
Automatic discovery and processing of JSON/YAML OpenAPI files
Facilitates context management to prevent exceeding LLM limits
Easy refresh of the API catalog through IDE prompts
Integration-ready with various team development workflows

Use Cases

Integrating OpenAPI APIs into LLM-powered IDEs for intelligent code assistance
Providing LLMs with up-to-date context from multiple API specifications in a project
Supporting rapid onboarding for developers using AI tools with project-specific API catalogs
Enabling team collaboration by sharing unified API contexts within code editors
Automating schema loading and refreshing for frequent API changes
Testing and debugging OpenAPI specifications within an AI-enhanced environment
Simplifying API understanding and documentation access for LLMs in the development process
Maintaining separate API contexts for different projects to manage LLM context window limits
Enhancing productivity for developers by connecting APIs to code completion and chat tools
Facilitating advanced API search and query features through unified context provided to LLMs

README

@reapi/mcp-openapi

A Model Context Protocol (MCP) server that loads and serves multiple OpenAPI specifications to enable LLM-powered IDE integrations. This server acts as a bridge between your OpenAPI specifications and LLM-powered development tools like Cursor and other code editors.

Features

  • Loads multiple OpenAPI specifications from a directory
  • Exposes API operations and schemas through MCP protocol
  • Enables LLMs to understand and work with your APIs directly in your IDE
  • Supports dereferenced schemas for complete API context
  • Maintains a catalog of all available APIs

Powered by ReAPI

This open-source MCP server is sponsored by ReAPI, a next-generation API platform that simplifies API design and testing. While this server provides local OpenAPI integration for development, ReAPI offers two powerful modules:

🎨 API CMS

  • Design APIs using an intuitive no-code editor
  • Generate and publish OpenAPI specifications automatically
  • Collaborate with team members in real-time
  • Version control and change management

🧪 API Testing

  • The most developer-friendly no-code API testing solution
  • Create and manage test cases with an intuitive interface
  • Powerful assertion and validation capabilities
  • Serverless cloud test executor
  • Perfect for both QA teams and developers
  • CI/CD integration ready

Try ReAPI for free at reapi.com and experience the future of API development.

Cursor Configuration

To integrate the MCP OpenAPI server with Cursor IDE, you have two options for configuration locations:

Option 1: Project-specific Configuration (Recommended)

Create a .cursor/mcp.json file in your project directory. This option is recommended because it lets you maintain different sets of specs for different projects.

```json
{
  "mcpServers": {
    "@reapi/mcp-openapi": {
      "command": "npx",
      "args": ["-y", "@reapi/mcp-openapi@latest", "--dir", "./specs"],
      "env": {}
    }
  }
}
```

Tip: Using a relative path like ./specs makes the configuration portable and easier to share across team members.

Note: We recommend using the @latest tag, as the server is frequently updated with new features and improvements.

Important: Project-specific configuration helps manage LLM context limits. When all specifications are placed in a single folder, the combined metadata could exceed the LLM's context window, leading to errors. Organizing specs by project keeps the context size manageable.

Option 2: Global Configuration

Create or edit ~/.cursor/mcp.json in your home directory to make the server available across all projects:

```json
{
  "mcpServers": {
    "@reapi/mcp-openapi": {
      "command": "npx",
      "args": ["-y", "@reapi/mcp-openapi@latest", "--dir", "/path/to/your/specs"],
      "env": {}
    }
  }
}
```

Enable in Cursor Settings

After adding the configuration:

  1. Open Cursor IDE
  2. Go to Settings > Cursor Settings > MCP
  3. Enable the @reapi/mcp-openapi server
  4. Click the refresh icon next to the server to apply changes

Note: By default, Cursor requires confirmation for each MCP tool execution. If you want to allow automatic execution without confirmation, you can enable Yolo mode in Cursor settings.

The server is now ready to use. When you add new OpenAPI specifications to your directory, you can refresh the catalog by:

  1. Opening Cursor's chat panel
  2. Typing one of these prompts:
    "Please refresh the API catalog"
    "Reload the OpenAPI specifications"
    

OpenAPI Specification Requirements

  1. Place your OpenAPI 3.x specifications in the target directory:

    • Supports both JSON and YAML formats
    • Files should have .json, .yaml, or .yml extensions
    • The scanner will automatically discover and process all specification files
  2. Specification ID Configuration:

    • By default, the filename (without extension) is used as the specification ID
    • To specify a custom ID, add x-spec-id in the OpenAPI info object:
```yaml
openapi: 3.0.0
info:
  title: My API
  version: 1.0.0
  x-spec-id: my-custom-api-id  # Custom specification ID
```
    

    Important: Setting a custom x-spec-id is crucial when working with multiple specifications that have:

    • Similar or identical endpoint paths
    • Same schema names
    • Overlapping operation IDs

    The spec ID helps distinguish between these similar resources and prevents naming conflicts. For example:

```yaml
# user-service.yaml
info:
  x-spec-id: user-service
paths:
  /users:
    get: ...
```

```yaml
# admin-service.yaml
info:
  x-spec-id: admin-service
paths:
  /users:
    get: ...
```

    Now you can reference these endpoints unambiguously as user-service/users and admin-service/users.
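As an illustrative sketch (not the server's actual code), the ID-resolution rule described above — prefer `x-spec-id` when present, otherwise fall back to the filename without its extension — could look like this. The helper name `resolveSpecId` is hypothetical:

```typescript
// Hypothetical helper illustrating the spec-ID rule described above:
// use info["x-spec-id"] when present, otherwise the filename minus its
// .json/.yaml/.yml extension.
interface OpenApiInfo {
  title?: string;
  version?: string;
  "x-spec-id"?: string;
}

function resolveSpecId(filename: string, info: OpenApiInfo): string {
  if (info["x-spec-id"]) return info["x-spec-id"];
  // Strip any directory prefix, then the spec file extension.
  const base = filename.split("/").pop() ?? filename;
  return base.replace(/\.(json|ya?ml)$/i, "");
}

console.log(resolveSpecId("specs/user-service.yaml", { "x-spec-id": "user-service" })); // "user-service"
console.log(resolveSpecId("specs/petstore.json", { title: "Petstore", version: "1.0.0" })); // "petstore"
```

With this rule, two specs that both expose `/users` stay distinguishable as long as their filenames or `x-spec-id` values differ.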

How It Works

  1. The server scans the specified directory for OpenAPI specification files
  2. It processes and dereferences the specifications for complete context
  3. Creates and maintains a catalog of all API operations and schemas
  4. Exposes this information through the MCP protocol
  5. IDE integrations can then use this information to:
    • Provide API context to LLMs
    • Enable intelligent code completion
    • Assist in API integration
    • Generate API-aware code snippets
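To make step 2 concrete, "dereferencing" means inlining local `$ref` pointers (e.g. `#/components/schemas/Pet`) so each operation carries its full schema context. A minimal sketch of the idea — an assumption for illustration, not the server's implementation, and with no handling of circular references:

```typescript
// Minimal $ref-inlining sketch: walk a spec and replace local JSON pointers
// ("#/...") with the object they point at. Illustration only; real specs
// can contain cycles, which this naive version does not handle.
type Json = any;

function deref(node: Json, root: Json): Json {
  if (Array.isArray(node)) return node.map((n) => deref(n, root));
  if (node && typeof node === "object") {
    if (typeof node.$ref === "string" && node.$ref.startsWith("#/")) {
      // Follow the pointer segment by segment from the document root.
      const target = node.$ref
        .slice(2)
        .split("/")
        .reduce((acc: Json, key: string) => acc?.[key], root);
      return deref(target, root);
    }
    return Object.fromEntries(
      Object.entries(node).map(([k, v]) => [k, deref(v, root)])
    );
  }
  return node;
}

const doc = {
  components: { schemas: { Pet: { type: "object", properties: { name: { type: "string" } } } } },
  paths: { "/pets": { post: { requestBody: { schema: { $ref: "#/components/schemas/Pet" } } } } },
};
const inlined = deref(doc, doc);
console.log(inlined.paths["/pets"].post.requestBody.schema.type); // "object"
```

After this step, an LLM asking about the create-pet operation receives the full `Pet` schema inline rather than an opaque reference.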

Tools

  1. refresh-api-catalog

    • Refresh the API catalog
    • Returns: Success message when catalog is refreshed
  2. get-api-catalog

    • Get the API catalog; the catalog contains metadata about all OpenAPI specifications, their operations, and schemas
    • Returns: Complete API catalog with all specifications, operations, and schemas
  3. search-api-operations

    • Search for operations across specifications
    • Inputs:
      • query (string): Search query
      • specId (optional string): Specific API specification ID to search within
    • Returns: Matching operations from the API catalog
  4. search-api-schemas

    • Search for schemas across specifications
    • Inputs:
      • query (string): Search query
      • specId (optional string): Specific API specification ID to search
    • Returns: Matching schemas from the API catalog
  5. load-api-operation-by-operationId

    • Load an operation by operationId
    • Inputs:
      • specId (string): API specification ID
      • operationId (string): Operation ID to load
    • Returns: Complete operation details
  6. load-api-operation-by-path-and-method

    • Load an operation by path and method
    • Inputs:
      • specId (string): API specification ID
      • path (string): API endpoint path
      • method (string): HTTP method
    • Returns: Complete operation details
  7. load-api-schema-by-schemaName

    • Load a schema by schemaName
    • Inputs:
      • specId (string): API specification ID
      • schemaName (string): Name of the schema to load
    • Returns: Complete schema details
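Under the hood, these tools are invoked over MCP's JSON-RPC transport. As a rough sketch (the envelope is produced by the MCP client for you, and the argument values here are hypothetical), a `search-api-operations` call looks approximately like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search-api-operations",
    "arguments": {
      "query": "create pet",
      "specId": "petstore"
    }
  }
}
```

In practice Cursor issues these requests automatically; you interact with the tools only through chat prompts.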

Roadmap

  1. Semantic Search

    • Enable natural language queries for API operations and schemas
    • Improve search accuracy with semantic understanding
  2. Remote Specs Sync

    • Support syncing OpenAPI specifications from remote sources
  3. Code Templates

    • Expose code templates through MCP protocol
    • Provide reference patterns for LLM code generation
  4. Community Contributions

    • Submit feature requests and bug reports
    • Contribute to improve the server

Example Prompts in Cursor

Here are some example prompts you can use in Cursor IDE to interact with your APIs:

  1. Explore Available APIs

    "Show me all available APIs in the catalog with their operations"
    "List all API specifications and their endpoints"
    
  2. API Operation Details

    "Show me the details of the create pet API endpoint"
    "What are the required parameters for creating a new pet?"
    "Explain the response schema for the pet creation endpoint"
    
  3. Schema and Mock Data

    "Generate mock data for the Pet schema"
    "Create a valid request payload for the create pet endpoint"
    "Show me examples of valid pet objects based on the schema"
    
  4. Code Generation

    "Generate an Axios client for the create pet API"
    "Create a TypeScript interface for the Pet schema"
    "Write a React hook that calls the create pet endpoint"
    
  5. API Integration Assistance

    "Help me implement error handling for the pet API endpoints"
    "Generate unit tests for the pet API client"
    "Create a service class that encapsulates all pet-related API calls"
    
  6. Documentation and Usage

    "Show me example usage of the pet API with curl"
    "Generate JSDoc comments for the pet API client methods"
    "Create a README section explaining the pet API integration"
    
  7. Validation and Types

    "Generate Zod validation schema for the Pet model"
    "Create TypeScript types for all pet-related API responses"
    "Help me implement request payload validation for the pet endpoints"
    
  8. API Search and Discovery

    "Find all endpoints related to pet management"
    "Show me all APIs that accept file uploads"
    "List all endpoints that return paginated responses"
    

These prompts demonstrate how to leverage the MCP server's capabilities for API development. Feel free to adapt them to your specific needs or combine them for more complex tasks.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.


Repository Owner

ReAPI-com (Organization)

Repository Details

Language: TypeScript
Default Branch: main
Size: 201 KB
Contributors: 1
License: MIT License
MCP Verified: Nov 12, 2025

Programming Languages

TypeScript 97.82%
JavaScript 2.18%

Topics

cline, cursor, mcp-server, openapi, swagger


Related MCPs

Discover similar Model Context Protocol servers

  • MCP-Typescribe

    An MCP server for serving TypeScript API context to language models.

    MCP-Typescribe is an open-source implementation of the Model Context Protocol (MCP) focused on providing LLMs with contextual, real-time access to TypeScript API documentation. It parses TypeScript (and other) definitions using TypeDoc-generated JSON and serves this information via a queryable server that supports tools used by AI coding assistants. The solution enables AI agents to dynamically explore, search, and understand unknown APIs, accelerating onboarding and supporting agentic behaviors in code generation.

    • 45
    • MCP
    • yWorks/mcp-typescribe
  • Taskade MCP

    Tools and server for Model Context Protocol workflows and agent integration

    Taskade MCP provides an official server and tools to implement and interact with the Model Context Protocol (MCP), enabling seamless connectivity between Taskade’s API and MCP-compatible clients such as Claude or Cursor. It includes utilities for generating MCP tools from any OpenAPI schema and supports the deployment of autonomous agents, workflow automation, and real-time collaboration. The platform promotes extensibility by supporting integration via API, OpenAPI, and MCP, making it easier to build and connect agentic systems.

    • 90
    • MCP
    • taskade/mcp
  • Cross-LLM MCP Server

    Unified MCP server for accessing and combining multiple LLM APIs.

    Cross-LLM MCP Server is a Model Context Protocol (MCP) server enabling seamless access to a range of Large Language Model APIs including ChatGPT, Claude, DeepSeek, Gemini, Grok, Kimi, Perplexity, and Mistral. It provides a unified interface for invoking different LLMs from any MCP-compatible client, allowing users to call and aggregate responses across providers. The server implements eight specialized tools for interacting with these LLMs, each offering configurable options like model selection, temperature, and token limits. Output includes model context details as well as token usage statistics for each response.

    • 9
    • MCP
    • JamesANZ/cross-llm-mcp
  • @dealx/mcp-server

    MCP server enabling LLMs to search and interact with the DealX platform.

    Implements the Model Context Protocol, providing a standardized interface for large language models to interact with the DealX platform. Supports searching for ads through structured prompts and is designed for easy integration with tools like Claude and VS Code extensions. Flexible configuration options are available for environment variables, logging, and deployment. Extensible architecture supports future feature additions beyond ad search.

    • 0
    • MCP
    • DealExpress/mcp-server
  • OpenStreetMap MCP Server

    Enhancing LLMs with geospatial and location-based capabilities via the Model Context Protocol.

    OpenStreetMap MCP Server enables large language models to interact with rich geospatial data and location-based services through a standardized protocol. It provides APIs and tools for address geocoding, reverse geocoding, points of interest search, route directions, and neighborhood analysis. The server exposes location-related resources and tools, making it compatible with MCP hosts for seamless LLM integration.

    • 134
    • MCP
    • jagan-shanmugam/open-streetmap-mcp
  • LlamaCloud MCP Server

    Connect multiple LlamaCloud indexes as tools for your MCP client.

    LlamaCloud MCP Server is a TypeScript-based implementation of a Model Context Protocol server that allows users to connect multiple managed indexes from LlamaCloud as separate tools in MCP-compatible clients. Each tool is defined via command-line parameters, enabling flexible and dynamic access to different document indexes. The server automatically generates tool interfaces, each capable of querying its respective LlamaCloud index, with customizable parameters such as index name, description, and result limits. Designed for seamless integration, it works with clients like Claude Desktop, Windsurf, and Cursor.

    • 82
    • MCP
    • run-llama/mcp-server-llamacloud