@reapi/mcp-openapi
A Model Context Protocol (MCP) server that loads and serves multiple OpenAPI specifications to enable LLM-powered IDE integrations. This server acts as a bridge between your OpenAPI specifications and LLM-powered development tools like Cursor and other code editors.
Features
- Loads multiple OpenAPI specifications from a directory
- Exposes API operations and schemas through MCP protocol
- Enables LLMs to understand and work with your APIs directly in your IDE
- Supports dereferenced schemas for complete API context
- Maintains a catalog of all available APIs
Powered by ReAPI
This open-source MCP server is sponsored by ReAPI, a next-generation API platform that simplifies API design and testing. While this server provides local OpenAPI integration for development, ReAPI offers two powerful modules:
🎨 API CMS
- Design APIs using an intuitive no-code editor
- Generate and publish OpenAPI specifications automatically
- Collaborate with team members in real-time
- Version control and change management
🧪 API Testing
- The most developer-friendly no-code API testing solution
- Create and manage test cases with an intuitive interface
- Powerful assertion and validation capabilities
- Serverless cloud test executor
- Perfect for both QA teams and developers
- CI/CD integration ready
Try ReAPI for free at reapi.com and experience the future of API development.
Cursor Configuration
To integrate the MCP OpenAPI server with Cursor IDE, you have two options for configuration locations:
Option 1: Project-specific Configuration (Recommended)
Create a `.cursor/mcp.json` file in your project directory. This option is recommended as it allows you to maintain different sets of specs for different projects:

```json
{
  "mcpServers": {
    "@reapi/mcp-openapi": {
      "command": "npx",
      "args": ["-y", "@reapi/mcp-openapi@latest", "--dir", "./specs"],
      "env": {}
    }
  }
}
```
Tip: Using a relative path like `./specs` makes the configuration portable and easier to share across team members.

Note: We recommend using the `@latest` tag, as we frequently update the server with new features and improvements.

Important: Project-specific configuration helps manage LLM context limits. When all specifications are placed in a single folder, the combined metadata could exceed the LLM's context window, leading to errors. Organizing specs by project keeps the context size manageable.
Option 2: Global Configuration
Create or edit `~/.cursor/mcp.json` in your home directory to make the server available across all projects:

```json
{
  "mcpServers": {
    "@reapi/mcp-openapi": {
      "command": "npx",
      "args": ["-y", "@reapi/mcp-openapi@latest", "--dir", "/path/to/your/specs"],
      "env": {}
    }
  }
}
```
Enable in Cursor Settings
After adding the configuration:
- Open Cursor IDE
- Go to Settings > Cursor Settings > MCP
- Enable the @reapi/mcp-openapi server
- Click the refresh icon next to the server to apply changes
Note: By default, Cursor requires confirmation for each MCP tool execution. If you want to allow automatic execution without confirmation, you can enable Yolo mode in Cursor settings.
The server is now ready to use. When you add new OpenAPI specifications to your directory, you can refresh the catalog by:
- Opening Cursor's chat panel
- Typing one of these prompts:
  - "Please refresh the API catalog"
  - "Reload the OpenAPI specifications"
OpenAPI Specification Requirements
- Place your OpenAPI 3.x specifications in the target directory:
  - Supports both JSON and YAML formats
  - Files should have `.json`, `.yaml`, or `.yml` extensions
  - The scanner will automatically discover and process all specification files
- Specification ID Configuration:
  - By default, the filename (without extension) is used as the specification ID
  - To specify a custom ID, add `x-spec-id` in the OpenAPI info object:
```yaml
openapi: 3.0.0
info:
  title: My API
  version: 1.0.0
  x-spec-id: my-custom-api-id # Custom specification ID
```

Important: Setting a custom `x-spec-id` is crucial when working with multiple specifications that have:

- Similar or identical endpoint paths
- Same schema names
- Overlapping operation IDs
The spec ID helps distinguish between these similar resources and prevents naming conflicts. For example:
```yaml
# user-service.yaml
info:
  x-spec-id: user-service
paths:
  /users:
    get: ...

# admin-service.yaml
info:
  x-spec-id: admin-service
paths:
  /users:
    get: ...
```

Now you can reference these endpoints specifically as `user-service/users` and `admin-service/users`.
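For illustration, a specs directory holding the two services above might look like this (the filenames are examples; without an `x-spec-id`, each filename minus its extension would be used as the spec ID):

```
specs/
├── user-service.yaml
└── admin-service.yaml
```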
How It Works
- The server scans the specified directory for OpenAPI specification files
- It processes and dereferences the specifications for complete context
- Creates and maintains a catalog of all API operations and schemas
- Exposes this information through the MCP protocol
- IDE integrations can then use this information to:
- Provide API context to LLMs
- Enable intelligent code completion
- Assist in API integration
- Generate API-aware code snippets
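If you want to verify the server outside of Cursor, the launch command the configurations above register can be run by hand (a minimal sketch, assuming your specs live in `./specs`). As an MCP server it communicates over stdio, so it will simply start and wait for a client to connect:

```bash
npx -y @reapi/mcp-openapi@latest --dir ./specs
```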
Tools
- `refresh-api-catalog` - Refresh the API catalog
  - Returns: Success message when catalog is refreshed
- `get-api-catalog` - Get the API catalog; the catalog contains metadata about all OpenAPI specifications, their operations and schemas
  - Returns: Complete API catalog with all specifications, operations, and schemas
- `search-api-operations` - Search for operations across specifications
  - Inputs:
    - `query` (string): Search query
    - `specId` (optional string): Specific API specification ID to search within
  - Returns: Matching operations from the API catalog
- `search-api-schemas` - Search for schemas across specifications
  - Inputs:
    - `query` (string): Search query
    - `specId` (optional string): Specific API specification ID to search
  - Returns: Matching schemas from the API catalog
- `load-api-operation-by-operationId` - Load an operation by operationId
  - Inputs:
    - `specId` (string): API specification ID
    - `operationId` (string): Operation ID to load
  - Returns: Complete operation details
- `load-api-operation-by-path-and-method` - Load an operation by path and method
  - Inputs:
    - `specId` (string): API specification ID
    - `path` (string): API endpoint path
    - `method` (string): HTTP method
  - Returns: Complete operation details
- `load-api-schema-by-schemaName` - Load a schema by schemaName
  - Inputs:
    - `specId` (string): API specification ID
    - `schemaName` (string): Name of the schema to load
  - Returns: Complete schema details
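These tools are what the IDE calls on the LLM's behalf over MCP's JSON-RPC transport. As a rough sketch of the wire format (the `petstore` spec ID is a hypothetical example), a `tools/call` request for `search-api-operations` might look like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search-api-operations",
    "arguments": {
      "query": "create pet",
      "specId": "petstore"
    }
  }
}
```

You normally never write this by hand; Cursor issues these calls automatically when a prompt needs API context.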
Roadmap
- Semantic Search
  - Enable natural language queries for API operations and schemas
  - Improve search accuracy with semantic understanding
- Remote Specs Sync
  - Support syncing OpenAPI specifications from remote sources
- Code Templates
  - Expose code templates through MCP protocol
  - Provide reference patterns for LLM code generation
- Community Contributions
  - Submit feature requests and bug reports
  - Contribute to improve the server
Example Prompts in Cursor
Here are some example prompts you can use in Cursor IDE to interact with your APIs:
- Explore Available APIs
  - "Show me all available APIs in the catalog with their operations"
  - "List all API specifications and their endpoints"
- API Operation Details
  - "Show me the details of the create pet API endpoint"
  - "What are the required parameters for creating a new pet?"
  - "Explain the response schema for the pet creation endpoint"
- Schema and Mock Data
  - "Generate mock data for the Pet schema"
  - "Create a valid request payload for the create pet endpoint"
  - "Show me examples of valid pet objects based on the schema"
- Code Generation
  - "Generate an Axios client for the create pet API"
  - "Create a TypeScript interface for the Pet schema"
  - "Write a React hook that calls the create pet endpoint"
- API Integration Assistance
  - "Help me implement error handling for the pet API endpoints"
  - "Generate unit tests for the pet API client"
  - "Create a service class that encapsulates all pet-related API calls"
- Documentation and Usage
  - "Show me example usage of the pet API with curl"
  - "Generate JSDoc comments for the pet API client methods"
  - "Create a README section explaining the pet API integration"
- Validation and Types
  - "Generate Zod validation schema for the Pet model"
  - "Create TypeScript types for all pet-related API responses"
  - "Help me implement request payload validation for the pet endpoints"
- API Search and Discovery
  - "Find all endpoints related to pet management"
  - "Show me all APIs that accept file uploads"
  - "List all endpoints that return paginated responses"
These prompts demonstrate how to leverage the MCP server's capabilities for API development. Feel free to adapt them to your specific needs or combine them for more complex tasks.
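As a concrete illustration of the Code Generation prompts, "Generate an Axios client for the create pet API" might yield something along these lines. This is only a sketch: the `Pet` fields, endpoint path, and base URL are assumptions about a typical petstore-style spec, not part of this server:

```typescript
import axios from "axios";

// Assumed Pet shape, derived from a typical petstore-style schema.
export interface Pet {
  id?: number;
  name: string;
  status?: "available" | "pending" | "sold";
}

// Hypothetical create-pet call; substitute your API's actual base URL.
export async function createPet(pet: Pet): Promise<Pet> {
  const response = await axios.post<Pet>("https://api.example.com/pets", pet);
  return response.data;
}
```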
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.