Context7 MCP

Up-to-date code docs for every AI prompt.

36,881 Stars · 1,825 Forks · 107 Issues

Context7 MCP delivers current, version-specific documentation and code examples directly into large language model prompts. By integrating with model workflows, it ensures responses are accurate and based on the latest source material, reducing outdated and hallucinated code. Users can fetch relevant API documentation and examples by simply adding a directive to their prompts. This allows for more reliable, context-rich answers tailored to real-world programming scenarios.

Key Features

Fetches real-time, version-specific library documentation
Injects code examples directly into LLM prompts
Reduces hallucinated or outdated API usage
Supports natural prompt directives for context retrieval
Multi-language documentation support
Integrates with IDEs (e.g., VS Code, Cursor)
Automatic extraction of relevant code snippets
Seamless user experience with no manual tab switching
Supports adding new project documentation
Works with various programming languages and frameworks

Use Cases

Generating accurate, up-to-date code snippets in AI coding assistants
Retrieving latest API methods and documentation for specific library versions
Reducing errors from outdated or incorrect code generation
Providing immediate developer guidance within prompts
Boosting productivity in code review and prototyping
Automating answer generation for developer Q&A platforms
Enriching AI model responses with live context from technical docs
Supporting onboarding for new technologies or frameworks
Streamlining workflow for cloud function and middleware development
Delivering multilingual programming documentation to LLMs

README


Context7 MCP - Up-to-date Code Docs For Any Prompt


繁體中文 简体中文 日本語 한국어 문서 Documentación en Español Documentation en Français Documentação em Português (Brasil) Documentazione in italiano Dokumentasi Bahasa Indonesia Dokumentation auf Deutsch Документация на русском языке Українська документація Türkçe Doküman Arabic Documentation Tiếng Việt

❌ Without Context7

LLMs rely on outdated or generic information about the libraries you use. You get:

  • ❌ Code examples are outdated and based on year-old training data
  • ❌ Hallucinated APIs that don't even exist
  • ❌ Generic answers for old package versions

✅ With Context7

Context7 MCP pulls up-to-date, version-specific documentation and code examples straight from the source — and places them directly into your prompt.

Add use context7 to your prompt in Cursor:

txt
Create a Next.js middleware that checks for a valid JWT in cookies
and redirects unauthenticated users to `/login`. use context7
txt
Configure a Cloudflare Worker script to cache
JSON API responses for five minutes. use context7

Context7 fetches up-to-date code examples and documentation right into your LLM's context.

  • 1️⃣ Write your prompt naturally
  • 2️⃣ Tell the LLM to use context7
  • 3️⃣ Get working code answers

No tab-switching, no hallucinated APIs that don't exist, no outdated code generation.

[!NOTE] This repository hosts the source code of Context7 MCP server. The supporting components — API backend, parsing engine, and crawling engine — are private and not part of this release.

📚 Adding Projects

Check out our project addition guide to learn how to add (or update) your favorite libraries to Context7.

🛠️ Installation

Requirements

  • Node.js >= v18.0.0
  • Cursor, Claude Code, VSCode, Windsurf or another MCP Client
  • Context7 API Key (Optional) for higher rate limits and private repositories (Get yours by creating an account at context7.com/dashboard)

To install Context7 MCP Server for any client automatically via Smithery:

bash
npx -y @smithery/cli@latest install @upstash/context7-mcp --client <CLIENT_NAME> --key <YOUR_SMITHERY_KEY>

You can find your Smithery key on the Smithery.ai webpage.

Go to: Settings -> Cursor Settings -> MCP -> Add new global MCP server

Pasting the following configuration into your Cursor ~/.cursor/mcp.json file is the recommended approach. You may also install in a specific project by creating .cursor/mcp.json in your project folder. See Cursor MCP docs for more info.

Since Cursor 1.0, you can click the install button below for instant one-click installation.

Cursor Remote Server Connection

Install MCP Server

json
{
  "mcpServers": {
    "context7": {
      "url": "https://mcp.context7.com/mcp",
      "headers": {
        "CONTEXT7_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}

Cursor Local Server Connection

Install MCP Server

json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"]
    }
  }
}

Run this command. See Claude Code MCP docs for more info.

Claude Code Remote Server Connection

sh
claude mcp add --transport http context7 https://mcp.context7.com/mcp --header "CONTEXT7_API_KEY: YOUR_API_KEY"

Claude Code Local Server Connection

sh
claude mcp add context7 -- npx -y @upstash/context7-mcp --api-key YOUR_API_KEY

Run this command in your terminal. See Amp MCP docs for more info.

Without API Key (Basic Usage)

sh
amp mcp add context7 https://mcp.context7.com/mcp

With API Key (Higher Rate Limits & Private Repos)

sh
amp mcp add context7 --header "CONTEXT7_API_KEY=YOUR_API_KEY" https://mcp.context7.com/mcp

Add this to your Windsurf MCP config file. See Windsurf MCP docs for more info.

Windsurf Remote Server Connection

json
{
  "mcpServers": {
    "context7": {
      "serverUrl": "https://mcp.context7.com/mcp",
      "headers": {
        "CONTEXT7_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}

Windsurf Local Server Connection

json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"]
    }
  }
}

Add this to your VS Code MCP config file. See VS Code MCP docs for more info.

VS Code Remote Server Connection

json
"mcp": {
  "servers": {
    "context7": {
      "type": "http",
      "url": "https://mcp.context7.com/mcp",
      "headers": {
        "CONTEXT7_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}

VS Code Local Server Connection

json
"mcp": {
  "servers": {
    "context7": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"]
    }
  }
}

You can easily install Context7 through the Cline MCP Server Marketplace by following these instructions:

  1. Open Cline.
  2. Click the hamburger menu icon (☰) to enter the MCP Servers section.
  3. Use the search bar within the Marketplace tab to find Context7.
  4. Click the Install button.

Or you can directly edit MCP servers configuration:

  1. Open Cline.
  2. Click the hamburger menu icon (☰) to enter the MCP Servers section.
  3. Choose Remote Servers tab.
  4. Click the Edit Configuration button.
  5. Add context7 to mcpServers:
json
{
  "mcpServers": {
    "context7": {
      "url": "https://mcp.context7.com/mcp",
      "type": "streamableHttp",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}

It can be installed via Zed Extensions or you can add this to your Zed settings.json. See Zed Context Server docs for more info.

json
{
  "context_servers": {
    "Context7": {
      "source": "custom",
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"]
    }
  }
}

To configure Context7 MCP in Augment Code, you can use either the graphical interface or manual configuration.

A. Using the Augment Code UI

  1. Click the hamburger menu.

  2. Select Settings.

  3. Navigate to the Tools section.

  4. Click the + Add MCP button.

  5. Enter the following command:

    npx -y @upstash/context7-mcp@latest
    
  6. Name the MCP: Context7.

  7. Click the Add button.

Once the MCP server is added, you can start using Context7's up-to-date code documentation features directly within Augment Code.


B. Manual Configuration

  1. Press Cmd/Ctrl Shift P or go to the hamburger menu in the Augment panel
  2. Select Edit Settings
  3. Under Advanced, click Edit in settings.json
  4. Add the server configuration to the mcpServers array in the augment.advanced object
json
"augment.advanced": {
  "mcpServers": [
    {
      "name": "context7",
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"]
    }
  ]
}

Once the MCP server is added, restart your editor. If you receive any errors, check the syntax to make sure closing brackets or commas are not missing.

Add this to your Roo Code MCP configuration file. See Roo Code MCP docs for more info.

Roo Code Remote Server Connection

json
{
  "mcpServers": {
    "context7": {
      "type": "streamable-http",
      "url": "https://mcp.context7.com/mcp",
      "headers": {
        "CONTEXT7_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}

Roo Code Local Server Connection

json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"]
    }
  }
}

See Gemini CLI Configuration for details.

  1. Open the Gemini CLI settings file. The location is ~/.gemini/settings.json (where ~ is your home directory).
  2. Add the following to the mcpServers object in your settings.json file:
json
{
  "mcpServers": {
    "context7": {
      "httpUrl": "https://mcp.context7.com/mcp",
      "headers": {
        "CONTEXT7_API_KEY": "YOUR_API_KEY",
        "Accept": "application/json, text/event-stream"
      }
    }
  }
}

Or, for a local server:

json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"]
    }
  }
}

If the mcpServers object does not exist, create it.

See Qwen Coder MCP Configuration for details.

  1. Open the Qwen Coder settings file. The location is ~/.qwen/settings.json (where ~ is your home directory).
  2. Add the following to the mcpServers object in your settings.json file:
json
{
  "mcpServers": {
    "context7": {
      "httpUrl": "https://mcp.context7.com/mcp",
      "headers": {
        "CONTEXT7_API_KEY": "YOUR_API_KEY",
        "Accept": "application/json, text/event-stream"
      }
    }
  }
}

Or, for a local server:

json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"]
    }
  }
}

If the mcpServers object does not exist, create it.

Remote Server Connection

Open Claude Desktop and navigate to Settings > Connectors > Add Custom Connector. Enter the name as Context7 and the remote MCP server URL as https://mcp.context7.com/mcp.

Local Server Connection

Open Claude Desktop developer settings and edit your claude_desktop_config.json file to add the following configuration. See Claude Desktop MCP docs for more info.

json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"]
    }
  }
}

Add this to your Opencode configuration file. See Opencode MCP docs for more info.

Opencode Remote Server Connection

json
"mcp": {
  "context7": {
    "type": "remote",
    "url": "https://mcp.context7.com/mcp",
    "headers": {
      "CONTEXT7_API_KEY": "YOUR_API_KEY"
    },
    "enabled": true
  }
}

Opencode Local Server Connection

json
{
  "mcp": {
    "context7": {
      "type": "local",
      "command": ["npx", "-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"],
      "enabled": true
    }
  }
}

See OpenAI Codex for more information.

Add the following configuration to your OpenAI Codex MCP server settings:

Local Server Connection

toml
[mcp_servers.context7]
args = ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"]
command = "npx"
startup_timeout_ms = 20_000

Remote Server Connection

toml
[mcp_servers.context7]
url = "https://mcp.context7.com/mcp"
http_headers = { "CONTEXT7_API_KEY" = "YOUR_API_KEY" }

Optional troubleshooting, needed only if you see startup errors such as "request timed out" or "not found program". Most users can ignore this.

  • First try: increase startup_timeout_ms to 40_000 and retry.
  • Windows quick fix (absolute npx path + explicit env):
toml
[mcp_servers.context7]
command = "C:\\Users\\yourname\\AppData\\Roaming\\npm\\npx.cmd"
args = [
  "-y",
  "@upstash/context7-mcp",
  "--api-key",
  "YOUR_API_KEY"
]
env = { SystemRoot="C:\\Windows", APPDATA="C:\\Users\\yourname\\AppData\\Roaming" }
startup_timeout_ms = 40_000
  • macOS quick fix (use Node + installed package entry point):
toml
[mcp_servers.context7]
command = "/Users/yourname/.nvm/versions/node/v22.14.0/bin/node"
args = ["/Users/yourname/.nvm/versions/node/v22.14.0/lib/node_modules/@upstash/context7-mcp/dist/index.js",
  "--transport",
  "stdio",
  "--api-key",
  "YOUR_API_KEY"
]

Notes: Replace yourname with your OS username. Explicitly setting APPDATA and SystemRoot matters because npx requires them on Windows, but some versions of the OpenAI Codex MCP client do not set them by default.

See JetBrains AI Assistant Documentation for more details.

  1. In JetBrains IDEs, go to Settings -> Tools -> AI Assistant -> Model Context Protocol (MCP)
  2. Click + Add.
  3. Click on Command in the top-left corner of the dialog and select the As JSON option from the list
  4. Add this configuration and click OK
json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"]
    }
  }
}
  1. Click Apply to save changes.
  2. In the same way, context7 can be added for JetBrains Junie under Settings -> Tools -> Junie -> MCP Settings

See Kiro Model Context Protocol Documentation for details.

  1. Navigate to Kiro > MCP Servers
  2. Add a new MCP server by clicking the + Add button.
  3. Paste the configuration given below:
json
{
  "mcpServers": {
    "Context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"],
      "env": {},
      "disabled": false,
      "autoApprove": []
    }
  }
}
  1. Click Save to apply the changes.

Use the Add manually feature and fill in the JSON configuration information for that MCP server. For more details, visit the Trae documentation.

Trae Remote Server Connection

json
{
  "mcpServers": {
    "context7": {
      "url": "https://mcp.context7.com/mcp"
    }
  }
}

Trae Local Server Connection

json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"]
    }
  }
}

Use these alternatives to run the local Context7 MCP server with other runtimes. These examples work for any client that supports launching a local MCP server via command + args.

Bun

json
{
  "mcpServers": {
    "context7": {
      "command": "bunx",
      "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"]
    }
  }
}

Deno

json
{
  "mcpServers": {
    "context7": {
      "command": "deno",
      "args": [
        "run",
        "--allow-env=NO_DEPRECATION,TRACE_DEPRECATION",
        "--allow-net",
        "npm:@upstash/context7-mcp"
      ]
    }
  }
}

If you prefer to run the MCP server in a Docker container:

  1. Build the Docker Image:

    First, create a Dockerfile in the project root (or anywhere you prefer):

    Dockerfile
    FROM node:18-alpine
    
    WORKDIR /app
    
    # Install the latest version globally
    RUN npm install -g @upstash/context7-mcp
    
    # Expose default port if needed (optional, depends on MCP client interaction)
    # EXPOSE 3000
    
    # Default command to run the server
    CMD ["context7-mcp"]
    

    Then, build the image using a tag (e.g., context7-mcp). Make sure Docker Desktop (or the Docker daemon) is running. Run the following command in the same directory where you saved the Dockerfile:

    bash
    docker build -t context7-mcp .
    
  2. Configure Your MCP Client:

    Update your MCP client's configuration to use the Docker command.

    Example for a cline_mcp_settings.json:

    json
    {
      "mcpServers": {
        "Сontext7": {
          "autoApprove": [],
          "disabled": false,
          "timeout": 60,
          "command": "docker",
          "args": ["run", "-i", "--rm", "context7-mcp"],
          "transportType": "stdio"
        }
      }
    }
    

    Note: This is an example configuration. Please refer to the specific examples for your MCP client (like Cursor, VS Code, etc.) earlier in this README to adapt the structure (e.g., mcpServers vs servers). Also, ensure the image name in args matches the tag used during the docker build command.

Install the context7.mcpb file under the mcpb folder and add it to your client. For more information, please check out MCP bundles docs.

The configuration on Windows is slightly different from Linux or macOS (Cline is used in the example below). The same principle applies to other editors; adjust the command and args accordingly.

json
{
  "mcpServers": {
    "github.com/upstash/context7-mcp": {
      "command": "cmd",
      "args": ["/c", "npx", "-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"],
      "disabled": false,
      "autoApprove": []
    }
  }
}

Add this to your Amazon Q Developer CLI configuration file. See Amazon Q Developer CLI docs for more details.

json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"]
    }
  }
}

See Warp Model Context Protocol Documentation for details.

  1. Navigate to Settings > AI > Manage MCP servers.
  2. Add a new MCP server by clicking the + Add button.
  3. Paste the configuration given below:
json
{
  "Context7": {
    "command": "npx",
    "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"],
    "env": {},
    "working_directory": null,
    "start_on_launch": true
  }
}
  1. Click Save to apply the changes.

Using Context7 with Copilot Coding Agent

Add the following configuration to the mcp section of your Copilot Coding Agent configuration file (Repository -> Settings -> Copilot -> Coding agent -> MCP configuration):

json
{
  "mcpServers": {
    "context7": {
      "type": "http",
      "url": "https://mcp.context7.com/mcp",
      "headers": {
        "CONTEXT7_API_KEY": "YOUR_API_KEY"
      },
      "tools": ["get-library-docs", "resolve-library-id"]
    }
  }
}

For more information, see the official GitHub documentation.

  1. Open the Copilot CLI MCP config file. The location is ~/.copilot/mcp-config.json (where ~ is your home directory).
  2. Add the following to the mcpServers object in your mcp-config.json file:
json
{
  "mcpServers": {
    "context7": {
      "type": "http",
      "url": "https://mcp.context7.com/mcp",
      "headers": {
        "CONTEXT7_API_KEY": "YOUR_API_KEY"
      },
      "tools": [
        "get-library-docs", 
        "resolve-library-id"
      ]
    }
  }
}

Or, for a local server:

json
{
  "mcpServers": {
    "context7": {
      "type": "local",
      "command": "npx",
      "tools": [
        "get-library-docs", 
        "resolve-library-id"
      ],
      "args": [
        "-y",
        "@upstash/context7-mcp",
        "--api-key",
        "YOUR_API_KEY"
      ]
    }
  }
}

If the mcp-config.json file does not exist, create it.

See LM Studio MCP Support for more information.

One-click install:

Add MCP Server context7 to LM Studio

Manual set-up:

  1. Navigate to Program (right side) > Install > Edit mcp.json.
  2. Paste the configuration given below:
json
{
  "mcpServers": {
    "Context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"]
    }
  }
}
  1. Click Save to apply the changes.
  2. Toggle the MCP server on/off from the right hand side, under Program, or by clicking the plug icon at the bottom of the chat box.

You can configure Context7 MCP in Visual Studio 2022 by following the Visual Studio MCP Servers documentation.

Add this to your Visual Studio MCP config file (see the Visual Studio docs for details):

json
{
  "inputs": [],
  "servers": {
    "context7": {
      "type": "http",
      "url": "https://mcp.context7.com/mcp",
      "headers": {
        "CONTEXT7_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}

Or, for a local server:

json
{
  "mcp": {
    "servers": {
      "context7": {
        "type": "stdio",
        "command": "npx",
        "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"]
      }
    }
  }
}

For more information and troubleshooting, refer to the Visual Studio MCP Servers documentation.

Add this to your Crush configuration file. See Crush MCP docs for more info.

Crush Remote Server Connection (HTTP)

json
{
  "$schema": "https://charm.land/crush.json",
  "mcp": {
    "context7": {
      "type": "http",
      "url": "https://mcp.context7.com/mcp",
      "headers": {
        "CONTEXT7_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}

Crush Local Server Connection

json
{
  "$schema": "https://charm.land/crush.json",
  "mcp": {
    "context7": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"]
    }
  }
}

Open the "Settings" page of the app, navigate to "Plugins," and enter the following JSON:

json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"]
    }
  }
}

Once saved, type get-library-docs followed by your Context7 documentation ID in the chat (e.g., get-library-docs /nuxt/ui). More information is available on BoltAI's Documentation site. For BoltAI on iOS, see this guide.

Edit your Rovo Dev CLI MCP config by running the command below:

bash
acli rovodev mcp

Example config:

Remote Server Connection

json
{
  "mcpServers": {
    "context7": {
      "url": "https://mcp.context7.com/mcp"
    }
  }
}

Local Server Connection

json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"]
    }
  }
}

To configure Context7 MCP in Zencoder, follow these steps:

  1. Go to the Zencoder menu (...)
  2. From the dropdown menu, select Agent tools
  3. Click Add custom MCP
  4. Add the name and the server configuration below, then click the Install button
json
{
  "command": "npx",
  "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"]
}

Once the MCP server is added, you can start using it right away.

See Qodo Gen docs for more details.

  1. Open Qodo Gen chat panel in VSCode or IntelliJ.
  2. Click Connect more tools.
  3. Click + Add new MCP.
  4. Add the following configuration:

Qodo Gen Local Server Connection

json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"]
    }
  }
}

Qodo Gen Remote Server Connection

json
{
  "mcpServers": {
    "context7": {
      "url": "https://mcp.context7.com/mcp"
    }
  }
}

See Local and Remote MCPs for Perplexity for more information.

  1. Navigate to Perplexity > Settings
  2. Select Connectors.
  3. Click Add Connector.
  4. Select Advanced.
  5. Enter Server Name: Context7
  6. Paste the following JSON in the text area:
json
{
  "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"],
  "command": "npx",
  "env": {}
}
  1. Click Save.

Factory's droid supports MCP servers through its CLI. See Factory MCP docs for more info.

Factory Remote Server Connection (HTTP)

Run this command in your terminal:

sh
droid mcp add context7 https://mcp.context7.com/mcp --type http --header "CONTEXT7_API_KEY: YOUR_API_KEY"

Or without an API key (basic usage with rate limits):

sh
droid mcp add context7 https://mcp.context7.com/mcp --type http

Factory Local Server Connection (Stdio)

Run this command in your terminal:

sh
droid mcp add context7 "npx -y @upstash/context7-mcp" --env CONTEXT7_API_KEY=YOUR_API_KEY

Once configured, Context7 tools will be available in your droid sessions. Type /mcp within droid to manage servers, authenticate, and view available tools.

🔨 Available Tools

Context7 MCP provides the following tools that LLMs can use; a protocol-level sketch of these calls appears after the list:

  • resolve-library-id: Resolves a general library name into a Context7-compatible library ID.

    • libraryName (required): The name of the library to search for
  • get-library-docs: Fetches documentation for a library using a Context7-compatible library ID.

    • context7CompatibleLibraryID (required): Exact Context7-compatible library ID (e.g., /mongodb/docs, /vercel/next.js)
    • topic (optional): Focus the docs on a specific topic (e.g., "routing", "hooks")
    • tokens (optional, default 5000): Max number of tokens to return. Values less than 1000 are automatically increased to 1000.
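
If you are curious what these calls look like on the wire, here is a minimal sketch of the MCP tools/call requests a client would send, assuming the standard JSON-RPC 2.0 framing that MCP uses; the argument values are illustrative, and the argument names mirror the parameters listed above.

json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "resolve-library-id",
    "arguments": { "libraryName": "next.js" }
  }
}
json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "get-library-docs",
    "arguments": {
      "context7CompatibleLibraryID": "/vercel/next.js",
      "topic": "routing",
      "tokens": 5000
    }
  }
}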

🛟 Tips

Add a Rule

If you don’t want to add use context7 to every prompt, you can define a simple rule in your MCP client's rule section:

  • For Windsurf, in .windsurfrules file
  • For Cursor, from Cursor Settings > Rules section
  • For Claude Code, in CLAUDE.md file

Or the equivalent in your MCP client to auto-invoke Context7 on any code question.

Example Rule

txt
Always use context7 when I need code generation, setup or configuration steps, or
library/API documentation. This means you should automatically use the Context7 MCP
tools to resolve library id and get library docs without me having to explicitly ask.

From then on, you’ll get Context7’s docs in any related conversation without typing anything extra. You can alter the rule to match your use cases.

Use Library Id

If you already know exactly which library you want to use, add its Context7 ID to your prompt. That way, the Context7 MCP server can skip the library-matching step and go straight to retrieving docs.

txt
Implement basic authentication with Supabase. use library /supabase/supabase for API and docs.

The slash syntax tells the MCP tool exactly which library to load docs for.

HTTPS Proxy

If you are behind an HTTPS proxy, Context7 respects the standard https_proxy / HTTPS_PROXY environment variables.
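
For example, with a local server launched via npx, you can pass the proxy through the env block of your MCP client configuration. This is a minimal sketch; the proxy URL below is a placeholder for your own proxy address.

json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp", "--api-key", "YOUR_API_KEY"],
      "env": {
        "HTTPS_PROXY": "http://proxy.example.com:8080"
      }
    }
  }
}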

💻 Development

Clone the project and install dependencies:

bash
bun i

Build:

bash
bun run build

Run the server:

bash
bun run dist/index.js

CLI Arguments

context7-mcp accepts the following CLI flags:

  • --transport <stdio|http> – Transport to use (stdio by default). Use http for remote HTTP server or stdio for local integration.
  • --port <number> – Port to listen on when using http transport (default 3000).
  • --api-key <key> – API key for authentication (or set CONTEXT7_API_KEY env var). You can get your API key by creating an account at context7.com/dashboard.

Example with HTTP transport and port 8080:

bash
bun run dist/index.js --transport http --port 8080

Another example with stdio transport:

bash
bun run dist/index.js --transport stdio --api-key YOUR_API_KEY

Environment Variables

You can use the CONTEXT7_API_KEY environment variable instead of passing the --api-key flag. This is useful for:

  • Storing API keys securely in .env files
  • Integration with MCP server setups that use dotenv
  • Tools that prefer environment variable configuration

Note: The --api-key CLI flag takes precedence over the environment variable when both are provided.

Example with .env file:

bash
# .env
CONTEXT7_API_KEY=your_api_key_here

Example MCP configuration using environment variable:

json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"],
      "env": {
        "CONTEXT7_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}
Local configuration example for running the server from a cloned source checkout:

json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["tsx", "/path/to/folder/context7/src/index.ts", "--api-key", "YOUR_API_KEY"]
    }
  }
}
Testing with MCP Inspector:

bash
npx -y @modelcontextprotocol/inspector npx @upstash/context7-mcp

🚨 Troubleshooting

If you encounter ERR_MODULE_NOT_FOUND, try using bunx instead of npx:

json
{
  "mcpServers": {
    "context7": {
      "command": "bunx",
      "args": ["-y", "@upstash/context7-mcp"]
    }
  }
}

This often resolves module resolution issues in environments where npx doesn't properly install or resolve packages.

For errors like Error: Cannot find module 'uriTemplate.js', try the --experimental-vm-modules flag:

json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "--node-options=--experimental-vm-modules", "@upstash/context7-mcp@1.0.6"]
    }
  }
}

Use the --experimental-fetch flag to bypass TLS-related problems:

json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "--node-options=--experimental-fetch", "@upstash/context7-mcp"]
    }
  }
}
For general MCP client errors:

  1. Try adding @latest to the package name
  2. Use bunx as an alternative to npx
  3. Consider using deno as another alternative
  4. Ensure you're using Node.js v18 or higher for native fetch support

⚠️ Disclaimer

1- Context7 projects are community-contributed and while we strive to maintain high quality, we cannot guarantee the accuracy, completeness, or security of all library documentation. Projects listed in Context7 are developed and maintained by their respective owners, not by Context7. If you encounter any suspicious, inappropriate, or potentially harmful content, please use the "Report" button on the project page to notify us immediately. We take all reports seriously and will review flagged content promptly to maintain the integrity and safety of our platform. By using Context7, you acknowledge that you do so at your own discretion and risk.

2- This repository hosts the MCP server’s source code. The supporting components — API backend, parsing engine, and crawling engine — are private and not part of this release.

🤝 Connect with Us

Stay updated and join our community:

📺 Context7 In Media

⭐ Star History

Star History Chart

📄 License

MIT


Repository Owner

upstash (Organization)

Repository Details

Language: JavaScript
Default Branch: master
Size: 4,954 KB
Contributors: 30
License: MIT License
MCP Verified: Nov 12, 2025

Programming Languages

JavaScript: 58.2%
TypeScript: 38.71%
Dockerfile: 3.09%

Topics

llm mcp mcp-server vibe-coding


Related MCPs

Discover similar Model Context Protocol servers

  • Unichat MCP Server


    Universal MCP server providing context-aware AI chat and code tools across major model vendors.

    Unichat MCP Server enables sending standardized requests to leading AI model vendors, including OpenAI, MistralAI, Anthropic, xAI, Google AI, DeepSeek, Alibaba, and Inception, utilizing the Model Context Protocol. It features unified endpoints for chat interactions and provides specialized tools for code review, documentation generation, code explanation, and programmatic code reworking. The server is designed for seamless integration with platforms like Claude Desktop and installation via Smithery. Vendor API keys are required for secure access to supported providers.

    • 37
    • MCP
    • amidabuddha/unichat-mcp-server
  • MCP CLI


    A powerful CLI for seamless interaction with Model Context Protocol servers and advanced LLMs.

    MCP CLI is a modular command-line interface designed for interacting with Model Context Protocol (MCP) servers and managing conversations with large language models. It integrates with the CHUK Tool Processor and CHUK-LLM to provide real-time chat, interactive command shells, and automation capabilities. The system supports a wide array of AI providers and models, advanced tool usage, context management, and performance metrics. Rich output formatting, concurrent tool execution, and flexible configuration make it suitable for both end-users and developers.

    • 1,755
    • MCP
    • chrishayuk/mcp-cli
  • Lucidity MCP


    Intelligent prompt-based code quality analysis for AI coding assistants.

    Lucidity MCP is a Model Context Protocol (MCP) server that empowers AI coding assistants to deliver high-quality code through intelligent, prompt-driven analysis. It offers comprehensive detection of code issues across multiple quality dimensions, providing structured and actionable feedback. With language-agnostic capabilities, extensible framework, and flexible transport options, Lucidity MCP seamlessly integrates into developer workflows and AI systems.

    • 72
    • MCP
    • hyperb1iss/lucidity-mcp
  • godoc-mcp


    Token-efficient Go documentation server for LLMs using Model Context Protocol.

    godoc-mcp is a Model Context Protocol (MCP) server that provides efficient, structured access to Go package documentation for large language models. It enables LLMs to understand Go projects without reading entire source files by supplying essential documentation and source code at varying levels of granularity. The tool supports project navigation, automatic module setup, caching, and works offline for both standard and third-party Go packages.

    • 88
    • MCP
    • mrjoshuak/godoc-mcp
  • NyxDocs


    MCP server for real-time cryptocurrency project documentation and insights.

    NyxDocs is a Model Context Protocol (MCP) compatible server built in Python for managing and serving up-to-date documentation for cryptocurrency projects. It aggregates information from multiple sources such as CoinGecko, GitHub, GitBook, Notion, and official websites, providing real-time data and updates on blockchain ecosystems. Featuring tools for searching projects, retrieving detailed info, extracting documentation, and monitoring changes, it is tailored for developers and AI contexts needing access to accurate crypto documentation. The architecture leverages a FastMCP-based server core, automated document scrapers, and supports multi-blockchain environments.

    • 3
    • MCP
    • nyxn-ai/NyxDocs
  • kibitz


    The coding agent for professionals with MCP integration.

    kibitz is a coding agent that supports advanced AI collaboration by enabling seamless integration with Model Context Protocol (MCP) servers via WebSockets. It allows users to configure Anthropic API keys, system prompts, and custom context providers for each project, enhancing contextual understanding for coding tasks. The platform is designed for developers and professionals seeking tailored AI-driven coding workflows and provides flexible project-specific configuration.

    • 104
    • MCP
    • nick1udwig/kibitz