Vercel AI SDK Documentation MCP Agent

AI-powered documentation agent for Vercel AI SDK with Model Context Protocol support.

41 Stars · 8 Forks · 41 Watchers · 3 Issues
The Vercel AI SDK Documentation MCP Agent is a server implementing the Model Context Protocol to enable AI-powered search and conversational querying of the Vercel AI SDK documentation. It features natural language understanding, semantic search backed by a FAISS vector store, and session-based context management for in-depth assistance. Integration with popular MCP clients such as Claude Desktop and Cursor makes it easy to slot into developer workflows, while automated documentation indexing and a Gemini-powered agent keep responses accurate and contextually relevant.

Key Features

Direct natural language querying of Vercel AI SDK docs
Semantic similarity search with FAISS vector store
Session-based context management for conversations
Automated crawling and indexing of documentation
MCP server for integration with AI assistants
Gemini model-powered answering service
Direct semantic search via API
TypeScript and Node.js implementation
Integration with Claude Desktop and Cursor
Environment-variable configuration for API keys

Use Cases

Instantly answering developer questions about the Vercel AI SDK
Providing contextual and accurate responses based on official documentation
Assisting AI assistants with access to structured SDK documentation
Enabling natural language search interfaces for technical docs
Supporting ongoing technical support conversations via session management
Automating the process of keeping documentation indices up to date
Integrating knowledge bases into developer tools like code editors
Accelerating onboarding for teams working with the Vercel AI SDK
Reducing time spent searching through static documentation
Empowering advanced AI workflows with standardized context management

README

Vercel AI SDK Documentation MCP Agent

A Model Context Protocol (MCP) server that provides AI-powered search and querying capabilities for the Vercel AI SDK documentation. This project enables developers to ask questions about the Vercel AI SDK and receive accurate, contextualized responses based on the official documentation.

MCP Compatible · TypeScript · Node.js

Features

  • Direct Documentation Search: Query the Vercel AI SDK documentation index directly using similarity search
  • AI-Powered Agent: Ask natural language questions about the Vercel AI SDK and receive comprehensive answers
  • Session Management: Maintain conversation context across multiple queries
  • Automated Indexing: Includes tools to fetch, process, and index the latest Vercel AI SDK documentation

Architecture

This system consists of several key components (a sketch of how they might fit together follows the list):

  1. MCP Server: Exposes tools via the Model Context Protocol for integration with AI assistants
  2. DocumentFetcher: Crawls and processes the Vercel AI SDK documentation
  3. VectorStoreManager: Creates and manages the FAISS vector index for semantic search
  4. AgentService: Provides AI-powered answers to questions using the Google Gemini model
  5. DirectQueryService: Offers direct semantic search of the documentation
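
The sketch below is hedged: it assumes the official @modelcontextprotocol/sdk, LangChain's FaissStore bindings, and the Vercel AI SDK's Google provider; the actual repository may organize things differently, and the model id is illustrative. The index path matches the files/faiss_index directory shown in the project structure.

// Hedged sketch only: library choices and the model id are assumptions,
// not a description of the repository's real implementation.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";
import { FaissStore } from "@langchain/community/vectorstores/faiss";
import { GoogleGenerativeAIEmbeddings } from "@langchain/google-genai";
import { google } from "@ai-sdk/google";
import { generateText } from "ai";

const embeddings = new GoogleGenerativeAIEmbeddings({
  apiKey: process.env.GOOGLE_GENERATIVE_AI_API_KEY,
});

const server = new McpServer({ name: "vercel-ai-docs", version: "1.0.0" });

// DirectQueryService: plain similarity search against the FAISS index on disk.
server.tool(
  "direct-query",
  { query: z.string(), limit: z.number().default(5) },
  async ({ query, limit }) => {
    const store = await FaissStore.load("files/faiss_index", embeddings);
    const docs = await store.similaritySearch(query, limit);
    return {
      content: [{ type: "text", text: docs.map((d) => d.pageContent).join("\n---\n") }],
    };
  }
);

// AgentService: Gemini synthesizes an answer from retrieved documentation.
// Session-based memory is omitted from this sketch.
server.tool(
  "agent-query",
  { query: z.string(), sessionId: z.string() },
  async ({ query, sessionId }) => {
    const store = await FaissStore.load("files/faiss_index", embeddings);
    const context = await store.similaritySearch(query, 5);
    const { text } = await generateText({
      model: google("gemini-1.5-pro"), // illustrative model id
      system: "Answer questions using only the provided Vercel AI SDK documentation excerpts.",
      prompt: `${context.map((d) => d.pageContent).join("\n\n")}\n\nQuestion: ${query}`,
    });
    return { content: [{ type: "text", text }] };
  }
);

// MCP Server: expose the tools over stdio for clients such as Claude Desktop.
await server.connect(new StdioServerTransport());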

Setup Instructions

Prerequisites

  • Node.js 18+
  • npm
  • A Google API key for Gemini model access

Environment Variables

Create a .env file in the project root with the following variables:

GOOGLE_GENERATIVE_AI_API_KEY=your-google-api-key-here

You'll need to obtain a Google Gemini API key from Google AI Studio.
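
For reference, the Vercel AI SDK's Google provider reads this variable from the environment automatically, so loading the .env file at startup is enough. A minimal sketch (the model id is illustrative):

// Load .env, then create a Gemini model; @ai-sdk/google picks up
// GOOGLE_GENERATIVE_AI_API_KEY from process.env automatically.
import "dotenv/config";
import { google } from "@ai-sdk/google";

const model = google("gemini-1.5-pro"); // illustrative model id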

Installation

  1. Clone the repository

    git clone https://github.com/IvanAmador/vercel-ai-docs-mcp.git
    cd vercel-ai-docs-mcp
    
  2. Install dependencies

    npm install
    
  3. Build the project

    npm run build
    
  4. Build the documentation index

    npm run build:index
    
  5. Start the MCP server

    npm run start
    

Integration with Claude Desktop

Claude Desktop is a powerful AI assistant that supports MCP servers. To connect the Vercel AI SDK Documentation MCP agent with Claude Desktop:

  1. First, install Claude Desktop if you don't have it already.

  2. Open Claude Desktop settings (via the application menu, not within the chat interface).

  3. Navigate to the "Developer" tab and click "Edit Config".

  4. Add the Vercel AI Docs MCP server to your configuration:

{
  "mcpServers": {
    "vercel-ai-docs": {
      "command": "node",  
      "args": ["ABSOLUTE_PATH_TO_PROJECT/dist/main.js"],
      "env": {
        "GOOGLE_GENERATIVE_AI_API_KEY": "your-google-api-key-here"
      }
    }
  }
}

Make sure to replace:

  • ABSOLUTE_PATH_TO_PROJECT with the actual path to your project folder
  • your-google-api-key-here with your Google Gemini API key
  5. Save the config file and restart Claude Desktop.

  6. To verify the server is connected, look for the hammer (🔨) icon in the Claude chat interface.

For more detailed information about setting up MCP servers with Claude Desktop, visit the MCP Quickstart Guide.

Integration with Other MCP Clients

This MCP server is compatible with any client that implements the Model Context Protocol. For example:

Cursor

Cursor is an AI-powered code editor that supports MCP servers. To integrate with Cursor:

  1. Add a .cursor/mcp.json file to your project directory (for project-specific configuration) or a ~/.cursor/mcp.json file in your home directory (for global configuration).

  2. Add the following to your configuration file:

{
  "mcpServers": {
    "vercel-ai-docs": {
      "command": "node",  
      "args": ["ABSOLUTE_PATH_TO_PROJECT/dist/main.js"],
      "env": {
        "GOOGLE_GENERATIVE_AI_API_KEY": "your-google-api-key-here"
      }
    }
  }
}

For more information about using MCP with Cursor, refer to the Cursor MCP documentation.

Usage

The MCP server exposes three primary tools:

1. agent-query

Query the Vercel AI SDK documentation using an AI agent that can search and synthesize information.

{
  "name": "agent-query",
  "arguments": {
    "query": "How do I use the streamText function?",
    "sessionId": "unique-session-id"
  }
}

2. direct-query

Perform a direct similarity search against the Vercel AI SDK documentation index.

{
  "name": "direct-query",
  "arguments": {
    "query": "streamText usage",
    "limit": 5
  }
}

3. clear-memory

Clears the conversation memory for a specific session or all sessions.

{
  "name": "clear-memory",
  "arguments": {
    "sessionId": "unique-session-id"
  }
}

To clear all sessions, omit the sessionId parameter.
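
The tools can also be exercised programmatically. Below is a hedged sketch using the MCP TypeScript client SDK; the server path, API key, and session id are placeholders.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the built server over stdio; replace the path and key placeholders.
const transport = new StdioClientTransport({
  command: "node",
  args: ["ABSOLUTE_PATH_TO_PROJECT/dist/main.js"],
  env: { GOOGLE_GENERATIVE_AI_API_KEY: "your-google-api-key-here" },
});

const client = new Client({ name: "docs-demo", version: "1.0.0" });
await client.connect(transport);

// Ask the agent a question, keeping conversation context under one session id.
const answer = await client.callTool({
  name: "agent-query",
  arguments: { query: "How do I use the streamText function?", sessionId: "demo-session" },
});
console.log(answer.content);

// Raw similarity search, then clear the session when finished.
await client.callTool({
  name: "direct-query",
  arguments: { query: "streamText usage", limit: 5 },
});
await client.callTool({ name: "clear-memory", arguments: { sessionId: "demo-session" } });
await client.close();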

Development

Project Structure

├── config/              # Configuration settings
├── core/                # Core functionality
│   ├── indexing/        # Document indexing and vector store
│   └── query/           # Query services (agent and direct)
├── files/               # Storage directories
│   ├── docs/            # Processed documentation
│   ├── faiss_index/     # Vector index files
│   └── sessions/        # Session data
├── mcp/                 # MCP server and tools
│   ├── server.ts        # MCP server implementation
│   └── tools/           # MCP tool definitions
├── scripts/             # Build and utility scripts
└── utils/               # Helper utilities

Build Scripts

  • npm run build: Compile TypeScript files
  • npm run build:index: Build the documentation index
  • npm run dev:index: Build and index in development mode
  • npm run dev: Build and start in development mode

Troubleshooting

Common Issues

  1. Index not found or failed to load

    Run npm run build:index to create the index before starting the server.

  2. API rate limits

    When Google API rate limits are exceeded, the agent service may return errors; use a backoff-and-retry strategy (see the sketch after this list).

  3. Model connection issues

    Ensure your Google API key is valid and has access to the specified Gemini model.

  4. Claude Desktop not showing MCP server

    • Check your configuration file for syntax errors.
    • Make sure the path to the server is correct and absolute.
    • Check Claude Desktop logs for errors.
    • Restart Claude Desktop after making configuration changes.
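
As a reference for the rate-limit case above, here is a minimal retry-with-exponential-backoff wrapper. It is illustrative only and not part of the repository; the agentService.query call in the usage comment is hypothetical.

// Retry an async call with exponential backoff: 1s, 2s, 4s, ... before giving up.
async function withBackoff<T>(fn: () => Promise<T>, retries = 3): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= retries) throw err;
      const delayMs = 1000 * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}

// Usage with a hypothetical agent call:
// const answer = await withBackoff(() => agentService.query(question, sessionId));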

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

MIT


Repository Owner

IvanAmador

User

Repository Details

Language: TypeScript
Default Branch: main
Size: 64 KB
Contributors: 1
MCP Verified: Nov 12, 2025

Programming Languages

TypeScript: 100%


Related MCPs

Discover similar Model Context Protocol servers

  • MCP-Typescribe

    An MCP server for serving TypeScript API context to language models.

    MCP-Typescribe is an open-source implementation of the Model Context Protocol (MCP) focused on providing LLMs with contextual, real-time access to TypeScript API documentation. It parses TypeScript (and other) definitions using TypeDoc-generated JSON and serves this information via a queryable server that supports tools used by AI coding assistants. The solution enables AI agents to dynamically explore, search, and understand unknown APIs, accelerating onboarding and supporting agentic behaviors in code generation.

    • ⭐ 45
    • MCP
    • yWorks/mcp-typescribe
  • Agentset MCP

    Open-source MCP server for Retrieval-Augmented Generation (RAG) document applications.

    Agentset MCP provides a Model Context Protocol (MCP) server designed to power context-aware, document-based applications using Retrieval-Augmented Generation. It enables developers to rapidly integrate intelligent context retrieval into their workflows and supports integration with AI platforms such as Claude. The server is easily installable via major JavaScript package managers and supports environment configuration for namespaces, tenant IDs, and tool descriptions.

    • ⭐ 22
    • MCP
    • agentset-ai/mcp-server
  • Exa MCP Server

    Fast, efficient web and code context for AI coding assistants.

    Exa MCP Server provides a Model Context Protocol (MCP) server interface that connects AI assistants to Exa AI's powerful search capabilities, including code, documentation, and web search. It enables coding agents to retrieve precise, token-efficient context from billions of sources such as GitHub, StackOverflow, and documentation sites, reducing hallucinations. The platform supports integration with popular tools like Cursor, Claude, and VS Code through standardized MCP configuration, offering configurable access to various research and code-related tools via HTTP.

    • ⭐ 3,224
    • MCP
    • exa-labs/exa-mcp-server
  • RAG Documentation MCP Server

    Vector-based documentation search and context augmentation for AI assistants

    RAG Documentation MCP Server provides vector-based search and retrieval tools for documentation, enabling large language models to reference relevant context in their responses. It supports managing multiple documentation sources, semantic search, and real-time context delivery. Documentation can be indexed, searched, and managed with queueing and processing features, making it highly suitable for AI-driven assistants. Integration with Claude Desktop and support for Qdrant vector databases is also available.

    • ⭐ 238
    • MCP
    • hannesrudolph/mcp-ragdocs
  • Azure MCP Server

    Connect AI agents with Azure services through Model Context Protocol.

    Azure MCP Server provides a seamless interface between AI agents and Azure services by implementing the Model Context Protocol (MCP) specification. It enables integration with tools like GitHub Copilot for Azure and supports a wide range of Azure resource management tasks directly via conversational AI interfaces. Designed for extensibility and compatibility, it offers enhanced contextual capabilities for agents working with Azure environments.

    • ⭐ 1,178
    • MCP
    • Azure/azure-mcp
  • Biel.ai MCP Server

    Seamlessly connect IDEs to your company's product documentation using an MCP server.

    Biel.ai MCP Server enables AI tools such as Cursor, VS Code, and Claude Desktop to access and utilize a company's product documentation and knowledge base through the Model Context Protocol. It provides a hosted RAG layer that makes documentation searchable and usable, supporting real-time, context-rich completion and answers for developers. The server can be used as a hosted solution or self-hosted locally or via Docker for advanced customization.

    • ⭐ 2
    • MCP
    • TechDocsStudio/biel-mcp