Ragie Model Context Protocol Server
Seamless knowledge base retrieval via Model Context Protocol for enhanced AI context.
A Model Context Protocol (MCP) server that provides access to Ragie's knowledge base retrieval capabilities.
Description
This server implements the Model Context Protocol to enable AI models to retrieve information from a Ragie knowledge base. It provides a single tool called "retrieve" that allows querying the knowledge base for relevant information.
Prerequisites
- Node.js >= 18
- A Ragie API key
Installation
Install and run the server with npx:

RAGIE_API_KEY=your_api_key npx @ragieai/mcp-server

The server requires the following environment variable:

- RAGIE_API_KEY (required): Your Ragie API authentication key

The server will start and listen on stdio for MCP protocol messages.
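Once running, the server exchanges newline-delimited JSON-RPC messages over stdin/stdout. As a rough sketch (after the standard MCP initialize handshake, with the response abbreviated), a client listing the available tools would send and receive something like:

{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
{"jsonrpc": "2.0", "id": 1, "result": {"tools": [{"name": "retrieve", "description": "..."}]}}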
Command Line Options
The server supports the following command line options:
- --description, -d <text>: Override the default tool description with custom text
- --partition, -p <id>: Specify the Ragie partition ID to query
Examples:
# With custom description
RAGIE_API_KEY=your_api_key npx @ragieai/mcp-server --description "Search the company knowledge base for information"
# With partition specified
RAGIE_API_KEY=your_api_key npx @ragieai/mcp-server --partition your_partition_id
# Using both options
RAGIE_API_KEY=your_api_key npx @ragieai/mcp-server --description "Search the company knowledge base" --partition your_partition_id
Cursor Configuration
To use this MCP server with Cursor:
Option 1: Create an MCP configuration file
- Save a file called mcp.json:
  - For tools specific to a project, create a .cursor/mcp.json file in your project directory. This allows you to define MCP servers that are only available within that specific project.
  - For tools that you want to use across all projects, create a ~/.cursor/mcp.json file in your home directory. This makes MCP servers available in all your Cursor workspaces.
Example mcp.json:
{
"mcpServers": {
"ragie": {
"command": "npx",
"args": [
"-y",
"@ragieai/mcp-server",
"--partition",
"optional_partition_id"
],
"env": {
"RAGIE_API_KEY": "your_api_key"
}
}
}
}
Option 2: Use a shell script
- Save a file called ragie-mcp.sh on your system:

#!/usr/bin/env bash
export RAGIE_API_KEY="your_api_key"
npx -y @ragieai/mcp-server --partition optional_partition_id

- Give the file execute permissions:

chmod +x ragie-mcp.sh

- Add the MCP server script by going to Settings -> Cursor Settings -> MCP Servers in the Cursor UI.
Replace your_api_key with your actual Ragie API key and optionally set the partition ID if needed.
Claude Desktop Configuration
To use this MCP server with Claude Desktop:
- Create the MCP config file claude_desktop_config.json:
  - For MacOS: Use ~/Library/Application Support/Claude/claude_desktop_config.json
  - For Windows: Use %APPDATA%/Claude/claude_desktop_config.json
Example claude_desktop_config.json:
{
"mcpServers": {
"ragie": {
"command": "npx",
"args": [
"-y",
"@ragieai/mcp-server",
"--partition",
"optional_partition_id"
],
"env": {
"RAGIE_API_KEY": "your_api_key"
}
}
}
}
Replace your_api_key with your actual Ragie API key and optionally set the partition ID if needed.
- Restart Claude Desktop for the changes to take effect.
The Ragie retrieval tool will now be available in your Claude Desktop conversations.
Features
Retrieve Tool
The server provides a retrieve tool that can be used to search the knowledge base. It accepts the following parameters:
- query (string): The search query to find relevant information
- topK (number, optional, default: 8): The maximum number of results to return
- rerank (boolean, optional, default: true): Whether to try and find only the most relevant information
- recencyBias (boolean, optional, default: false): Whether to favor results towards more recent information
The tool returns:
- An array of content chunks containing matching text from the knowledge base
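From an MCP client, invoking the tool is a standard tools/call request. The sketch below is illustrative only; the query text and argument values are placeholders:

{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "retrieve",
    "arguments": {
      "query": "How do I rotate my API key?",
      "topK": 4,
      "rerank": true,
      "recencyBias": false
    }
  }
}

The result comes back as MCP text content items, one per matching chunk.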
Development
This project is written in TypeScript and uses the following main dependencies:
- @modelcontextprotocol/sdk: For implementing the MCP server
- ragie: For interacting with the Ragie API
- zod: For runtime type validation
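As an illustration of how these dependencies fit together, here is a minimal sketch of a stdio MCP server exposing a retrieve tool. It assumes the McpServer and StdioServerTransport APIs from @modelcontextprotocol/sdk and a Ragie client with a retrievals.retrieve method; it is not the actual source of this package, and the Ragie SDK call shape should be checked against the ragie package docs.

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { Ragie } from "ragie";
import { z } from "zod";

// The real server reads the same RAGIE_API_KEY environment variable.
const ragie = new Ragie({ auth: process.env.RAGIE_API_KEY! });

const server = new McpServer({ name: "ragie", version: "0.1.0" });

// Register a "retrieve" tool; zod validates the incoming arguments at runtime.
server.tool(
  "retrieve",
  "Look up relevant information from the Ragie knowledge base.",
  {
    query: z.string(),
    topK: z.number().optional().default(8),
  },
  async ({ query, topK }) => {
    // Assumed Ragie SDK call and response shape -- verify against the ragie package.
    const result = await ragie.retrievals.retrieve({ query, topK });
    return {
      content: result.scoredChunks.map((chunk) => ({
        type: "text" as const,
        text: chunk.text,
      })),
    };
  }
);

// Serve MCP requests over stdin/stdout.
await server.connect(new StdioServerTransport());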
Development setup
Running the server in dev mode:
RAGIE_API_KEY=your_api_key npm run dev -- --partition optional_partition_id
Building the project:
npm run build
License
MIT License - See LICENSE.txt for details.
Related MCPs
Discover similar Model Context Protocol servers
Biel.ai MCP Server
Seamlessly connect IDEs to your company’s product documentation using an MCP server.
Biel.ai MCP Server enables AI tools such as Cursor, VS Code, and Claude Desktop to access and utilize a company’s product documentation and knowledge base through the Model Context Protocol. It provides a hosted RAG layer that makes documentation searchable and usable, supporting real-time, context-rich completion and answers for developers. The server can be used as a hosted solution or self-hosted locally or via Docker for advanced customization.
- ⭐ 2
- MCP
- TechDocsStudio/biel-mcp
Driflyte MCP Server
Bridging AI assistants with deep, topic-aware knowledge from web and code sources.
Driflyte MCP Server acts as a bridge between AI-powered assistants and diverse, topic-aware content sources by exposing a Model Context Protocol (MCP) server. It enables retrieval-augmented generation workflows by crawling, indexing, and serving topic-specific documents from web pages and GitHub repositories. The system is extensible, with planned support for additional knowledge sources, and is designed for easy integration with popular AI tools such as ChatGPT, Claude, and VS Code.
- ⭐ 9
- MCP
- serkan-ozal/driflyte-mcp-server
mcp-local-rag
Local RAG server for web search and context injection using Model Context Protocol.
mcp-local-rag is a local server implementing the Model Context Protocol (MCP) to provide retrieval-augmented generation (RAG) capabilities. It performs live web search, extracts relevant context using Google's MediaPipe Text Embedder, and supplies the information to large language models (LLMs) for enhanced, up-to-date responses. The tool is designed for easy local deployment, requiring no external APIs, and is compatible with multiple MCP clients. Security audits are available, and integration is demonstrated across several LLM platforms.
- ⭐ 89
- MCP
- nkapila6/mcp-local-rag
Yuque-MCP-Server
Seamless integration of Yuque knowledge base with Model-Context-Protocol for AI model context management.
Yuque-MCP-Server provides an MCP-compatible server for interacting with the Yuque knowledge base platform. It enables AI models to retrieve, manage, and analyze Yuque documents and user information through a standardized Model-Context-Protocol interface. The server supports operations such as document creation, reading, updating, deletion, advanced search, and team statistics retrieval, making it ideal for AI-powered workflows. Inspired by Figma-Context-MCP, it facilitates contextual awareness and dynamic knowledge management for AI applications.
- ⭐ 31
- MCP
- HenryHaoson/Yuque-MCP-Server
Dappier MCP Server
Real-time web search and premium data access for AI agents via Model Context Protocol.
Dappier MCP Server enables fast, real-time web search and access to premium data sources, including news, financial markets, sports, and weather, for AI agents using the Model Context Protocol (MCP). It integrates seamlessly with tools like Claude Desktop and Cursor, allowing users to enhance their AI workflows with up-to-date, trusted information. Simple installation and configuration are provided for multiple platforms, leveraging API keys for secure access. The solution supports deployment via Smithery and direct installation with 'uv', facilitating rapid setup for developers.
- ⭐ 35
- MCP
- DappierAI/dappier-mcp
Exa MCP Server
Fast, efficient web and code context for AI coding assistants.
Exa MCP Server provides a Model Context Protocol (MCP) server interface that connects AI assistants to Exa AI’s powerful search capabilities, including code, documentation, and web search. It enables coding agents to retrieve precise, token-efficient context from billions of sources such as GitHub, StackOverflow, and documentation sites, reducing hallucinations in coding agents. The platform supports integration with popular tools like Cursor, Claude, and VS Code through standardized MCP configuration, offering configurable access to various research and code-related tools via HTTP.
- ⭐ 3,224
- MCP
- exa-labs/exa-mcp-server