Kagi MCP Server
Integrate Kagi Search and Summarization as Model Context Protocol tools.
Kagi MCP server
Setup Instructions
Before anything else, ensure you have access to the Kagi Search API (unless you only plan to use the non-search tools). The Search API is currently in closed beta and available upon request; please reach out to support@kagi.com for an invite.
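To confirm your key has Search API access before wiring up the MCP server, you can call the Search API directly. The snippet below is a minimal sketch in Python, assuming the v0 search endpoint and "Bot" token authorization scheme from Kagi's API documentation; the requests dependency and the test query are illustrative only:
# check_kagi_access.py - minimal sketch to confirm Search API access
import os
import requests  # pip install requests

api_key = os.environ["KAGI_API_KEY"]
resp = requests.get(
    "https://kagi.com/api/v0/search",
    headers={"Authorization": f"Bot {api_key}"},
    params={"q": "kagi mcp server"},  # any test query
    timeout=30,
)
resp.raise_for_status()  # a 401/403 here usually means the key lacks Search API (beta) access
print(resp.json())  # search results are returned under the "data" key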
Install uv first.
MacOS/Linux:
curl -LsSf https://astral.sh/uv/install.sh | sh
Windows:
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
Installing via Smithery
Alternatively, you can install the Kagi MCP server for Claude Desktop via Smithery:
npx -y @smithery/cli install kagimcp --client claude
Setup with Claude
Claude Desktop
// claude_desktop_config.json
// Can find location through:
// Hamburger Menu -> File -> Settings -> Developer -> Edit Config
{
"mcpServers": {
"kagi": {
"command": "uvx",
"args": ["kagimcp"],
"env": {
"KAGI_API_KEY": "YOUR_API_KEY_HERE",
"KAGI_SUMMARIZER_ENGINE": "YOUR_ENGINE_CHOICE_HERE" // Defaults to "cecil" engine if env var not present
}
}
}
}
Claude Code
Add the Kagi MCP server with the following command (setting the summarizer engine is optional):
claude mcp add kagi -e KAGI_API_KEY="YOUR_API_KEY_HERE" KAGI_SUMMARIZER_ENGINE="YOUR_ENGINE_CHOICE_HERE" -- uvx kagimcp
Claude Code can now use the Kagi MCP server. However, Claude Code ships with its own web search functionality by default, which may conflict with Kagi search. You can disable Claude's built-in web search by adding the following to your Claude Code settings file (~/.claude/settings.json):
{
"permissions": {
"deny": [
"WebSearch"
]
}
}
Pose a query that requires use of a tool
e.g. "Who was Time's 2024 Person of the Year?" for search, or "summarize this video: https://www.youtube.com/watch?v=jNQXAC9IVRw" for the summarizer.
Debugging
Run:
npx @modelcontextprotocol/inspector uvx kagimcp
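As an alternative to the inspector, you can sanity-check the server programmatically. The following is a minimal sketch using the stdio client from the official MCP Python SDK (assumes pip install mcp and that KAGI_API_KEY is already exported in your shell):
# smoke_test.py - launch the server over stdio and list the tools it exposes
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="uvx",
    args=["kagimcp"],
    env=dict(os.environ),  # forward KAGI_API_KEY (and PATH, so uvx resolves)
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Exposed tools:", [tool.name for tool in tools.tools])

asyncio.run(main())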
Local/Dev Setup Instructions
Clone repo
git clone https://github.com/kagisearch/kagimcp.git
Install dependencies
Install uv first.
MacOS/Linux:
curl -LsSf https://astral.sh/uv/install.sh | sh
Windows:
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
Then install MCP server dependencies:
cd kagimcp
# Create virtual environment and activate it
uv venv
source .venv/bin/activate # MacOS/Linux
# OR
.venv/Scripts/activate # Windows
# Install dependencies
uv sync
Setup with Claude Desktop
Using MCP CLI SDK
# `pip install mcp[cli]` if you haven't
mcp install /ABSOLUTE/PATH/TO/PARENT/FOLDER/kagimcp/src/kagimcp/server.py -v "KAGI_API_KEY=API_KEY_HERE"
Manually
# claude_desktop_config.json
# Can find location through:
# Hamburger Menu -> File -> Settings -> Developer -> Edit Config
{
"mcpServers": {
"kagi": {
"command": "uv",
"args": [
"--directory",
"/ABSOLUTE/PATH/TO/PARENT/FOLDER/kagimcp",
"run",
"kagimcp"
],
"env": {
"KAGI_API_KEY": "YOUR_API_KEY_HERE",
"KAGI_SUMMARIZER_ENGINE": "YOUR_ENGINE_CHOICE_HERE" // Defaults to "cecil" engine if env var not present
}
}
}
}
Pose a query that requires use of a tool
e.g. "Who was Time's 2024 Person of the Year?" for search, or "summarize this video: https://www.youtube.com/watch?v=jNQXAC9IVRw" for the summarizer.
Debugging
Run:
# If mcp cli installed (`pip install mcp[cli]`)
mcp dev /ABSOLUTE/PATH/TO/PARENT/FOLDER/kagimcp/src/kagimcp/server.py
# If not
npx @modelcontextprotocol/inspector \
uv \
--directory /ABSOLUTE/PATH/TO/PARENT/FOLDER/kagimcp \
run \
kagimcp
Then access the MCP Inspector at http://localhost:5173. You may need to add your Kagi API key under KAGI_API_KEY in the inspector's environment variables.
Advanced Configuration
- The logging level is adjustable through the FASTMCP_LOG_LEVEL environment variable (e.g. FASTMCP_LOG_LEVEL="ERROR"). Relevant issue: https://github.com/kagisearch/kagimcp/issues/4
- The summarizer engine can be customized using the KAGI_SUMMARIZER_ENGINE environment variable (e.g. KAGI_SUMMARIZER_ENGINE="daphne"). Learn about the different summarization engines here. (A sketch of how these variables are read follows this list.)
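For reference, a hypothetical sketch of how these variables are typically consumed; the real lookup lives in kagimcp and FastMCP, and the "INFO" fallback below is an assumption (only the "cecil" summarizer default is documented above):
import os

log_level = os.environ.get("FASTMCP_LOG_LEVEL", "INFO")                # fallback assumed
summarizer_engine = os.environ.get("KAGI_SUMMARIZER_ENGINE", "cecil")  # documented default
print(f"log level: {log_level}, summarizer engine: {summarizer_engine}")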