DuckDuckGo Search MCP Server
A Model Context Protocol (MCP) server that provides web search capabilities through DuckDuckGo, with additional features for content fetching and parsing.
Features
- Web Search: Search DuckDuckGo with advanced rate limiting and result formatting
- Content Fetching: Retrieve and parse webpage content with intelligent text extraction
- Rate Limiting: Built-in protection against rate limits for both search and content fetching
- Error Handling: Comprehensive error handling and logging
- LLM-Friendly Output: Results formatted specifically for large language model consumption
Installation
Installing via Smithery
To install DuckDuckGo Search Server for Claude Desktop automatically via Smithery:
```bash
npx -y @smithery/cli install @nickclyde/duckduckgo-mcp-server --client claude
```
Installing via uv
Install directly from PyPI using uv:
```bash
uv pip install duckduckgo-mcp-server
```
Usage
Running with Claude Desktop
- Download Claude Desktop
- Create or edit your Claude Desktop configuration:
  - On macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
  - On Windows: `%APPDATA%\Claude\claude_desktop_config.json`
- Add the following configuration:
```json
{
  "mcpServers": {
    "ddg-search": {
      "command": "uvx",
      "args": ["duckduckgo-mcp-server"]
    }
  }
}
```
- Restart Claude Desktop
Development
For local development, you can use the MCP CLI:
```bash
# Run with the MCP Inspector
mcp dev server.py

# Install locally for testing with Claude Desktop
mcp install server.py
```
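For orientation, a stripped-down `server.py` built on the MCP Python SDK's FastMCP might look like the sketch below. This is illustrative only: the real server adds rate limiting, result formatting, and error handling on top of this skeleton.

```python
# Illustrative sketch only -- not the repository's actual server.py.
import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("ddg-search")


@mcp.tool()
async def search(query: str, max_results: int = 10) -> str:
    """Search DuckDuckGo and return results formatted for an LLM."""
    # The real implementation queries DuckDuckGo, applies rate limiting,
    # and formats titles, URLs, and snippets into a readable string.
    return f"Results for {query!r} (up to {max_results})"


@mcp.tool()
async def fetch_content(url: str) -> str:
    """Fetch a webpage and return its text content."""
    async with httpx.AsyncClient() as client:
        response = await client.get(url, timeout=30.0, follow_redirects=True)
    # The real implementation extracts and cleans the readable text.
    return response.text


if __name__ == "__main__":
    mcp.run()
```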
Available Tools
1. Search Tool
```python
async def search(query: str, max_results: int = 10) -> str
```
Performs a web search on DuckDuckGo and returns formatted results.
Parameters:
- query: Search query string
- max_results: Maximum number of results to return (default: 10)
Returns: Formatted string containing search results with titles, URLs, and snippets.
2. Content Fetching Tool
```python
async def fetch_content(url: str) -> str
```
Fetches and parses content from a webpage.
Parameters:
- url: The webpage URL to fetch content from
Returns: Cleaned and formatted text content from the webpage.
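Outside of Claude Desktop, both tools can be exercised with the MCP Python SDK's stdio client. The sketch below is illustrative; it launches the server the same way the Claude Desktop configuration above does and assumes the tool names and parameters documented here.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the server over stdio, mirroring the Claude Desktop configuration.
    params = StdioServerParameters(command="uvx", args=["duckduckgo-mcp-server"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Web search: formatted titles, URLs, and snippets.
            results = await session.call_tool(
                "search", {"query": "model context protocol", "max_results": 5}
            )
            print(results.content)

            # Content fetching: cleaned text from the page.
            page = await session.call_tool("fetch_content", {"url": "https://example.com"})
            print(page.content)


asyncio.run(main())
```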
Features in Detail
Rate Limiting
- Search: Limited to 30 requests per minute
- Content Fetching: Limited to 20 requests per minute
- Automatic queue management and wait times
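A limiter like this is typically a small sliding-window check performed before each outgoing request. The sketch below shows the general pattern (illustrative; the server's actual implementation may differ in detail):

```python
import asyncio
import time


class RateLimiter:
    """Allow at most `requests_per_minute` calls within any 60-second window."""

    def __init__(self, requests_per_minute: int):
        self.requests_per_minute = requests_per_minute
        self.timestamps: list[float] = []

    async def acquire(self) -> None:
        now = time.time()
        # Keep only the timestamps that still fall inside the window.
        self.timestamps = [t for t in self.timestamps if now - t < 60]
        if len(self.timestamps) >= self.requests_per_minute:
            # Sleep until the oldest request leaves the window.
            await asyncio.sleep(60 - (now - self.timestamps[0]))
        self.timestamps.append(time.time())


# One limiter per tool, matching the limits above:
# search_limiter = RateLimiter(requests_per_minute=30)
# fetch_limiter = RateLimiter(requests_per_minute=20)
```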
Result Processing
- Removes ads and irrelevant content
- Cleans up DuckDuckGo redirect URLs
- Formats results for optimal LLM consumption
- Truncates long content appropriately
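For example, DuckDuckGo's HTML results commonly link through a `/l/?uddg=<encoded-url>` redirect. The `uddg` parameter name is an observation of those results rather than documented behavior, so treat the helper below as an illustrative sketch:

```python
from urllib.parse import parse_qs, urlparse


def clean_ddg_url(link: str) -> str:
    """Resolve a DuckDuckGo redirect link to its target URL, if present."""
    parsed = urlparse(link)
    if "duckduckgo.com" in parsed.netloc and parsed.path.startswith("/l/"):
        # parse_qs URL-decodes the `uddg` value for us.
        target = parse_qs(parsed.query).get("uddg", [""])[0]
        if target:
            return target
    return link


# clean_ddg_url("https://duckduckgo.com/l/?uddg=https%3A%2F%2Fexample.com%2Fpage")
# -> "https://example.com/page"
```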
Error Handling
- Comprehensive error catching and reporting
- Detailed logging through MCP context
- Graceful degradation on rate limits or timeouts
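In practice that means tool calls catch failures and return a readable message instead of raising, while logging details through the MCP request context. A sketch of the pattern, assuming the SDK's `Context` logging helpers (the messages themselves are illustrative):

```python
import httpx
from mcp.server.fastmcp import Context


async def fetch_content(url: str, ctx: Context) -> str:
    """Fetch a page, degrading gracefully on timeouts and HTTP errors."""
    try:
        async with httpx.AsyncClient() as client:
            response = await client.get(url, timeout=30.0, follow_redirects=True)
            response.raise_for_status()
    except httpx.TimeoutException:
        await ctx.error(f"Request to {url} timed out")
        return "Error: the request timed out. Try again later or use a different URL."
    except httpx.HTTPError as exc:
        await ctx.error(f"Failed to fetch {url}: {exc}")
        return f"Error: could not fetch the page ({exc})."
    # The real tool extracts, cleans, and truncates the readable text here.
    return response.text
```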
Contributing
Issues and pull requests are welcome! Some areas for potential improvement:
- Additional search parameters (region, language, etc.)
- Enhanced content parsing options
- Caching layer for frequently accessed content
- Additional rate limiting strategies
License
This project is licensed under the MIT License.
Related MCPs
Discover similar Model Context Protocol servers
mcp-local-rag
Local RAG server for web search and context injection using Model Context Protocol.
mcp-local-rag is a local server implementing the Model Context Protocol (MCP) to provide retrieval-augmented generation (RAG) capabilities. It performs live web search, extracts relevant context using Google's MediaPipe Text Embedder, and supplies the information to large language models (LLMs) for enhanced, up-to-date responses. The tool is designed for easy local deployment, requiring no external APIs, and is compatible with multiple MCP clients. Security audits are available, and integration is demonstrated across several LLM platforms.
⭐ 89 · MCP · nkapila6/mcp-local-rag
Scrapeless MCP Server
A real-time web integration layer for LLMs and AI agents built on the open MCP standard.
Scrapeless MCP Server is a powerful integration layer enabling large language models, AI agents, and applications to interact with the web in real time. Built on the open Model Context Protocol, it facilitates seamless connections between models like ChatGPT, Claude, and tools such as Cursor to external web capabilities, including Google services, browser automation, and advanced data extraction. The system supports multiple transport modes and is designed to provide dynamic, real-world context to AI workflows. Robust scraping, dynamic content handling, and flexible export formats are core parts of the feature set.
⭐ 57 · MCP · scrapeless-ai/scrapeless-mcp-server
mcp-server-webcrawl
Advanced search and retrieval for web crawler data via MCP.
mcp-server-webcrawl provides an AI-oriented server that enables advanced filtering, analysis, and search over data from various web crawlers. Designed for seamless integration with large language models, it supports boolean search, filtering by resource types and HTTP status, and is compatible with popular crawling formats. It facilitates AI clients, such as Claude Desktop, with prompt routines and customizable workflows, making it easy to manage, query, and analyze archived web content. The tool supports integration with multiple crawler outputs and offers templates for automated routines.
⭐ 32 · MCP · pragmar/mcp-server-webcrawl
Driflyte MCP Server
Bridging AI assistants with deep, topic-aware knowledge from web and code sources.
Driflyte MCP Server acts as a bridge between AI-powered assistants and diverse, topic-aware content sources by exposing a Model Context Protocol (MCP) server. It enables retrieval-augmented generation workflows by crawling, indexing, and serving topic-specific documents from web pages and GitHub repositories. The system is extensible, with planned support for additional knowledge sources, and is designed for easy integration with popular AI tools such as ChatGPT, Claude, and VS Code.
⭐ 9 · MCP · serkan-ozal/driflyte-mcp-server
tavily-search MCP server
A search server that integrates Tavily API with Model Context Protocol tools.
tavily-search MCP server provides an MCP-compliant server to perform search queries using the Tavily API. It returns search results in text format, including AI responses, URLs, and result titles. The server is designed for easy integration with clients like Claude Desktop or Cursor and supports both local and Docker-based deployment. It facilitates AI workflows by offering search functionality as part of a standardized protocol interface.
⭐ 44 · MCP · Tomatio13/mcp-server-tavily
WebScraping.AI MCP Server
MCP server for advanced web scraping and AI-driven data extraction
WebScraping.AI MCP Server implements the Model Context Protocol to provide web data extraction and question answering functionalities. It integrates with WebScraping.AI to offer robust tools for retrieving, rendering, and parsing web content, including structured data and natural language answers from web pages. It supports JavaScript rendering, proxy management, device emulation, and custom extraction configurations, making it suitable for both individual and team deployments in AI-assisted workflows.
⭐ 33 · MCP · webscraping-ai/webscraping-ai-mcp-server