
cloudflare/mcp-server-cloudflare
Connect Cloudflare services to Model Context Protocol (MCP) clients for AI-powered management.
Cloudflare MCP Server
Model Context Protocol (MCP) is a new, standardized protocol for managing context between large language models (LLMs) and external systems. This repository contains several MCP servers that let you connect an MCP client (e.g. Cursor, Claude) to Cloudflare's services and use natural language to accomplish tasks through your Cloudflare account.
These MCP servers allow your MCP client to read configurations from your account, process information, make suggestions based on that data, and even apply those suggested changes for you. All of these actions can happen across Cloudflare's many services, including application development, security, and performance.
The following servers are included in this repository:
| Server Name | Description | Server URL |
| --- | --- | --- |
| Documentation server | Get up-to-date reference information on Cloudflare | https://docs.mcp.cloudflare.com/sse |
| Workers Bindings server | Build Workers applications with storage, AI, and compute primitives | https://bindings.mcp.cloudflare.com/sse |
| Workers Builds server | Get insights and manage your Cloudflare Workers Builds | https://builds.mcp.cloudflare.com/sse |
| Observability server | Debug and get insight into your application's logs and analytics | https://observability.mcp.cloudflare.com/sse |
| Radar server | Get global Internet traffic insights, trends, URL scans, and other utilities | https://radar.mcp.cloudflare.com/sse |
| Container server | Spin up a sandbox development environment | https://containers.mcp.cloudflare.com/sse |
| Browser rendering server | Fetch web pages, convert them to markdown, and take screenshots | https://browser.mcp.cloudflare.com/sse |
| Logpush server | Get quick summaries for Logpush job health | https://logs.mcp.cloudflare.com/sse |
| AI Gateway server | Search your logs and get details about prompts and responses | https://ai-gateway.mcp.cloudflare.com/sse |
| AutoRAG server | List and search documents on your AutoRAGs | https://autorag.mcp.cloudflare.com/sse |
| Audit Logs server | Query audit logs and generate reports for review | https://auditlogs.mcp.cloudflare.com/sse |
| DNS Analytics server | Optimize DNS performance and debug issues based on your current setup | https://dns-analytics.mcp.cloudflare.com/sse |
| Digital Experience Monitoring server | Get quick insight into critical applications for your organization | https://dex.mcp.cloudflare.com/sse |
| Cloudflare One CASB server | Quickly identify any security misconfigurations for SaaS applications to safeguard users & data | https://casb.mcp.cloudflare.com/sse |
| GraphQL server | Get analytics data using Cloudflare's GraphQL API | https://graphql.mcp.cloudflare.com/sse |
Access the remote MCP server from any MCP client
If your MCP client has first-class support for remote MCP servers, it will provide a way to accept the server URL directly within its interface (e.g. the Cloudflare AI Playground).
If your client does not yet support remote MCP servers, you will need to set up its configuration file using mcp-remote (https://www.npmjs.com/package/mcp-remote) to specify which servers your client can access:
```json
{
  "mcpServers": {
    "cloudflare-observability": {
      "command": "npx",
      "args": ["mcp-remote", "https://observability.mcp.cloudflare.com/sse"]
    },
    "cloudflare-bindings": {
      "command": "npx",
      "args": ["mcp-remote", "https://bindings.mcp.cloudflare.com/sse"]
    }
  }
}
```
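For Claude Desktop, for example, this configuration typically lives in `claude_desktop_config.json`; restart the client after saving so it picks up the new servers. The `mcp-remote` package runs a small local proxy that bridges the client's stdio transport to the remote SSE endpoint and handles the authentication flow in your browser.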
Using Cloudflare's MCP servers from the OpenAI Responses API
To use one of Cloudflare's MCP servers with OpenAI's Responses API, you will need to provide the Responses API with a Cloudflare API token that has the scopes (permissions) required for that particular MCP server.
For example, to use the Browser Rendering MCP server with OpenAI, create an API token in the Cloudflare dashboard with the following permissions:
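With a token created, here is a minimal sketch of pointing the Responses API at the Browser Rendering server using the official OpenAI Node SDK. The tool fields follow OpenAI's remote MCP tool type; the model name, `server_label`, and environment variable names below are illustrative assumptions, so check the current OpenAI and Cloudflare docs for exact field names and required scopes.

```typescript
// Sketch: calling Cloudflare's Browser Rendering MCP server via the OpenAI Responses API.
// Assumes the official `openai` Node SDK; model name, server_label, and env var names are illustrative.
import OpenAI from "openai";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function main() {
  const response = await client.responses.create({
    model: "gpt-4.1",
    tools: [
      {
        type: "mcp",
        server_label: "cloudflare-browser",
        server_url: "https://browser.mcp.cloudflare.com/sse",
        // Cloudflare API token with the scopes this MCP server requires (assumed env var name).
        headers: { Authorization: `Bearer ${process.env.CLOUDFLARE_API_TOKEN}` },
        require_approval: "never",
      },
    ],
    input: "Fetch https://example.com, convert it to markdown, and summarize it.",
  });

  console.log(response.output_text);
}

main();
```

The same pattern should apply to any of the servers in the table above: swap the `server_url` and grant the token whatever scopes that server needs.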
Need access to more Cloudflare tools?
We're continuing to add more functionality to this remote MCP server repo. If you'd like to leave feedback, file a bug, or request a feature, please open an issue on this repository.
Troubleshooting
"Claude's response was interrupted ... "
If you see this message, Claude likely hit its context-length limit and stopped mid-reply. This happens most often on servers that trigger many chained tool calls, such as the Observability server.
To reduce the chance of running into this issue:
- Be specific and keep your queries concise.
- If a single request calls multiple tools, try to break it into several smaller requests to keep each response short.
Paid Features
Some features may require a paid Cloudflare Workers plan. Ensure your Cloudflare account has the necessary subscription level for the features you intend to use.
Contributing
Interested in contributing, and running this server locally? See CONTRIBUTING.md to get started.
Related MCPs
Discover similar Model Context Protocol servers

mcpmcp-server
Seamlessly discover, set up, and integrate MCP servers with AI clients.
mcpmcp-server enables users to discover, configure, and connect MCP servers with preferred clients, optimizing AI integration into daily workflows. It supports streamlined setup via JSON configuration, ensuring compatibility with various platforms such as Claude Desktop on macOS. The project simplifies the connection process between AI clients and remote Model Context Protocol servers. Users are directed to an associated homepage for further platform-specific guidance.
- ⭐ 17
- MCP
- glenngillen/mcpmcp-server

mcp
Universal remote MCP server connecting AI clients to productivity tools.
WayStation MCP acts as a remote Model Context Protocol (MCP) server, enabling seamless integration between AI clients like Claude or Cursor and a wide range of productivity applications, such as Notion, Monday, Airtable, Jira, and more. It supports multiple secure connection transports and offers both general and user-specific preauthenticated endpoints. The platform emphasizes ease of integration, OAuth2-based authentication, and broad app compatibility. Users can manage their integrations through a user dashboard, simplifying complex workflow automations for AI-powered productivity.
- ⭐ 27
- MCP
- waystation-ai/mcp

awslabs/mcp
Specialized MCP servers for seamless AWS integration in AI and development environments.
AWS MCP Servers is a suite of specialized servers implementing the open Model Context Protocol (MCP) to bridge large language model (LLM) applications with AWS services, tools, and data sources. It provides a standardized way for AI assistants, IDEs, and developer tools to access up-to-date AWS documentation, perform cloud operations, and automate workflows with context-aware intelligence. Featuring a broad catalog of domain-specific servers, quick installation for popular platforms, and both local and remote deployment options, it enhances cloud-native development, infrastructure management, and workflow automation for AI-driven tools. The project includes Docker, Lambda, and direct integration instructions for environments such as Amazon Q CLI, Cursor, Windsurf, Kiro, and VS Code.
- ⭐ 6,220
- MCP
- awslabs/mcp

OpenMCP
A standard and registry for converting web APIs into MCP servers.
OpenMCP defines a standard for converting various web APIs into servers compatible with the Model Context Protocol (MCP), enabling efficient, token-aware communication with client LLMs. It also provides an open-source registry of compliant servers, allowing clients to access a wide array of external services. The platform supports integration with local and remote hosting environments and offers tools for configuring supported clients, such as Claude desktop and Cursor. Comprehensive guidance is offered for adapting different API formats including REST, gRPC, GraphQL, and more into MCP endpoints.
- ⭐ 252
- MCP
- wegotdocs/open-mcp

mcp-server-js
Enable secure, AI-driven process automation and code execution on YepCode via Model Context Protocol.
YepCode MCP Server acts as a Model Context Protocol (MCP) server that facilitates seamless communication between AI platforms and YepCode’s workflow automation infrastructure. It allows AI assistants and clients to execute code, manage environment variables, and interact with storage through standardized tools. The server can expose YepCode processes directly as MCP tools and supports both hosted and local installations via NPX or Docker. Enterprise-grade security and real-time interaction make it suitable for integrating advanced automation into AI-powered environments.
- ⭐ 31
- MCP
- yepcode/mcp-server-js