
lunar
Unified API gateway for managing and optimizing third-party API consumption and AI agent traffic.
Welcome to Lunar.dev
Lunar.dev is an open-source platform for managing, governing and optimizing third-party API consumption across applications and AI agent workloads at scale.
Consumption Management for the AI Era
As AI agents and autonomous workflows increasingly rely on external APIs, there's a growing need for a mediation layer that acts as a central aggregation point between applications, agents, and the services they depend on.
Lunar.dev provides that layer, serving as a unified API Gateway for AI and delivering:
- Live API Traffic Visibility: Get real-time metrics on latency, errors, cost, and token usage across all outbound traffic, including LLM and agent calls.
- AI-Aware Policy Enforcement: Control tool access, throttle agent actions, and govern agentic traffic with fine-grained rules.
- Advanced Traffic Shaping: Apply rate limits, retries, priority queues, and circuit breakers to manage load and ensure reliability (a conceptual sketch follows this list).
- Cost & Performance Optimization: Identify waste, smooth traffic peaks, and reduce overuse of costly APIs through smart gateway policies.
- Centralized MCP Aggregation: Streamline operations by consolidating multiple MCP servers into a single gateway, enhancing security, observability, and management.
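To make the traffic-shaping policies above concrete, here is a minimal, illustrative Python sketch of two of them: a token-bucket rate limit and retries with exponential backoff. This is not Lunar's configuration or API; the class and function names are hypothetical, and with Lunar.dev these policies are enforced centrally at the gateway rather than reimplemented in every caller.

```python
# Conceptual sketch only: a token-bucket rate limit plus retry-with-backoff,
# the kind of policy a gateway applies centrally for all outbound traffic.
# All names and numbers here are illustrative, not part of Lunar's API.
import time
import random


class TokenBucket:
    """Allow roughly `rate` requests per second, with bursts up to `capacity`."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.updated = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens for the elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.updated) * self.rate)
        self.updated = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False


def call_with_policy(send, bucket: TokenBucket, max_retries: int = 3):
    """Wait for rate-limit capacity, then call send(), retrying with backoff on failure."""
    for attempt in range(max_retries + 1):
        while not bucket.allow():
            time.sleep(1.0 / bucket.rate)  # block until roughly one token refills
        try:
            return send()
        except Exception:
            if attempt == max_retries:
                raise
            # Exponential backoff with a little jitter before the next attempt.
            time.sleep((2 ** attempt) * 0.5 + random.uniform(0, 0.2))


bucket = TokenBucket(rate=5, capacity=10)            # ~5 req/s, bursts of 10
print(call_with_policy(lambda: "fake API response", bucket))
```

The point of a gateway is that this logic lives in one place for every consumer, agents included, instead of being duplicated in each service.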
Choose Your Path
Lunar.dev is composed of two major components:
- Lunar Proxy – our core API gateway and control layer (see the integration sketch below)
- Lunar MCPX – a zero-code aggregator for multiple MCP servers with unified API access
Explore the one that fits your needs—or use both for a full-stack solution.
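As a rough illustration of where Lunar Proxy sits in a request path, the sketch below routes an ordinary outbound HTTP call through a local gateway using standard HTTP proxy settings. The proxy address (http://localhost:8000) and the proxy-based integration shown here are assumptions for the example; refer to the Lunar.dev documentation for the supported integration methods.

```python
# Illustrative only: sending outbound API traffic through an egress gateway.
# The address below is a hypothetical placeholder, not a documented Lunar endpoint.
import requests

LUNAR_PROXY = "http://localhost:8000"  # assumed local gateway address


def fetch_via_gateway(url: str) -> requests.Response:
    # The request looks ordinary to the caller; the gateway mediates it
    # (visibility, policies, traffic shaping) on the way out.
    return requests.get(url, proxies={"http": LUNAR_PROXY, "https": LUNAR_PROXY}, timeout=10)


if __name__ == "__main__":
    resp = fetch_via_gateway("https://api.example.com/v1/items")
    print(resp.status_code)
```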
Open Source at the Core
This project was born out of the need for a more robust, production-ready approach to managing third-party APIs. It remains open-source at its core and free for non-production/personal use. For production environments, we offer advanced features through guided onboarding and platform tiers; visit our website or reach out directly for more information.
Related MCPs
Discover similar Model Context Protocol servers

pluggedin-mcp-proxy
Unified proxy server for Model Context Protocol data exchanges and AI integrations
Aggregates multiple Model Context Protocol (MCP) servers into a single, unified proxy interface, supporting real-time discovery, management, and orchestration of AI model resources, tools, and prompts. Enables seamless interaction between MCP clients such as Claude, Cline, and Cursor, while integrating advanced document search, AI document exchange, and workspace management. Provides flexible transport modes (STDIO and Streamable HTTP), robust authentication, and comprehensive security measures for safe and scalable AI data exchange.
- ⭐ 87
- MCP
- VeriTeknik/pluggedin-mcp-proxy

cloudflare/mcp-server-cloudflare
Connect Cloudflare services to Model Context Protocol (MCP) clients for AI-powered management.
Cloudflare MCP Server enables integration between Cloudflare's suite of services and clients using the Model Context Protocol (MCP). It provides multiple specialized servers that allow AI models to access, analyze, and manage configurations, logs, analytics, and other features across Cloudflare's platform. Users can leverage natural language interfaces in compatible MCP clients to read data, gain insights, and perform automated actions on their Cloudflare accounts. This project aims to streamline the orchestration of security, development, monitoring, and infrastructure tasks through standardized MCP connections.
- ⭐ 2,919
- MCP
- cloudflare/mcp-server-cloudflare

OpenMCP
A standard and registry for converting web APIs into MCP servers.
OpenMCP defines a standard for converting various web APIs into servers compatible with the Model Context Protocol (MCP), enabling efficient, token-aware communication with client LLMs. It also provides an open-source registry of compliant servers, allowing clients to access a wide array of external services. The platform supports integration with local and remote hosting environments and offers tools for configuring supported clients, such as Claude desktop and Cursor. Comprehensive guidance is offered for adapting different API formats including REST, gRPC, GraphQL, and more into MCP endpoints.
- ⭐ 252
- MCP
- wegotdocs/open-mcp

awslabs/mcp
Specialized MCP servers for seamless AWS integration in AI and development environments.
AWS MCP Servers is a suite of specialized servers implementing the open Model Context Protocol (MCP) to bridge large language model (LLM) applications with AWS services, tools, and data sources. It provides a standardized way for AI assistants, IDEs, and developer tools to access up-to-date AWS documentation, perform cloud operations, and automate workflows with context-aware intelligence. Featuring a broad catalog of domain-specific servers, quick installation for popular platforms, and both local and remote deployment options, it enhances cloud-native development, infrastructure management, and workflow automation for AI-driven tools. The project includes Docker, Lambda, and direct integration instructions for environments such as Amazon Q CLI, Cursor, Windsurf, Kiro, and VS Code.
- ⭐ 6,220
- MCP
- awslabs/mcp

1mcp-app/agent
A unified server that aggregates and manages multiple Model Context Protocol servers.
1MCP Agent provides a single, unified interface that aggregates multiple Model Context Protocol (MCP) servers, enabling seamless integration and management of external tools for AI assistants. It acts as a proxy, managing server configuration, authentication, health monitoring, and dynamic server control with features like asynchronous loading, tag-based filtering, and advanced security options. Compatible with popular AI development environments, it simplifies setup by reducing redundant server instances and resource usage. Users can configure, monitor, and scale model tool integrations across various AI clients through easy CLI commands or Docker deployment.
- ⭐ 96
- MCP
- 1mcp-app/agent

magg
Meta-MCP aggregator and manager for LLM capability extension.
Magg is a server that implements the Model Context Protocol (MCP), acting as a central aggregator and proxy for multiple MCP servers. It enables Large Language Models (LLMs) to dynamically discover, add, configure, and manage external tools at runtime. By aggregating tools from different MCP servers under unified namespaces, it streamlines capability management and introduces features such as configuration persistence, authentication, and real-time notifications. Magg offers both command-line and Docker deployment, with support for HTTP, stdio, and in-memory transport.
- ⭐ 62
- MCP
- sitbon/magg