lunar

Unified API gateway for managing and optimizing third-party API consumption and AI agent traffic.

Stars 273
Forks 18
Watchers 273
Issues 3
Lunar.dev is an open-source platform designed to manage, govern, and optimize third-party API usage across applications and AI agent workloads. It offers a unified gateway with real-time traffic monitoring, AI-aware policy enforcement, and sophisticated traffic shaping. The platform consolidates and aggregates multiple Model Context Protocol (MCP) servers, providing centralized visibility, policy control, and cost optimization. Lunar.dev includes core components for proxying and for zero-code MCP aggregation with unified API access.

Key Features

Real-time API traffic metrics and visibility
AI-aware policy enforcement and governance
Advanced traffic shaping with rate limiting, retries, and circuit breakers (see the sketch after this list)
Centralized aggregation of multiple MCP servers
Unified API access for agents and applications
Cost optimization and resource management
Fine-grained rule-based access control
Live monitoring of latency, errors, and token usage
Priority queues for traffic management
Open source with production-tier extensibility
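
To make the traffic-shaping terms in the list above more concrete, here is a minimal, self-contained Go sketch of two of those policies: retries with exponential backoff and a consecutive-failure circuit breaker. All names and thresholds are invented for illustration; this is not Lunar.dev's implementation or configuration API, which applies such policies centrally at the gateway rather than in each client.

```go
package main

import (
	"errors"
	"fmt"
	"math/rand"
	"time"
)

// ErrCircuitOpen is returned when the breaker refuses to issue calls.
var ErrCircuitOpen = errors.New("circuit open: upstream marked unhealthy")

// breaker is a toy circuit breaker: it opens after maxFailures consecutive
// failed calls and stays open for cooldown before allowing traffic again.
type breaker struct {
	failures    int
	maxFailures int
	openedAt    time.Time
	cooldown    time.Duration
}

func (b *breaker) call(fn func() error) error {
	if b.failures >= b.maxFailures && time.Since(b.openedAt) < b.cooldown {
		return ErrCircuitOpen
	}
	// Retry with exponential backoff: 100ms, 200ms, 400ms.
	var err error
	for attempt := 0; attempt < 3; attempt++ {
		if err = fn(); err == nil {
			b.failures = 0
			return nil
		}
		time.Sleep(time.Duration(100<<attempt) * time.Millisecond)
	}
	b.failures++
	if b.failures >= b.maxFailures {
		b.openedAt = time.Now()
	}
	return err
}

func main() {
	b := &breaker{maxFailures: 2, cooldown: 5 * time.Second}
	flaky := func() error { // stand-in for a third-party API call
		if rand.Intn(2) == 0 {
			return errors.New("upstream 503")
		}
		return nil
	}
	for i := 0; i < 5; i++ {
		fmt.Println("call", i, "->", b.call(flaky))
	}
}
```

The point of a gateway is that policies like these are declared once and enforced for every consumer, instead of being re-implemented per client as in this sketch.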

Use Cases

Aggregating multiple MCP servers into a single management gateway
Controlling and governing API usage by AI agents and autonomous workflows
Enforcing access policies and rate limits on outbound API calls
Monitoring cost, performance, and reliability of third-party APIs
Reducing wasteful API usage with smart gateway policies
Ensuring secure and observable agentic traffic across an organization
Optimizing traffic flow to third-party or LLM APIs during high-load periods
Centralizing audit and observability for API requests from diverse sources
Preventing runaway costs with circuit breakers and prioritization
Facilitating zero-code integration for managing agent and application outbound traffic

README

Welcome to Lunar.dev

Lunar.dev is an open-source platform for managing, governing and optimizing third-party API consumption across applications and AI agent workloads at scale.

Consumption Management for the AI Era

As AI agents and autonomous workflows increasingly rely on external APIs, there's a growing need for a mediation layer that acts as a central aggregation point between applications, agents, and the services they depend on.

Lunar.dev provides that layer—serving as a unified API Gateway for AI, delivering:

  • Live API Traffic Visibility: Get real-time metrics on latency, errors, cost, and token usage across all outbound traffic, including LLM and agent calls.
  • AI-Aware Policy Enforcement: Control tool access, throttle agent actions, and govern agentic traffic with fine-grained rules.
  • Advanced Traffic Shaping: Apply rate limits, retries, priority queues, and circuit breakers to manage load and ensure reliability.
  • Cost & Performance Optimization: Identify waste, smooth traffic peaks, and reduce overuse of costly APIs through smart gateway policies.
  • Centralized MCP Aggregation: Streamline operations by consolidating multiple MCP servers into a single gateway, enhancing security, observability, and management.
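
As a rough sketch of what the gateway model looks like from the application side, the Go snippet below routes an outbound API call through a forward proxy so that the policies above can be applied centrally. The gateway address (localhost:8000) and the target URL are placeholder assumptions rather than Lunar-specific values; consult the documentation for actual deployment and configuration details.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/url"
	"time"
)

func main() {
	// Hypothetical gateway address; adjust to your actual deployment.
	gateway, err := url.Parse("http://localhost:8000")
	if err != nil {
		panic(err)
	}

	// Route all outbound third-party API calls through the gateway so that
	// rate limits, retries, and metrics are handled centrally rather than
	// per application.
	client := &http.Client{
		Transport: &http.Transport{Proxy: http.ProxyURL(gateway)},
		Timeout:   10 * time.Second,
	}

	resp, err := client.Get("https://api.example.com/v1/items") // placeholder target API
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status, len(body), "bytes")
}
```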

Choose Your Path

Lunar.dev is composed of two major components:

  • Lunar Proxy – our core API gateway and control layer
  • Lunar MCPX – a zero-code aggregator for multiple MCP servers with unified API access

Explore the one that fits your needs—or use both for a full-stack solution.

Open Source at the Core

This project was born out of the need for a more robust, production-ready approach to managing third-party APIs. It remains open-source at its core and free for non-production/personal use. For production environments, we offer advanced features through guided onboarding and platform tiers; visit our website or reach out directly for more information.

Repository Owner

TheLunarCompany

Organization

Repository Details

Language Go
Default Branch main
Size 18,459 KB
Contributors 29
License MIT License
MCP Verified Sep 1, 2025

Programming Languages

Go 47.55%
TypeScript 32.66%
Python 9.09%
Gherkin 2.86%
Java 2.14%
Shell 1.54%
Lua 1.5%
JavaScript 1.35%
Dockerfile 0.51%
HCL 0.49%
CSS 0.24%
Smarty 0.06%
HTML 0.01%

Topics

api api-consumer api-consumption api-proxy caching mcp mcp-gateway mcp-server mcp-servers priority-queue quota rate-limit resilience throttling visibility

Related MCPs

Discover similar Model Context Protocol servers

  • pluggedin-mcp-proxy

    Unified proxy server for Model Context Protocol data exchanges and AI integrations

    Aggregates multiple Model Context Protocol (MCP) servers into a single, unified proxy interface, supporting real-time discovery, management, and orchestration of AI model resources, tools, and prompts. Enables seamless interaction between MCP clients such as Claude, Cline, and Cursor, while integrating advanced document search, AI document exchange, and workspace management. Provides flexible transport modes (STDIO and Streamable HTTP), robust authentication, and comprehensive security measures for safe and scalable AI data exchange.

    • 87
    • MCP
    • VeriTeknik/pluggedin-mcp-proxy
  • cloudflare/mcp-server-cloudflare

    Connect Cloudflare services to Model Context Protocol (MCP) clients for AI-powered management.

    Cloudflare MCP Server enables integration between Cloudflare's suite of services and clients using the Model Context Protocol (MCP). It provides multiple specialized servers that allow AI models to access, analyze, and manage configurations, logs, analytics, and other features across Cloudflare's platform. Users can leverage natural language interfaces in compatible MCP clients to read data, gain insights, and perform automated actions on their Cloudflare accounts. This project aims to streamline the orchestration of security, development, monitoring, and infrastructure tasks through standardized MCP connections.

    • 2,919
    • MCP
    • cloudflare/mcp-server-cloudflare
  • OpenMCP

    A standard and registry for converting web APIs into MCP servers.

    OpenMCP defines a standard for converting various web APIs into servers compatible with the Model Context Protocol (MCP), enabling efficient, token-aware communication with client LLMs. It also provides an open-source registry of compliant servers, allowing clients to access a wide array of external services. The platform supports integration with local and remote hosting environments and offers tools for configuring supported clients, such as Claude desktop and Cursor. Comprehensive guidance is offered for adapting different API formats including REST, gRPC, GraphQL, and more into MCP endpoints.

    • 252
    • MCP
    • wegotdocs/open-mcp
  • awslabs/mcp

    Specialized MCP servers for seamless AWS integration in AI and development environments.

    AWS MCP Servers is a suite of specialized servers implementing the open Model Context Protocol (MCP) to bridge large language model (LLM) applications with AWS services, tools, and data sources. It provides a standardized way for AI assistants, IDEs, and developer tools to access up-to-date AWS documentation, perform cloud operations, and automate workflows with context-aware intelligence. Featuring a broad catalog of domain-specific servers, quick installation for popular platforms, and both local and remote deployment options, it enhances cloud-native development, infrastructure management, and workflow automation for AI-driven tools. The project includes Docker, Lambda, and direct integration instructions for environments such as Amazon Q CLI, Cursor, Windsurf, Kiro, and VS Code.

    • 6,220
    • MCP
    • awslabs/mcp
  • 1mcp-app/agent

    A unified server that aggregates and manages multiple Model Context Protocol servers.

    1MCP Agent provides a single, unified interface that aggregates multiple Model Context Protocol (MCP) servers, enabling seamless integration and management of external tools for AI assistants. It acts as a proxy, managing server configuration, authentication, health monitoring, and dynamic server control with features like asynchronous loading, tag-based filtering, and advanced security options. Compatible with popular AI development environments, it simplifies setup by reducing redundant server instances and resource usage. Users can configure, monitor, and scale model tool integrations across various AI clients through easy CLI commands or Docker deployment.

    • 96
    • MCP
    • 1mcp-app/agent
  • magg

    Meta-MCP aggregator and manager for LLM capability extension.

    Magg is a server that implements the Model Context Protocol (MCP), acting as a central aggregator and proxy for multiple MCP servers. It enables Large Language Models (LLMs) to dynamically discover, add, configure, and manage external tools at runtime. By aggregating tools from different MCP servers under unified namespaces, it streamlines capability management and introduces features such as configuration persistence, authentication, and real-time notifications. Magg offers both command-line and Docker deployment, with support for HTTP, stdio, and in-memory transport.

    • 62
    • MCP
    • sitbon/magg