McGravity
Unified load balancer and proxy for multiple MCP servers
About
McGravity is a tool that connects multiple MCP (Model Context Protocol) servers into one unified service. It lets clients reuse a single MCP endpoint while the connections to the underlying MCP servers scale out horizontally.
The current version works as a basic CLI tool, but McGravity will grow to become a full-featured proxy for MCP servers - like Nginx but for modern Gen AI tools and servers.
Why McGravity?
Without McGravity:
┌─────────┐     ┌─────────┐
│ Client  │────▶│MCP      │
│         │     │Server 1 │
└─────────┘     └─────────┘
     │
     │          ┌─────────┐
     └─────────▶│MCP      │
                │Server 2 │
                └─────────┘
With McGravity:
┌─────────┐     ┌─────────┐     ┌─────────┐
│ Client  │────▶│McGravity│────▶│MCP      │
│         │     │         │     │Server 1 │
└─────────┘     └─────────┘     └─────────┘
                     │
                     │          ┌─────────┐
                     └─────────▶│MCP      │
                                │Server 2 │
                                └─────────┘
McGravity solves these problems:
- Connect to multiple MCP servers through one endpoint
- Balance load between MCP servers
- Provide a single point of access for your applications
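As an illustration of the balancing idea, here is a minimal round-robin selection sketch in TypeScript. This is an assumption for explanation only: the `RoundRobinBalancer` class and `Upstream` type are hypothetical names, not McGravity's actual implementation, and the real project may use a different strategy.

```typescript
// Hypothetical sketch: one way a proxy like McGravity could rotate
// requests across its configured upstream MCP servers.
type Upstream = { url: string };

class RoundRobinBalancer {
  private next = 0;

  constructor(private upstreams: Upstream[]) {
    if (upstreams.length === 0) {
      throw new Error("at least one upstream MCP server is required");
    }
  }

  // Return the next upstream in rotation, wrapping around at the end.
  pick(): Upstream {
    const chosen = this.upstreams[this.next];
    this.next = (this.next + 1) % this.upstreams.length;
    return chosen;
  }
}

const balancer = new RoundRobinBalancer([
  { url: "http://mcp1.example.com" },
  { url: "http://mcp2.example.com" },
]);

console.log(balancer.pick().url); // http://mcp1.example.com
console.log(balancer.pick().url); // http://mcp2.example.com
console.log(balancer.pick().url); // http://mcp1.example.com
```

Round-robin keeps no per-server state beyond an index, which is why a proxy can add or remove upstreams cheaply; weighted or health-aware strategies build on the same selection hook.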
Installation
# Install dependencies
bun install
# Build the project into a single executable
bun build src/index.ts --compile --outfile mcgravity
Docker
McGravity is available on Docker Hub: tigranbs/mcgravity.
docker pull tigranbs/mcgravity
# Basic usage
docker run -p 3001:3001 tigranbs/mcgravity http://mcp1.example.com http://mcp2.example.com
# With custom host and port
docker run -p 4000:4000 tigranbs/mcgravity --host 0.0.0.0 --port 4000 http://mcp1.example.com
Usage
Basic command:
./mcgravity <mcp-server-address1> <mcp-server-address2> ...
With options:
./mcgravity --host localhost --port 3001 http://mcp1.example.com http://mcp2.example.com
Using configuration file:
./mcgravity --config config.yaml
Options
- --host <host>: Host to bind the server to (default: localhost)
- --port <port>: Port to bind the server to (default: 3001)
- --config <path>: Path to the config file (default: config.yaml)
- --mcp-version <version>: Version of the MCP server (default: 1.0.0)
- --mcp-name <name>: Name of the MCP server (default: mcgravity)
- --help: Show help information
Configuration
McGravity can be configured using a YAML file. See config.example.yaml for a sample configuration:
name: mcgravity
version: 1.0.0
description: A simple MCP server
servers:
  echo-server:
    url: http://localhost:3000/sse
    name: echo-server
    version: 1.0.0
    description: A simple echo server
    tags:
      - echo
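The shape of that file can be described in TypeScript as below. This is a sketch inferred only from config.example.yaml above: the `McGravityConfig` and `ServerEntry` type names and the `validateConfig` helper are hypothetical, and which fields are actually optional is an assumption.

```typescript
// Hypothetical types mirroring the config.example.yaml structure,
// plus a minimal validator for an already-parsed YAML object.
interface ServerEntry {
  url: string;
  name: string;
  version: string;
  description?: string;
  tags?: string[];
}

interface McGravityConfig {
  name: string;
  version: string;
  description?: string;
  servers: Record<string, ServerEntry>;
}

function validateConfig(raw: unknown): McGravityConfig {
  const cfg = raw as McGravityConfig;
  if (!cfg || typeof cfg.name !== "string" || typeof cfg.version !== "string") {
    throw new Error("config must define name and version");
  }
  if (!cfg.servers || Object.keys(cfg.servers).length === 0) {
    throw new Error("config must define at least one server");
  }
  for (const [key, entry] of Object.entries(cfg.servers)) {
    if (!entry || typeof entry.url !== "string") {
      throw new Error(`server "${key}" is missing a url`);
    }
  }
  return cfg;
}

const cfg = validateConfig({
  name: "mcgravity",
  version: "1.0.0",
  servers: {
    "echo-server": {
      url: "http://localhost:3000/sse",
      name: "echo-server",
      version: "1.0.0",
    },
  },
});
console.log(Object.keys(cfg.servers)); // logs the configured server keys
```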
You can run the included echo server example for testing:
# Start the echo server first
bun examples/echo-server.ts
# Then start McGravity pointing to the echo server
./mcgravity --config config.yaml
Examples
Start McGravity with default settings:
./mcgravity http://mcp1.example.com http://mcp2.example.com
Specify host and port:
./mcgravity --host 0.0.0.0 --port 4000 http://mcp1.example.com http://mcp2.example.com
Running Tests
To run all tests:
bun test
To run integration tests only:
bun run test:integration
Integration Tests
The integration tests verify that McGravity can:
- Connect to an MCP server (the example echo server)
- Correctly proxy capabilities from the target MCP server
- Pass requests from clients to the target MCP server and return responses
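The capability-proxying step above can be sketched as a merge of the tool lists each upstream advertises. This is an illustrative assumption, not McGravity's actual code: the `mergeCapabilities` function and `Tool` type are hypothetical, and the first-wins deduplication policy is a simplification.

```typescript
// Hypothetical sketch: merging tool lists advertised by several
// upstream MCP servers into one list presented to the client.
type Tool = { name: string; description: string };

function mergeCapabilities(upstreams: Tool[][]): Tool[] {
  const seen = new Map<string, Tool>();
  for (const tools of upstreams) {
    for (const tool of tools) {
      // First upstream advertising a tool name wins; later duplicates dropped.
      if (!seen.has(tool.name)) seen.set(tool.name, tool);
    }
  }
  return [...seen.values()];
}

const merged = mergeCapabilities([
  [{ name: "echo", description: "echo from server 1" }],
  [
    { name: "echo", description: "echo from server 2" },
    { name: "sum", description: "add numbers" },
  ],
]);
console.log(merged.map((t) => t.name)); // tool names after deduplication
```

A real proxy would also have to route each tool call back to the upstream that advertised it, which is why keeping the name-to-server mapping matters beyond simple deduplication.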
For more details about the test suite, see the test README.
The tests are automatically run in GitHub Actions CI on push and PR events.
Future Plans
McGravity will expand to include:
- Web interface for monitoring
- Advanced load balancing
- MCP server health checks
- Authentication and access control
- Plugin system for custom integrations
Development
TypeScript and Code Style
This project uses:
- TypeScript with Bun runtime
- ESLint for code linting with TypeScript-specific rules
- Prettier for code formatting
The configuration is optimized for Bun with appropriate TypeScript settings for the runtime environment.
Run the following commands:
# Format code with Prettier
bun run format
# Check if code is properly formatted
bun run format:check
# Lint code with ESLint
bun run lint
# Fix auto-fixable linting issues
bun run lint:fix
VS Code is configured to format code on save and provide linting information when the recommended extensions are installed.
Contributing
Contributions are welcome! Feel free to open issues or submit pull requests.