
# 🧲 Magg - The MCP Aggregator
A Model Context Protocol server that manages, aggregates, and proxies other MCP servers, enabling LLMs to dynamically extend their own capabilities.
## What is Magg?
Magg is a meta-MCP server that acts as a central hub for managing multiple MCP servers. It provides tools that allow LLMs to:
- Search for new MCP servers and discover setup instructions
- Add and configure MCP servers dynamically
- Enable/disable servers on demand
- Aggregate tools from multiple servers under unified prefixes
- Persist configurations across sessions
Think of Magg as a "package manager for LLM tools" - it lets AI assistants install and manage their own capabilities at runtime.
## Features
- Self-Service Tool Management: LLMs can search for and add new MCP servers without human intervention.
- Dynamic Configuration Reloading: Automatically detects and applies config changes without restarting.
- Automatic Tool Proxying: Tools from added servers are automatically exposed with configurable prefixes.
- ProxyMCP Tool: A built-in tool that proxies the MCP protocol to itself, for clients that don't support notifications or dynamic tool updates (which is most of them currently).
- Smart Configuration: Uses MCP sampling to intelligently configure servers from just a URL.
- Persistent Configuration: Maintains server configurations in `.magg/config.json`.
- Multiple Transport Support: Works with stdio, HTTP, and in-memory transports.
- Bearer Token Authentication: Optional RSA-based JWT authentication for secure HTTP access.
- Docker Support: Pre-built images for production, staging, and development workflows.
- Health Monitoring: Built-in `magg_status` and `magg_check` tools for server health checks.
- Real-time Messaging: Full support for MCP notifications and messages - receive tool/resource updates and progress notifications from backend servers.
- Python 3.12+ Support: Fully compatible with Python 3.12 and 3.13.
- Kit Management: Bundle related MCP servers into kits for easy loading/unloading as a group.
- MBro CLI: Included MCP Browser for interactive exploration and management of MCP servers, with script support for automation.
## Installation

### Prerequisites
- Python 3.12 or higher (3.13+ recommended)
- `uv` (recommended) - Install from astral.sh/uv
### Quick Install (Recommended)

The easiest way to install Magg is as a tool using `uv`:

```bash
# Install Magg as a tool
uv tool install magg

# Run with stdio transport (for Claude Desktop, Cline, etc.)
magg serve

# Run with HTTP transport (for system-wide access)
magg serve --http
```
### Alternative: Run Directly from GitHub

You can also run Magg directly from GitHub without installing:

```bash
# Run with stdio transport
uvx --from git+https://github.com/sitbon/magg.git magg

# Run with HTTP transport
uvx --from git+https://github.com/sitbon/magg.git magg serve --http
```
### Local Development

For development, clone the repository and install in editable mode:

```bash
# Clone the repository
git clone https://github.com/sitbon/magg.git
cd magg

# Install in development mode with dev dependencies
uv sync --dev

# Or with poetry
poetry install --with dev

# Run the CLI
magg --help
```
### Docker

Magg is available as pre-built Docker images from GitHub Container Registry:

```bash
# Run production image (WARNING log level)
docker run -p 8000:8000 ghcr.io/sitbon/magg:latest

# Run with authentication (mount or set private key)
docker run -p 8000:8000 \
  -v ~/.ssh/magg:/home/magg/.ssh/magg:ro \
  ghcr.io/sitbon/magg:latest

# Or with environment variable
docker run -p 8000:8000 \
  -e MAGG_PRIVATE_KEY="$(cat ~/.ssh/magg/magg.key)" \
  ghcr.io/sitbon/magg:latest

# Run beta image (INFO log level)
docker run -p 8000:8000 ghcr.io/sitbon/magg:beta

# Run with custom config directory
docker run -p 8000:8000 \
  -v /path/to/config:/home/magg/.magg \
  ghcr.io/sitbon/magg:latest
```
### Docker Image Strategy

Magg uses a multi-stage Docker build with three target stages:

- `pro` (Production): Minimal image with WARNING log level, suitable for production deployments
- `pre` (Pre-production): Same as production but with INFO log level for staging/testing (available but not published)
- `dev` (Development): Includes development dependencies and DEBUG logging for troubleshooting
Images are automatically published to GitHub Container Registry with the following tags:
- Version tags (from main branch): `1.2.3`, `1.2`, `dev`, `1.2-dev`, `1.2-dev-py3.12`, etc.
- Branch tags (from beta branch): `beta`, `beta-dev`
- Python-specific dev tags: `beta-dev-py3.12`, `beta-dev-py3.13`, etc.
### Docker Compose

For easier management, use Docker Compose:

```bash
# Clone the repository
git clone https://github.com/sitbon/magg.git
cd magg

# Run production version
docker compose up magg

# Run staging version (on port 8001)
docker compose up magg-beta

# Run development version (on port 8008)
# This uses ./.magg/config.json for configuration
docker compose up magg-dev

# Build and run with custom registry
REGISTRY=my.registry.com docker compose build
REGISTRY=my.registry.com docker compose push
```
See `compose.yaml` and `.env.example` for configuration options.
## Usage

### Running Magg
Magg can run in three modes:
1. Stdio Mode (default) - For integration with Claude Desktop, Cline, Cursor, etc.:

   ```bash
   magg serve
   ```

2. HTTP Mode - For system-wide access or web integrations:

   ```bash
   magg serve --http --port 8000
   ```

3. Hybrid Mode - Both stdio and HTTP simultaneously:

   ```bash
   magg serve --hybrid
   magg serve --hybrid --port 8080  # Custom port
   ```

   This is particularly useful when you want to use Magg through an MCP client while also allowing HTTP access. For example:

   With Claude Code:

   ```bash
   # Configure Claude Code to use Magg in hybrid mode
   claude mcp add magg -- magg serve --hybrid --port 42000
   ```

   With mbro:

   ```bash
   # mbro hosts Magg and connects via stdio
   mbro connect magg "magg serve --hybrid --port 8080"

   # Other mbro instances can connect via HTTP
   mbro connect magg http://localhost:8080
   ```
### Available Tools
Once Magg is running, it exposes the following tools to LLMs:
- `magg_list_servers` - List all configured MCP servers
- `magg_add_server` - Add a new MCP server
- `magg_remove_server` - Remove a server
- `magg_enable_server` / `magg_disable_server` - Toggle server availability
- `magg_search_servers` - Search for MCP servers online
- `magg_list_tools` - List all available tools from all servers
- `magg_smart_configure` - Intelligently configure a server from a URL
- `magg_analyze_servers` - Analyze configured servers and suggest improvements
- `magg_status` - Get server and tool statistics
- `magg_check` - Health check servers with repair actions (report/remount/unmount/disable)
- `magg_reload_config` - Reload configuration from disk and apply changes
- `magg_load_kit` - Load a kit and its servers into the configuration
- `magg_unload_kit` - Unload a kit and optionally its servers from the configuration
- `magg_list_kits` - List all available kits with their status
- `magg_kit_info` - Get detailed information about a specific kit
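These tools can also be called programmatically through `MaggClient`, which mirrors the FastMCP client interface used elsewhere in this README. The sketch below assumes a Magg instance serving HTTP at `http://localhost:8000/mcp` and that the management tools take the argument names implied above; check each tool's input schema for the authoritative parameters.

```python
import asyncio

from magg.client import MaggClient


async def main():
    async with MaggClient("http://localhost:8000/mcp") as client:
        # Everything Magg exposes, including proxied tools from added servers.
        tools = await client.list_tools()
        print(sorted(t.name for t in tools))

        # Call one of Magg's own management tools (no arguments needed here).
        status = await client.call_tool("magg_status", {})
        print(status)


asyncio.run(main())
```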
### Quick Inspection with MBro

Magg includes the `mbro` (MCP Browser) CLI tool for interactive exploration. A unique feature is the ability to connect to Magg in stdio mode for quick inspection:
```bash
# Connect mbro to a Magg instance via stdio (no HTTP server needed)
mbro connect local-magg magg serve

# Now inspect your Magg setup from the MCP client perspective
mbro:local-magg> call magg_status
mbro:local-magg> call magg_list_servers
```
MBro also supports:
- Scripts: Create `.mbro` files with commands for automation
- Shell-style arguments: Use `key=value` syntax instead of JSON
- Tab completion: Rich parameter hints after connecting
See the MBro Documentation for details.
## Authentication
Magg supports optional bearer token authentication to secure access:
### Quick Start
1. Initialize authentication (creates RSA keypair):

   ```bash
   magg auth init
   ```

2. Generate a JWT token for clients:

   ```bash
   # Generate token (displays on screen)
   magg auth token

   # Export as environment variable
   export MAGG_JWT=$(magg auth token -q)
   ```

3. Connect with authentication:

   - Using `MaggClient` (auto-loads from `MAGG_JWT`):

     ```python
     from magg.client import MaggClient

     async def main():
         async with MaggClient("http://localhost:8000/mcp") as client:
             tools = await client.list_tools()
     ```

   - Using FastMCP with explicit token:

     ```python
     from fastmcp import Client
     from fastmcp.client import BearerAuth

     jwt_token = "your-jwt-token-here"

     async with Client("http://localhost:8000/mcp", auth=BearerAuth(jwt_token)) as client:
         tools = await client.list_tools()
     ```
### Key Management

- Keys are stored in `~/.ssh/magg/` by default
- Private key can be set via the `MAGG_PRIVATE_KEY` environment variable
- To disable auth, remove the keys or set a non-existent `key_path` in `.magg/auth.json`
### Authentication Commands

- `magg auth init` - Initialize authentication (generates RSA keypair)
- `magg auth status` - Check authentication configuration
- `magg auth token` - Generate JWT token
- `magg auth public-key` - Display public key (for verification)
- `magg auth private-key` - Display private key (for backup)
See `examples/authentication.py` for more usage patterns.
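Outside of Magg itself, a token from `magg auth token` can be sanity-checked with a generic JWT library against the output of `magg auth public-key`. This is only a sketch: it assumes RS256 signing (typical for RSA keypairs), uses a hypothetical key file path, and skips audience/issuer validation since those claims are not documented here.

```python
import jwt  # PyJWT

# Contents of `magg auth public-key`, saved to a file (hypothetical path).
public_key = open("magg_public.pem").read()

# Output of `magg auth token`.
token = "your-jwt-token-here"

# Verify the RSA signature; audience/issuer checks are intentionally skipped.
claims = jwt.decode(
    token,
    public_key,
    algorithms=["RS256"],
    options={"verify_aud": False},
)
print(claims)
```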
## Configuration

Magg stores its configuration in `.magg/config.json` in your current working directory. This allows for project-specific tool configurations.
### Dynamic Configuration Reloading
Magg supports automatic configuration reloading without requiring a restart:
- Automatic file watching: Detects changes to `config.json` and reloads automatically (uses watchdog when available)
- SIGHUP signal: Send `kill -HUP <pid>` to trigger an immediate reload on Unix-like systems (see the sketch below)
- MCP tool: Use the `magg_reload_config` tool from any MCP client
- Smart transitions: Only affected servers are restarted during reload
Configuration reload is enabled by default. You can control it with:
- `MAGG_AUTO_RELOAD=false` - Disable automatic reloading
- `MAGG_RELOAD_POLL_INTERVAL=5.0` - Set polling interval in seconds (when watchdog unavailable)
See Configuration Reload Documentation for detailed information.
### Environment Variables
Magg supports several environment variables for configuration:
- `MAGG_CONFIG_PATH` - Path to config file (default: `.magg/config.json`)
- `MAGG_LOG_LEVEL` - Logging level: DEBUG, INFO, WARNING, ERROR, CRITICAL (default: INFO)
- `MAGG_STDERR_SHOW=1` - Show stderr output from subprocess MCP servers (default: suppressed)
- `MAGG_AUTO_RELOAD` - Enable/disable config auto-reload (default: true)
- `MAGG_RELOAD_POLL_INTERVAL` - Config polling interval in seconds (default: 1.0)
- `MAGG_READ_ONLY=true` - Run in read-only mode
- `MAGG_SELF_PREFIX` - Prefix for Magg tools (default: "magg"). Tools will be named as `{prefix}{sep}{tool}` (e.g., `magg_list_servers`)
- `MAGG_PREFIX_SEP` - Separator between prefix and tool name (default: "_")
Example configuration:
```json
{
  "servers": {
    "calculator": {
      "name": "calculator",
      "source": "https://github.com/executeautomation/calculator-mcp",
      "command": "npx @executeautomation/calculator-mcp",
      "prefix": "calc",
      "enabled": true
    }
  }
}
```
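Because the config is plain JSON, it is easy to inspect from scripts. A small sketch that assumes the structure shown above (a top-level `servers` mapping) and the default config location:

```python
import json
from pathlib import Path

# Default project-local config path documented above.
config = json.loads(Path(".magg/config.json").read_text())

# Print each configured server with its tool prefix and enabled state.
for name, server in config.get("servers", {}).items():
    state = "enabled" if server.get("enabled", True) else "disabled"
    print(f"{name}: prefix={server.get('prefix')!r} ({state})")
```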
### Adding Servers
Servers can be added in several ways:
1. Using the LLM (recommended):

   "Add the Playwright MCP server"
   "Search for and add a calculator tool"

2. Manual configuration via `magg_add_server` (a programmatic sketch follows this list):

   ```
   name: playwright
   url: https://github.com/microsoft/playwright-mcp
   command: npx @playwright/mcp@latest
   prefix: pw
   ```

3. Direct config editing: Edit `.magg/config.json` directly
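The same addition can be scripted against a running Magg instance. This is a sketch only: the argument names mirror the examples above (the config file uses `source` where the manual example says `url`), so verify them against `magg_add_server`'s input schema before relying on it.

```python
import asyncio

from magg.client import MaggClient


async def add_playwright():
    async with MaggClient("http://localhost:8000/mcp") as client:
        # Argument names are assumptions taken from the examples in this README.
        result = await client.call_tool("magg_add_server", {
            "name": "playwright",
            "source": "https://github.com/microsoft/playwright-mcp",
            "command": "npx @playwright/mcp@latest",
            "prefix": "pw",
        })
        print(result)


asyncio.run(add_playwright())
```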
## Real-time Notifications with MaggClient

The `MaggClient` now supports real-time notifications from backend MCP servers:
```python
from magg import MaggClient, MaggMessageHandler

# Using callbacks
handler = MaggMessageHandler(
    on_tool_list_changed=lambda n: print("Tools changed!"),
    on_progress=lambda n: print(f"Progress: {n.params.progress}")
)

async with MaggClient("http://localhost:8000/mcp", message_handler=handler) as client:
    # Client will receive notifications while connected
    tools = await client.list_tools()
```
See Messaging Documentation for advanced usage including custom message handlers.
## Kit Management
Magg supports organizing related MCP servers into "kits" - bundles that can be loaded and unloaded as a group:
```bash
# List available kits
magg kit list

# Load a kit (adds all its servers)
magg kit load web-tools

# Unload a kit (removes servers only in that kit)
magg kit unload web-tools

# Get information about a kit
magg kit info web-tools
```
You can also manage kits programmatically through Magg's tools when connected via an MCP client:
- `magg_list_kits` - List all available kits
- `magg_load_kit` - Load a kit and its servers
- `magg_unload_kit` - Unload a kit
- `magg_kit_info` - Get detailed kit information
Kits are JSON files stored in `~/.magg/kit.d/` or `.magg/kit.d/` that define a collection of related servers. See Kit Documentation for details on creating and managing kits.
## MBro Scripts
Automate common workflows with MBro scripts:
```bash
# Create a setup script
cat > setup.mbro <<EOF
# Connect to Magg and check status
connect magg magg serve
call magg_status
call magg_list_servers

# Add a new server if needed
call magg_add_server name=calculator source="npx -y @modelcontextprotocol/server-calculator"
EOF

# Run the script
mbro -x setup.mbro
```
## Documentation

For more documentation, see `docs/`.
## Appearances

Magg appears in multiple locations. Please feel free to submit a PR to add more appearances below in alphabetical order.

### Listing, Index, and other MCP Sites

### Awesome GitHub MCP Lists
## Star History