MCP Mermaid
AI-powered dynamic generation of Mermaid diagrams via the Model Context Protocol.
Generate Mermaid diagrams and charts dynamically with AI via MCP. You can also use mcp-server-chart to generate charts, graphs, and maps.
✨ Features
- Fully supports all features and syntax of Mermaid.
- Supports configuration of `backgroundColor` and `theme`, enabling large AI models to output rich style configurations.
- Supports exporting to `base64`, `svg`, `mermaid`, and `file` formats, with validation of Mermaid syntax to facilitate the model's multi-round output of correct syntax and graphics. Use `outputType: "file"` to automatically save PNG diagrams to disk for AI agents (see the example call after this list).
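For illustration, a tool call from an MCP client might look like the sketch below. The tool and parameter names shown here (`generate_mermaid_diagram`, `mermaid`, `theme`, `backgroundColor`, `outputType`) are assumptions based on the feature list above; confirm them against the server's actual tool listing.

```json
{
  "name": "generate_mermaid_diagram",
  "arguments": {
    "mermaid": "graph TD; A[Prompt] --> B[Mermaid syntax]; B --> C[Rendered diagram]",
    "theme": "forest",
    "backgroundColor": "#ffffff",
    "outputType": "file"
  }
}
```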
🤖 Usage
To use with a desktop app such as Claude Desktop, VS Code, Cline, or Cherry Studio, add the MCP server config below. On macOS:
```json
{
  "mcpServers": {
    "mcp-mermaid": {
      "command": "npx",
      "args": ["-y", "mcp-mermaid"]
    }
  }
}
```
On Windows:
```json
{
  "mcpServers": {
    "mcp-mermaid": {
      "command": "cmd",
      "args": ["/c", "npx", "-y", "mcp-mermaid"]
    }
  }
}
```
You can also use it on Aliyun, ModelScope, glama.ai, smithery.ai, or other platforms via the HTTP or SSE protocol.
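As a rough sketch, an HTTP/SSE-based client configuration might look like the snippet below; the exact field names (`type`, `url`) vary between clients, and the URL is a placeholder for wherever the server is hosted.

```json
{
  "mcpServers": {
    "mcp-mermaid": {
      "type": "sse",
      "url": "http://localhost:3033/sse"
    }
  }
}
```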
🚰 Run with SSE or Streamable transport
Option 1: Global Installation
Install the package globally:
```bash
npm install -g mcp-mermaid
```
Run the server with your preferred transport option:
```bash
# For SSE transport (default endpoint: /sse)
mcp-mermaid -t sse

# For Streamable transport (default endpoint: /mcp)
mcp-mermaid -t streamable
```
Option 2: Local Development
If you're working with the source code locally:
```bash
# Clone and set up
git clone https://github.com/hustcc/mcp-mermaid.git
cd mcp-mermaid
npm install
npm run build

# Run with npm scripts
npm run start:sse          # SSE transport on port 3033
npm run start:streamable   # Streamable transport on port 1122
```
Access Points
Then you can access the server at:
- SSE transport: `http://localhost:3033/sse`
- Streamable transport: `http://localhost:1122/mcp` (local) or `http://localhost:3033/mcp` (global)
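As a quick smoke test (assuming the default local ports above), you can send an MCP `initialize` request to the Streamable HTTP endpoint with curl; the exact `protocolVersion` string accepted may depend on the SDK version in use.

```bash
# Send a JSON-RPC initialize request to the streamable endpoint
curl -X POST http://localhost:1122/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"curl-smoke-test","version":"0.0.1"}}}'
```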
🎮 CLI Options
You can also use the following CLI options when running the MCP server. Print them by running the CLI with `-h`:
```
MCP Mermaid CLI

Options:
  --transport, -t  Specify the transport protocol: "stdio", "sse", or "streamable" (default: "stdio")
  --port, -p       Specify the port for SSE or streamable transport (default: 3033)
  --endpoint, -e   Specify the endpoint for the transport:
                   - For SSE: default is "/sse"
                   - For streamable: default is "/mcp"
  --help, -h       Show this help message
```
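For example, the options can be combined as follows; the port and endpoint values here are arbitrary examples.

```bash
# Streamable HTTP transport on a custom port and endpoint
mcp-mermaid --transport streamable --port 8080 --endpoint /mcp
```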
🔨 Development
Install dependencies:
```bash
npm install
```
Build the server:
```bash
npm run build
```
Start the MCP server
Using MCP Inspector (for debugging):
```bash
npm run start
```
Using different transport protocols:
```bash
# SSE transport (Server-Sent Events)
npm run start:sse

# Streamable HTTP transport
npm run start:streamable
```
Direct node commands:
```bash
# SSE transport on port 3033
node build/index.js --transport sse --port 3033

# Streamable HTTP transport on port 1122
node build/index.js --transport streamable --port 1122

# STDIO transport (for MCP client integration)
node build/index.js --transport stdio
```
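To point an MCP client at a local build over STDIO, a configuration along these lines can be used; the path below is a placeholder for your local checkout.

```json
{
  "mcpServers": {
    "mcp-mermaid-dev": {
      "command": "node",
      "args": ["/absolute/path/to/mcp-mermaid/build/index.js"]
    }
  }
}
```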
📄 License
MIT © hustcc.
Related MCPs
Discover similar Model Context Protocol servers
Mindpilot MCP
Visualize and understand code structures with on-demand diagrams for AI coding assistants.
Mindpilot MCP provides AI coding agents with the capability to visualize, analyze, and understand complex codebases through interactive diagrams. It operates as a Model Context Protocol (MCP) server, enabling seamless integration with multiple development environments such as VS Code, Cursor, Windsurf, Zed, and Claude Code. Mindpilot ensures local processing for privacy, supports multi-client connections, and offers robust configuration options for server operation and data management. Users can export diagrams and adjust analytics settings for improved user control.
- ⭐ 61
- MCP
- abrinsmead/mindpilot-mcp
Markmap MCP Server
Convert Markdown to interactive mind maps via the Model Context Protocol.
Markmap MCP Server enables seamless conversion of Markdown content into interactive mind maps using the Model Context Protocol (MCP). It leverages the open-source markmap project and provides users with diverse export formats including PNG, JPG, and SVG. Designed for easy integration with MCP clients, it offers tools for automated browser previews, rich interactivity, and batch mind map generation. The server can be installed easily via npm or Smithery and supports configurable output directories.
- ⭐ 137
- MCP
- jinzcdev/markmap-mcp-server
Mindmap MCP Server
Convert Markdown content into interactive mindmaps via an MCP-compliant server.
Mindmap MCP Server provides a Model Context Protocol (MCP) compatible service that transforms Markdown input into interactive mindmap visualizations. It supports command-line, Python, and Docker installation methods for flexibility across platforms. Designed to integrate with MCP clients like Claude Desktop, it ensures seamless Markdown-to-mindmap conversion using markmap under the hood. The server is intended for easy plug-and-play use in model-powered workflows that require structured, visual context formatting.
- ⭐ 205
- MCP
- YuChenSSR/mindmap-mcp-server
piapi-mcp-server
TypeScript-based MCP server for PiAPI media content generation
piapi-mcp-server is a TypeScript implementation of a Model Context Protocol (MCP) server that connects with PiAPI to enable media generation workflows from MCP-compatible applications. It handles image, video, music, TTS, 3D, and voice generation tasks using a wide range of supported models like Midjourney, Flux, Kling, LumaLabs, Udio, and more. Designed for easy integration with clients such as Claude Desktop, it includes an interactive MCP Inspector for development, testing, and debugging.
- ⭐ 62
- MCP
- apinetwork/piapi-mcp-server
Replicate Flux MCP
MCP-compatible server for high-quality image and SVG generation via Replicate models.
Replicate Flux MCP is an advanced Model Context Protocol (MCP) server designed to enable AI assistants to generate high-quality raster images and vector graphics. It leverages Replicate's Flux Schnell model for image synthesis and Recraft V3 SVG model for vector output, supporting seamless integration with AI platforms like Cursor, Claude Desktop, Smithery, and Glama.ai. Users can generate images and SVGs by simply providing natural language prompts, with support for parameter customization, batch processing, and variant creation.
- ⭐ 66
- MCP
- awkoy/replicate-flux-mcp
Sequa MCP
Bridge Sequa's advanced context engine to any MCP-capable AI client.
Sequa MCP acts as a seamless integration layer, connecting Sequa’s knowledge engine with various AI coding assistants and IDEs via the Model Context Protocol (MCP). It enables tools to leverage Sequa’s contextual knowledge streams, enhancing code understanding and task execution across multiple repositories. The solution provides a simple proxy command to interface with standardized MCP transports, supporting configuration in popular environments such as Cursor, Claude, VSCode, and others. Its core purpose is to deliver deep, project-specific context to LLM agents through a unified and streamable endpoint.
- ⭐ 16
- MCP
- sequa-ai/sequa-mcp