MCP Mermaid

AI-powered dynamic generation of Mermaid diagrams via the Model Context Protocol.

283 Stars · 36 Forks · 283 Watchers · 6 Issues
MCP Mermaid enables dynamic generation of Mermaid diagrams and charts using AI via the Model Context Protocol (MCP). It supports all Mermaid features and syntax, and can export diagrams in several formats, including SVG, base64, and files. The server can be installed locally or globally and run over stdio, SSE, or streamable HTTP transports, making it compatible with tools like Claude, VSCode, and Smithery. It provides customization of background color and theme, and validates output to ensure correct syntax and graphics.

Key Features

Full support for Mermaid syntax and features
Configurable background color and themes
Multiple export formats: base64, SVG, Mermaid source, and file
Validation of diagram syntax and graphics
Runs as an MCP server with stdio, SSE, or streamable transport protocols
CLI interface with various configurable options
Seamless integration with desktop AI tools like VSCode and Claude
Support for multi-round AI model output
Local and global installation options
Compatibility with platforms like aliyun, modelscope, smithery.ai

Use Cases

Automatically generating flowcharts and sequence diagrams from AI prompts
Embedding dynamically created diagrams into documentation systems
Integrating visual diagram creation with AI chat tools in developer workflows
Enabling AI agents to output visual explanations and workflows
Validating and correcting Mermaid syntax through multi-round interaction
Exporting diagrams for presentations or technical documentation
Enhancing data storytelling with auto-generated charts and graphs
Building collaborative diagramming tools using server-side rendering
Providing visualization capabilities to custom AI-powered applications
Automating interactive dashboard or map creation via AI assistants

README

MCP Mermaid

Dynamically generate Mermaid diagrams and charts with AI via MCP. You can also use mcp-server-chart to generate charts, graphs, and maps.

✨ Features

  • Fully supports all Mermaid features and syntax.
  • Supports configuration of backgroundColor and theme, enabling large AI models to output rich style configurations.
  • Supports exporting to base64, svg, mermaid, and file formats, with Mermaid validation so the model can iterate over multiple rounds toward correct syntax and graphics. Use outputType: "file" to automatically save PNG diagrams to disk for AI agents.
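For instance, an AI client's tool call might look like the following sketch. The tool name `generate_mermaid_diagram` and the argument names shown here are illustrative, not taken from this server's schema; check the server's tool listing for the exact names.

```json
{
  "name": "generate_mermaid_diagram",
  "arguments": {
    "mermaid": "flowchart TD\n  A[Start] --> B{Valid?}\n  B -->|Yes| C[Render]\n  B -->|No| D[Fix syntax]",
    "theme": "default",
    "backgroundColor": "white",
    "outputType": "svg"
  }
}
```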

🤖 Usage

To use with a desktop app, such as Claude, VSCode, Cline, or Cherry Studio, add the MCP server config below. On macOS:

json
{
  "mcpServers": {
    "mcp-mermaid": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-mermaid"
      ]
    }
  }
}

On Windows:

json
{
  "mcpServers": {
    "mcp-mermaid": {
      "command": "cmd",
      "args": [
        "/c",
        "npx",
        "-y",
        "mcp-mermaid"
      ]
    }
  }
}

You can also use it on aliyun, modelscope, glama.ai, smithery.ai, and other platforms over the HTTP or SSE protocol.

🚰 Run with SSE or Streamable transport

Option 1: Global Installation

Install the package globally:

bash
npm install -g mcp-mermaid

Run the server with your preferred transport option:

bash
# For SSE transport (default endpoint: /sse)
mcp-mermaid -t sse

# For Streamable transport (default endpoint: /mcp)
mcp-mermaid -t streamable
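The --port (-p) and --endpoint (-e) flags documented under CLI Options can be combined with either transport; the port and path values below are just examples:

```bash
# SSE transport on a custom port and endpoint
mcp-mermaid -t sse -p 8080 -e /events

# Streamable transport on a custom port (endpoint defaults to /mcp)
mcp-mermaid -t streamable -p 8080
```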

Option 2: Local Development

If you're working with the source code locally:

bash
# Clone and setup
git clone https://github.com/hustcc/mcp-mermaid.git
cd mcp-mermaid
npm install
npm run build

# Run with npm scripts
npm run start:sse        # SSE transport on port 3033
npm run start:streamable # Streamable transport on port 1122

Access Points

Then you can access the server at:

  • SSE transport: http://localhost:3033/sse
  • Streamable transport: http://localhost:1122/mcp (local) or http://localhost:3033/mcp (global)
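For the streamable endpoint, an MCP client opens a session by POSTing a JSON-RPC initialize request. Below is a minimal sketch of that request body; the field names follow the MCP specification, but the protocolVersion and clientInfo values are illustrative and will differ for a real client:

```python
import json

# JSON-RPC 2.0 initialize request, per the MCP specification.
# The protocolVersion below is illustrative; clients negotiate the
# actual version with the server during this handshake.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# Serialized body to POST to the streamable endpoint.
body = json.dumps(initialize_request)
```

A client would POST this body to http://localhost:3033/mcp (or port 1122 for the local dev script) with Content-Type: application/json and an Accept header allowing both application/json and text/event-stream.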

🎮 CLI Options

You can also pass CLI options when running the MCP server. Run the CLI with -h to list them:

plain
MCP Mermaid CLI

Options:
  --transport, -t  Specify the transport protocol: "stdio", "sse", or "streamable" (default: "stdio")
  --port, -p       Specify the port for SSE or streamable transport (default: 3033)
  --endpoint, -e   Specify the endpoint for the transport:
                    - For SSE: default is "/sse"
                    - For streamable: default is "/mcp"
  --help, -h       Show this help message

🔨 Development

Install dependencies:

bash
npm install

Build the server:

bash
npm run build

Start the MCP server:

Using MCP Inspector (for debugging):

bash
npm run start

Using different transport protocols:

bash
# SSE transport (Server-Sent Events)
npm run start:sse

# Streamable HTTP transport
npm run start:streamable

Direct node commands:

bash
# SSE transport on port 3033
node build/index.js --transport sse --port 3033

# Streamable HTTP transport on port 1122
node build/index.js --transport streamable --port 1122

# STDIO transport (for MCP client integration)
node build/index.js --transport stdio

📄 License

MIT@hustcc.


Repository Owner

hustcc (User)

Repository Details

Language TypeScript
Default Branch main
Size 43 KB
Contributors 7
License MIT License
MCP Verified Nov 11, 2025

Programming Languages

TypeScript
83.79%
JavaScript
13.87%
Dockerfile
2.02%
Shell
0.32%

Tags

Topics

mcp mcp-server mermaid mermaidjs


Related MCPs

Discover similar Model Context Protocol servers

  • Mindpilot MCP

    Visualize and understand code structures with on-demand diagrams for AI coding assistants.

    Mindpilot MCP provides AI coding agents with the capability to visualize, analyze, and understand complex codebases through interactive diagrams. It operates as a Model Context Protocol (MCP) server, enabling seamless integration with multiple development environments such as VS Code, Cursor, Windsurf, Zed, and Claude Code. Mindpilot ensures local processing for privacy, supports multi-client connections, and offers robust configuration options for server operation and data management. Users can export diagrams and adjust analytics settings for improved user control.

    • 61
    • MCP
    • abrinsmead/mindpilot-mcp
  • Markmap MCP Server

    Convert Markdown to interactive mind maps via the Model Context Protocol.

    Markmap MCP Server enables seamless conversion of Markdown content into interactive mind maps using the Model Context Protocol (MCP). It leverages the open-source markmap project and provides users with diverse export formats including PNG, JPG, and SVG. Designed for easy integration with MCP clients, it offers tools for automated browser previews, rich interactivity, and batch mind map generation. The server can be installed easily via npm or Smithery and supports configurable output directories.

    • 137
    • MCP
    • jinzcdev/markmap-mcp-server
  • Mindmap MCP Server

    Convert Markdown content into interactive mindmaps via an MCP-compliant server.

    Mindmap MCP Server provides a Model Context Protocol (MCP) compatible service that transforms Markdown input into interactive mindmap visualizations. It supports command-line, Python, and Docker installation methods for flexibility across platforms. Designed to integrate with MCP clients like Claude Desktop, it ensures seamless Markdown-to-mindmap conversion using markmap under the hood. The server is intended for easy plug-and-play use in model-powered workflows that require structured, visual context formatting.

    • 205
    • MCP
    • YuChenSSR/mindmap-mcp-server
  • piapi-mcp-server

    TypeScript-based MCP server for PiAPI media content generation

    piapi-mcp-server is a TypeScript implementation of a Model Context Protocol (MCP) server that connects with PiAPI to enable media generation workflows from MCP-compatible applications. It handles image, video, music, TTS, 3D, and voice generation tasks using a wide range of supported models like Midjourney, Flux, Kling, LumaLabs, Udio, and more. Designed for easy integration with clients such as Claude Desktop, it includes an interactive MCP Inspector for development, testing, and debugging.

    • 62
    • MCP
    • apinetwork/piapi-mcp-server
  • Replicate Flux MCP

    MCP-compatible server for high-quality image and SVG generation via Replicate models.

    Replicate Flux MCP is an advanced Model Context Protocol (MCP) server designed to enable AI assistants to generate high-quality raster images and vector graphics. It leverages Replicate's Flux Schnell model for image synthesis and Recraft V3 SVG model for vector output, supporting seamless integration with AI platforms like Cursor, Claude Desktop, Smithery, and Glama.ai. Users can generate images and SVGs by simply providing natural language prompts, with support for parameter customization, batch processing, and variant creation.

    • 66
    • MCP
    • awkoy/replicate-flux-mcp
  • Sequa MCP

    Bridge Sequa's advanced context engine to any MCP-capable AI client.

    Sequa MCP acts as a seamless integration layer, connecting Sequa’s knowledge engine with various AI coding assistants and IDEs via the Model Context Protocol (MCP). It enables tools to leverage Sequa’s contextual knowledge streams, enhancing code understanding and task execution across multiple repositories. The solution provides a simple proxy command to interface with standardized MCP transports, supporting configuration in popular environments such as Cursor, Claude, VSCode, and others. Its core purpose is to deliver deep, project-specific context to LLM agents through a unified and streamable endpoint.

    • 16
    • MCP
    • sequa-ai/sequa-mcp