Neo4j MCP Clients & Servers

Seamless natural language and knowledge graph integration for Neo4j via Model Context Protocol.

797 Stars · 209 Forks · 797 Watchers · 23 Issues
Neo4j MCP Clients & Servers provide standardized interfaces that let large language models and AI assistants interact with Neo4j databases and cloud services through natural language, using the Model Context Protocol (MCP). The collection includes servers for translating natural language into Cypher queries, managing graph memory, administering Neo4j Aura cloud instances, and interactive graph data modeling. Multiple transport modes (STDIO, HTTP, and SSE) offer deployment flexibility for both local and cloud environments.

Key Features

Natural language to Cypher query translation
Knowledge graph memory storage and retrieval
Neo4j Aura instance management API
Interactive graph data modeling and visualization
Multiple transport modes (STDIO, HTTP, SSE)
Ready-to-deploy containers for cloud platforms
Session and conversation context retention
Model import/export from Arrows.app
Customizable HTTP configuration options
Support for AI assistant integrations

Use Cases

Querying Neo4j databases using natural language
Visualizing and modeling graph structures interactively
Managing and scaling Neo4j Aura cloud instances via LLMs
Storing and retrieving personal or enterprise knowledge graphs
Accessing graph data across multiple sessions and clients
Enabling AI assistants to automate Neo4j management tasks
Integrating natural language interfaces with existing data workflows
Deploying scalable MCP servers in cloud environments
Importing and exporting data models for collaborative design
Providing self-service database operations through chat-based tools

README

Neo4j MCP Clients & Servers

Model Context Protocol (MCP) is a standardized protocol for managing context between large language models (LLMs) and external systems.

This lets you use Claude Desktop, or any other MCP client (VS Code, Cursor, Windsurf), to accomplish tasks with Neo4j and your Aura account in natural language, e.g.:

  • What is in this graph?
  • Render a chart from the top products sold by frequency, total and average volume
  • List my instances
  • Create a new instance named mcp-test for Aura Professional with 4GB and Graph Data Science enabled
  • Store the fact that I worked on the Neo4j MCP Servers today with Andreas and Oskar

Servers

mcp-neo4j-cypher - natural language to Cypher queries

Details in Readme

Get database schema for a configured database and execute generated read and write Cypher queries on that database.
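
A minimal launch sketch, assuming the server is published on PyPI under the same name and reads its connection details from environment variables; the variable names below are assumptions, so check the server's Readme before relying on them:

```bash
# Sketch: run the Cypher server locally over STDIO (the default transport).
# NEO4J_URI / NEO4J_USERNAME / NEO4J_PASSWORD are assumed variable names --
# consult the server's Readme for the exact configuration it expects.
export NEO4J_URI=bolt://localhost:7687
export NEO4J_USERNAME=neo4j
export NEO4J_PASSWORD=your-password
uvx mcp-neo4j-cypher
```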

mcp-neo4j-memory - knowledge graph memory stored in Neo4j

Details in Readme

Store and retrieve entities and relationships from your personal knowledge graph in a local or remote Neo4j instance. Access that information across different sessions, conversations, and clients.
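
To make the same memory graph reachable from every client and session, the server can be pointed at a remote instance. A sketch, again assuming environment-variable configuration (names are assumptions; see the server's Readme):

```bash
# Sketch: store the knowledge graph in a remote Aura instance so any client
# or session connecting through this server sees the same memory.
# Variable names are assumed, not confirmed by this page.
export NEO4J_URI=neo4j+s://<your-instance>.databases.neo4j.io
export NEO4J_USERNAME=neo4j
export NEO4J_PASSWORD=your-password
uvx mcp-neo4j-memory
```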

mcp-neo4j-cloud-aura-api - Neo4j Aura cloud service management API

Details in Readme

Manage your Neo4j Aura instances directly from the comfort of your AI assistant chat.

Create and destroy instances, find instances by name, scale them up and down, and enable features.
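
The Aura management server talks to the Aura API rather than to a database, so it needs Aura API credentials created in the Aura console. A hedged sketch; the credential variable names are hypothetical and the real ones are documented in the server's Readme:

```bash
# Sketch: launch the Aura management server with API credentials.
# AURA_CLIENT_ID / AURA_CLIENT_SECRET are hypothetical names used here
# for illustration only -- check the server's Readme for the real ones.
export AURA_CLIENT_ID=your-client-id
export AURA_CLIENT_SECRET=your-client-secret
uvx mcp-neo4j-cloud-aura-api
```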

mcp-neo4j-data-modeling - interactive graph data modeling and visualization

Details in Readme

Create, validate, and visualize Neo4j graph data models. Allows for model import/export from Arrows.app.

Transport Modes

All servers support multiple transport modes:

  • STDIO (default): Standard input/output for local tools and Claude Desktop integration
  • SSE: Server-Sent Events for web-based deployments
  • HTTP: Streamable HTTP for modern web deployments and microservices

HTTP Transport Configuration

To run a server in HTTP mode, use the --transport http flag:

```bash
# Basic HTTP mode
mcp-neo4j-cypher --transport http

# Custom HTTP configuration
mcp-neo4j-cypher --transport http --host 127.0.0.1 --port 8080 --path /api/mcp/
```

Environment variables are also supported:

```bash
export NEO4J_TRANSPORT=http
export NEO4J_MCP_SERVER_HOST=127.0.0.1
export NEO4J_MCP_SERVER_PORT=8080
export NEO4J_MCP_SERVER_PATH=/api/mcp/
mcp-neo4j-cypher
```
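
Once the server is listening, a quick connectivity check can be made with a standard MCP initialize request. This is the generic MCP Streamable HTTP handshake, not a server-specific API; the endpoint path matches the configuration above:

```bash
# Send an MCP initialize request to the configured endpoint.
# A JSON-RPC response (or an SSE stream containing it) indicates the
# server is up; the exact response shape depends on the transport.
curl -s -X POST http://127.0.0.1:8080/api/mcp/ \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"curl-check","version":"0.1"}}}'
```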

Cloud Deployment

All servers in this repository are containerized and ready for cloud deployment on platforms like AWS ECS Fargate and Azure Container Apps. Each server supports HTTP transport mode specifically designed for scalable, production-ready deployments with auto-scaling and load balancing capabilities.
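
As a rough shape of such a deployment, a server container can be built and run in HTTP mode; the build context path and image name below are assumptions, and the linked guide has the authoritative steps:

```bash
# Sketch: build one server image and run it in HTTP mode, the pattern used
# on ECS Fargate or Azure Container Apps. The ./servers/mcp-neo4j-cypher
# path and image tag are assumptions for illustration only.
docker build -t mcp-neo4j-cypher ./servers/mcp-neo4j-cypher
docker run -p 8080:8080 \
  -e NEO4J_TRANSPORT=http \
  -e NEO4J_MCP_SERVER_HOST=0.0.0.0 \
  -e NEO4J_MCP_SERVER_PORT=8080 \
  mcp-neo4j-cypher
```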

📋 Complete Cloud Deployment Guide →

The deployment guide covers:

  • AWS ECS Fargate: Step-by-step deployment with auto-scaling and Application Load Balancer
  • Azure Container Apps: Serverless container deployment with built-in scaling and traffic management
  • Configuration Best Practices: Security, monitoring, resource recommendations, and troubleshooting
  • Integration Examples: Connecting MCP clients to cloud-deployed servers

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Blog Posts

License

MIT License

Star History

Star History Chart

Repository Owner

neo4j-contrib (Organization)

Repository Details

Language Python
Default Branch main
Size 27,455 KB
Contributors 16
License MIT License
MCP Verified Nov 12, 2025

Programming Languages

Python 98.95%
Dockerfile 0.59%
Makefile 0.38%
Shell 0.08%

Topics

database mcp mcp-server neo4j stdio

Related MCPs

Discover similar Model Context Protocol servers

  • Neon MCP Server

    Natural language access to Neon Postgres databases via the Model Context Protocol.

    Neon MCP Server provides a bridge between natural language requests and Neon Postgres databases through the Model Context Protocol (MCP). It enables users to manage database operations such as creating projects, running queries, and handling migrations by translating conversational commands into API calls. Designed for both local and remote setups, it enhances database accessibility for users with varying technical backgrounds. It prioritizes security, recommending use in local development and IDE integrations.

    • 513
    • MCP
    • neondatabase/mcp-server-neon
  • YDB MCP

    MCP server for AI-powered natural language database operations on YDB.

    YDB MCP acts as a Model Context Protocol server enabling YDB databases to be accessed via any LLM supporting MCP. It allows AI-driven and natural language interaction with YDB instances by bridging database operations with language model interfaces. Flexible deployment through uvx, pipx, or pip is supported, along with multiple authentication methods. The integration empowers users to manage YDB databases conversationally through standardized protocols.

    • 24
    • MCP
    • ydb-platform/ydb-mcp
  • PGMCP

    Natural language PostgreSQL interface via the Model Context Protocol.

    PGMCP enables seamless interaction with any PostgreSQL database through natural language queries, translating user intent into structured SQL results. It acts as a Model Context Protocol (MCP) server, connecting AI assistants and MCP-compatible clients to databases with features like streaming, robust error handling, and optional AI-powered SQL generation. The tool ensures secure, read-only access to existing databases using HTTP/MCP protocol. Compatibility includes tools such as Cursor, Claude Desktop, and VS Code extensions.

    • 499
    • MCP
    • subnetmarco/pgmcp
  • Graphlit MCP Server

    Integrate and unify knowledge sources for RAG-ready AI context with the Graphlit MCP Server.

    Graphlit MCP Server provides a Model Context Protocol interface, enabling seamless integration between MCP clients and the Graphlit platform. It supports ingestion from a wide array of sources such as Slack, Discord, Google Drive, email, Jira, and GitHub, turning them into a searchable, RAG-ready knowledge base. Built-in tools allow for document, media extraction, web crawling, and web search, as well as advanced retrieval and publishing functionalities. The server facilitates easy configuration, sophisticated data operations, and automated notifications for diverse workflows.

    • 369
    • MCP
    • graphlit/graphlit-mcp-server
  • Neovim MCP Server

    Connect AI assistants to Neovim via the Model Context Protocol.

    Neovim MCP Server enables seamless integration between Neovim instances and AI assistants by implementing the Model Context Protocol (MCP). It allows for multi-connection management, supports both stdio and HTTP server transport modes, and provides access to structured diagnostic information via URI schemes. With LSP integration, plugin support, and an extensible tool system, it facilitates advanced interaction with Neovim for context-aware AI workflows.

    • 20
    • MCP
    • linw1995/nvim-mcp
  • TeslaMate MCP Server

    Query your TeslaMate data using the Model Context Protocol

    TeslaMate MCP Server implements the Model Context Protocol to enable AI assistants and clients to securely access and query Tesla vehicle data, statistics, and analytics from a TeslaMate PostgreSQL database. The server exposes a suite of tools for retrieving vehicle status, driving history, charging sessions, battery health, and more using standardized MCP endpoints. It supports local and Docker deployments, includes bearer token authentication, and is intended for integration with MCP-compatible AI systems like Claude Desktop.

    • 106
    • MCP
    • cobanov/teslamate-mcp