Mem0 MCP Server

Structured management of coding preferences using Mem0 and Model Context Protocol.

506 Stars · 104 Forks · 506 Watchers · 11 Issues
Mem0 MCP Server implements a Model Context Protocol-compliant server for storing, retrieving, and searching coding preferences. It integrates with Mem0 and offers tools for persistent management of code snippets, best practices, and technical documentation. The server exposes an SSE endpoint for clients like Cursor, enabling seamless access and interaction with coding context data.

Key Features

Stores detailed coding preferences with full context
Retrieves all stored coding preferences
Performs semantic search on saved coding patterns
Integrates with Mem0 knowledge base
Exposes standards-compliant SSE endpoint
Allows configuration of server host and port
Persistent storage of snippets and documentation
Works with clients like Cursor IDE
Saves language/framework versions and setup guides
Supports cloud-native, decoupled deployments

Use Cases

Centralized storage of implementation details for development teams
Semantic search of programming best practices
On-demand retrieval of code snippets with documentation
Enabling agents to access and manage coding contexts
Integrating custom context servers with Cursor IDE
Cloud-based knowledge sharing for codebases
Persistent, structured storage of language/framework guides
Providing technical onboarding resources for new developers
Automating retrieval of relevant code patterns in CI/CD pipelines
Enhancing code review processes with historical context

README

MCP Server with Mem0 for Managing Coding Preferences

This project demonstrates a structured approach to using an MCP server with mem0 to manage coding preferences efficiently. The server can be used with Cursor and provides essential tools for storing, retrieving, and searching coding preferences.

Installation

  1. Clone this repository.

  2. Initialize the uv environment:

```bash
uv venv
```

  3. Activate the virtual environment:

```bash
source .venv/bin/activate
```

  4. Install the dependencies using uv:

```bash
# Install in editable mode from pyproject.toml
uv pip install -e .
```

  5. Update the .env file in the root directory with your mem0 API key:

```bash
MEM0_API_KEY=your_api_key_here
```

Usage

  1. Start the MCP server:

```bash
uv run main.py
```

  2. In Cursor, connect to the SSE endpoint:

http://0.0.0.0:8080/sse

  3. Open the Composer in Cursor and switch to Agent mode.
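For reference, a Cursor MCP configuration pointing at this server's SSE endpoint might look like the following sketch. It follows Cursor's documented `mcp.json` format; the server name `mem0` is an arbitrary label, and the URL assumes the default host and port:

```json
{
  "mcpServers": {
    "mem0": {
      "url": "http://0.0.0.0:8080/sse"
    }
  }
}
```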

Demo with Cursor

https://github.com/user-attachments/assets/56670550-fb11-4850-9905-692d3496231c

Features

The server provides three main tools for managing coding preferences:

  1. add_coding_preference: Store code snippets, implementation details, and coding patterns with comprehensive context including:

    • Complete code with dependencies
    • Language/framework versions
    • Setup instructions
    • Documentation and comments
    • Example usage
    • Best practices
  2. get_all_coding_preferences: Retrieve all stored coding preferences to analyze patterns, review implementations, and ensure no relevant information is missed.

  3. search_coding_preferences: Semantically search through stored coding preferences to find relevant:

    • Code implementations
    • Programming solutions
    • Best practices
    • Setup guides
    • Technical documentation
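The semantics of the three tools can be illustrated with a minimal in-memory sketch. This is not the server's actual implementation: the real tools delegate storage and semantic search to the Mem0 API, and a simple keyword match stands in for semantic search here. Only the tool names mirror the original.

```python
# Illustrative in-memory sketch of the three tools' semantics.
# The real server stores preferences in Mem0 and performs semantic
# (embedding-based) search; a keyword match stands in for it here.
from dataclasses import dataclass, field


@dataclass
class PreferenceStore:
    items: list = field(default_factory=list)

    def add_coding_preference(self, text: str) -> None:
        # Real tool: persists the snippet plus context to Mem0
        self.items.append(text)

    def get_all_coding_preferences(self) -> list:
        # Real tool: returns every stored preference for review
        return list(self.items)

    def search_coding_preferences(self, query: str) -> list:
        # Real tool: semantic search; substring match used for illustration
        q = query.lower()
        return [t for t in self.items if q in t.lower()]


store = PreferenceStore()
store.add_coding_preference("Use pytest fixtures for database setup")
store.add_coding_preference("Prefer pathlib over os.path in Python 3.6+")
matches = store.search_coding_preferences("pathlib")
```

Here `matches` contains only the pathlib entry, while `get_all_coding_preferences()` returns both stored items.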

Why?

This implementation allows for a persistent coding preferences system that can be accessed via MCP. The SSE-based server can run as a process that agents connect to, use, and disconnect from whenever needed. This pattern fits well with "cloud-native" use cases where the server and clients can be decoupled processes on different nodes.

Server

By default, the server runs on 0.0.0.0:8080 but is configurable with command line arguments like:

```bash
uv run main.py --host <your host> --port <your port>
```

The server exposes an SSE endpoint at /sse that MCP clients can connect to for accessing the coding preferences management tools.
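A hypothetical sketch of how `main.py` might parse the documented `--host` and `--port` flags before starting the SSE server is shown below; the actual implementation may differ, but the defaults match the documented `0.0.0.0:8080`:

```python
# Hypothetical sketch of parsing the documented --host/--port flags.
# The actual main.py may differ; defaults mirror the documented ones.
import argparse


def parse_args(argv=None):
    parser = argparse.ArgumentParser(description="Mem0 MCP server")
    parser.add_argument("--host", default="0.0.0.0",
                        help="interface to bind the SSE server to")
    parser.add_argument("--port", type=int, default=8080,
                        help="port serving the /sse endpoint")
    return parser.parse_args(argv)


args = parse_args(["--host", "127.0.0.1", "--port", "9090"])
defaults = parse_args([])
```

With no flags, `defaults` carries `0.0.0.0:8080`; explicit flags override both values.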


Repository Owner

mem0ai (Organization)

Repository Details

Language: Python
Default Branch: main
Size: 157 KB
Contributors: 4
MCP Verified: Nov 12, 2025

Programming Languages

Python: 59.45%
TypeScript: 40.55%


Related MCPs

Discover similar Model Context Protocol servers

  • In Memoria

    Persistent memory and instant context for AI coding assistants, integrated via MCP.

    In Memoria is an MCP server that enables AI coding assistants such as Claude or Copilot to retain, recall, and provide context about codebases across sessions. It learns patterns, architecture, and conventions from user code, offering persistent intelligence that eliminates repetitive explanations and generic suggestions. Through the Model Context Protocol, it allows AI tools to perform semantic search, smart file routing, and track project-specific decisions efficiently.

    • 94
    • MCP
    • pi22by7/In-Memoria
  • Exa MCP Server

    Fast, efficient web and code context for AI coding assistants.

Exa MCP Server provides a Model Context Protocol (MCP) server interface that connects AI assistants to Exa AI's powerful search capabilities, including code, documentation, and web search. It enables coding agents to retrieve precise, token-efficient context from billions of sources such as GitHub, StackOverflow, and documentation sites, reducing hallucinations. The platform supports integration with popular tools like Cursor, Claude, and VS Code through standardized MCP configuration, offering configurable access to various research and code-related tools via HTTP.

    • 3,224
    • MCP
    • exa-labs/exa-mcp-server
  • Semgrep MCP Server

    A Model Context Protocol server powered by Semgrep for seamless code analysis integration.

    Semgrep MCP Server implements the Model Context Protocol (MCP) to enable efficient and standardized communication for code analysis tasks. It facilitates integration with platforms like LM Studio, Cursor, and Visual Studio Code, providing both Docker and Python (PyPI) deployment options. The tool is now maintained in the main Semgrep repository with continued updates, enhancing compatibility and support across developer tools.

    • 611
    • MCP
    • semgrep/mcp
  • Code Declaration Lookup MCP Server

    Fast, language-agnostic code declaration search and lookup server via MCP.

    Provides a Model Context Protocol (MCP) server that indexes code declarations using universal ctags and SQLite with FTS5 full-text search. Offers search and listing functionality for functions, classes, structures, enums, and other code elements across any language supported by ctags. Enables seamless integration with coding agents for dynamic indexing, respects .gitignore, and supports ctags file ingestion and management.

    • 2
    • MCP
    • osinmv/function-lookup-mcp
  • Vectorize MCP Server

    MCP server for advanced vector retrieval and text extraction with Vectorize integration.

    Vectorize MCP Server is an implementation of the Model Context Protocol (MCP) that integrates with the Vectorize platform to enable advanced vector retrieval and text extraction. It supports seamless installation and integration within development environments such as VS Code. The server is configurable through environment variables or JSON configuration files and is suitable for use in collaborative and individual workflows requiring vector-based context management for models.

    • 97
    • MCP
    • vectorize-io/vectorize-mcp-server
  • Memory MCP

    A Model Context Protocol server for managing LLM conversation memories with intelligent context window caching.

    Memory MCP provides a Model Context Protocol (MCP) server for logging, retrieving, and managing memories from large language model (LLM) conversations. It offers features such as context window caching, relevance scoring, and tag-based context retrieval, leveraging MongoDB for persistent storage. The system is designed to efficiently archive, score, and summarize conversational context, supporting external orchestration and advanced memory management tools. This enables seamless handling of conversation history and dynamic context for enhanced LLM applications.

    • 10
    • MCP
    • JamesANZ/memory-mcp