Mem0 MCP Server
Structured management of coding preferences using Mem0 and Model Context Protocol.
README
MCP Server with Mem0 for Managing Coding Preferences
This project demonstrates a structured approach to using an MCP server with mem0 to manage coding preferences efficiently. The server can be used with Cursor and provides essential tools for storing, retrieving, and searching coding preferences.
Installation
- Clone this repository
- Initialize the uv environment:
uv venv
- Activate the virtual environment:
source .venv/bin/activate
- Install the dependencies using uv:
# Install in editable mode from pyproject.toml
uv pip install -e .
- Update the .env file in the root directory with your mem0 API key (see the sketch after these steps):
MEM0_API_KEY=your_api_key_here
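For reference, here is a minimal sketch of how the key could be loaded and used to initialize a Mem0 client; the use of python-dotenv and the explicit MemoryClient call are assumptions for illustration, not code taken from this repository.
import os
from dotenv import load_dotenv  # assumes python-dotenv is installed
from mem0 import MemoryClient

load_dotenv()  # read MEM0_API_KEY from the .env file
client = MemoryClient(api_key=os.environ["MEM0_API_KEY"])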
Usage
- Start the MCP server:
uv run main.py
- In Cursor, connect to the SSE endpoint (see Cursor's MCP documentation for reference):
http://0.0.0.0:8080/sse
- Open the Composer in Cursor and switch to Agent mode.
Demo with Cursor
https://github.com/user-attachments/assets/56670550-fb11-4850-9905-692d3496231c
Features
The server provides three main tools for managing coding preferences (a sketch of one possible implementation follows the list):
- add_coding_preference: Store code snippets, implementation details, and coding patterns with comprehensive context, including:
  - Complete code with dependencies
  - Language/framework versions
  - Setup instructions
  - Documentation and comments
  - Example usage
  - Best practices
- get_all_coding_preferences: Retrieve all stored coding preferences to analyze patterns, review implementations, and ensure no relevant information is missed.
- search_coding_preferences: Semantically search through stored coding preferences to find relevant:
  - Code implementations
  - Programming solutions
  - Best practices
  - Setup guides
  - Technical documentation
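As a rough illustration, tools like these could be wired together with the MCP Python SDK's FastMCP helper and the Mem0 client, as sketched below. The tool names mirror the list above, but the FastMCP usage, the fixed user_id, and the exact Mem0 calls are assumptions, not this repository's actual implementation.
from mem0 import MemoryClient
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("mem0-mcp")        # hypothetical server name
mem0_client = MemoryClient()     # picks up MEM0_API_KEY from the environment
USER_ID = "cursor_mcp"           # illustrative fixed user id for all preferences

@mcp.tool()
def add_coding_preference(text: str) -> str:
    """Store a code snippet, implementation detail, or coding pattern with its context."""
    mem0_client.add([{"role": "user", "content": text}], user_id=USER_ID)
    return "Coding preference stored."

@mcp.tool()
def get_all_coding_preferences() -> list:
    """Retrieve every stored coding preference."""
    return mem0_client.get_all(user_id=USER_ID)

@mcp.tool()
def search_coding_preferences(query: str) -> list:
    """Semantically search stored coding preferences."""
    return mem0_client.search(query, user_id=USER_ID)
In this sketch, a single shared user_id means every connected agent reads from and writes to the same preference store.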
Why?
This implementation allows for a persistent coding preferences system that can be accessed via MCP. The SSE-based server can run as a process that agents connect to, use, and disconnect from whenever needed. This pattern fits well with "cloud-native" use cases where the server and clients can be decoupled processes on different nodes.
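To make the connect/use/disconnect pattern concrete, the sketch below uses the MCP Python SDK's SSE client to attach to a running server, list its tools, and drop the connection; the endpoint URL is an assumption based on the default configuration described in the next section.
import asyncio
from mcp import ClientSession
from mcp.client.sse import sse_client

async def main() -> None:
    # Attach to an already running server (assumed at localhost:8080).
    async with sse_client("http://localhost:8080/sse") as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
    # Exiting the context managers closes the connection.

asyncio.run(main())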
Server
By default, the server runs on 0.0.0.0:8080, but the host and port can be configured with command-line arguments:
uv run main.py --host <your host> --port <your port>
The server exposes an SSE endpoint at /sse that MCP clients can connect to for accessing the coding preferences management tools.
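A minimal sketch of how those flags could be parsed and handed to an SSE server is shown below; the argparse flags mirror the command above, while the FastMCP settings and run call are one plausible implementation, not necessarily how main.py is written.
import argparse
from mcp.server.fastmcp import FastMCP

def run() -> None:
    parser = argparse.ArgumentParser(description="Run the mem0 MCP server over SSE.")
    parser.add_argument("--host", default="0.0.0.0", help="interface to bind to")
    parser.add_argument("--port", type=int, default=8080, help="port to listen on")
    args = parser.parse_args()

    # FastMCP's SSE transport serves the endpoint at /sse on the given host and port.
    mcp = FastMCP("mem0-mcp", host=args.host, port=args.port)
    mcp.run(transport="sse")

if __name__ == "__main__":
    run()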
Related MCPs
Discover similar Model Context Protocol servers
In Memoria
Persistent memory and instant context for AI coding assistants, integrated via MCP.
In Memoria is an MCP server that enables AI coding assistants such as Claude or Copilot to retain, recall, and provide context about codebases across sessions. It learns patterns, architecture, and conventions from user code, offering persistent intelligence that eliminates repetitive explanations and generic suggestions. Through the Model Context Protocol, it allows AI tools to perform semantic search, smart file routing, and track project-specific decisions efficiently.
- ⭐ 94
- MCP
- pi22by7/In-Memoria
Exa MCP Server
Fast, efficient web and code context for AI coding assistants.
Exa MCP Server provides a Model Context Protocol (MCP) server interface that connects AI assistants to Exa AI’s powerful search capabilities, including code, documentation, and web search. It enables coding agents to retrieve precise, token-efficient context from billions of sources such as GitHub, StackOverflow, and documentation sites, reducing hallucinations. The platform supports integration with popular tools like Cursor, Claude, and VS Code through standardized MCP configuration, offering configurable access to various research and code-related tools via HTTP.
- ⭐ 3,224
- MCP
- exa-labs/exa-mcp-server
Semgrep MCP Server
A Model Context Protocol server powered by Semgrep for seamless code analysis integration.
Semgrep MCP Server implements the Model Context Protocol (MCP) to enable efficient and standardized communication for code analysis tasks. It facilitates integration with platforms like LM Studio, Cursor, and Visual Studio Code, providing both Docker and Python (PyPI) deployment options. The tool is now maintained in the main Semgrep repository with continued updates, enhancing compatibility and support across developer tools.
- ⭐ 611
- MCP
- semgrep/mcp
Code Declaration Lookup MCP Server
Fast, language-agnostic code declaration search and lookup server via MCP.
Provides a Model Context Protocol (MCP) server that indexes code declarations using universal ctags and SQLite with FTS5 full-text search. Offers search and listing functionality for functions, classes, structures, enums, and other code elements across any language supported by ctags. Enables seamless integration with coding agents for dynamic indexing, respects .gitignore, and supports ctags file ingestion and management.
- ⭐ 2
- MCP
- osinmv/function-lookup-mcp
Vectorize MCP Server
MCP server for advanced vector retrieval and text extraction with Vectorize integration.
Vectorize MCP Server is an implementation of the Model Context Protocol (MCP) that integrates with the Vectorize platform to enable advanced vector retrieval and text extraction. It supports seamless installation and integration within development environments such as VS Code. The server is configurable through environment variables or JSON configuration files and is suitable for use in collaborative and individual workflows requiring vector-based context management for models.
- ⭐ 97
- MCP
- vectorize-io/vectorize-mcp-server
Memory MCP
A Model Context Protocol server for managing LLM conversation memories with intelligent context window caching.
Memory MCP provides a Model Context Protocol (MCP) server for logging, retrieving, and managing memories from large language model (LLM) conversations. It offers features such as context window caching, relevance scoring, and tag-based context retrieval, leveraging MongoDB for persistent storage. The system is designed to efficiently archive, score, and summarize conversational context, supporting external orchestration and advanced memory management tools. This enables seamless handling of conversation history and dynamic context for enhanced LLM applications.
- ⭐ 10
- MCP
- JamesANZ/memory-mcp