memvid-mcp-server
A Streamable-HTTP MCP Server that uses memvid to encode text data into videos that can be quickly looked up with semantic search.
Supported Actions:
- add_chunks: adds text chunks to the memory video. Note: each call rebuilds memory.mp4 from scratch; it is unclear whether chunks can be added incrementally (see the sketch after this list).
- search: returns the top-matching chunks for a query. Returns 5 results by default, but this can be changed with the top_k parameter.
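For context, here is a minimal sketch of the memvid calls these tools likely map onto. It is an illustration only, not the server's actual code; the chunk text and the memory.mp4 / memory_index.json file names are assumptions based on memvid's usual defaults.

# Illustrative only: assumed mapping of the MCP tools onto memvid's API.
from memvid import MemvidEncoder, MemvidRetriever

# add_chunks: encode all chunks into a fresh video + index
# (this is why the memory is effectively reset on every call).
encoder = MemvidEncoder()
encoder.add_chunks(["first text chunk", "second text chunk"])  # placeholder chunks
encoder.build_video("memory.mp4", "memory_index.json")

# search: open the video + index and run semantic search.
retriever = MemvidRetriever("memory.mp4", "memory_index.json")
results = retriever.search("what was in the first chunk?", top_k=5)  # 5 is the default
print(results)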
Running
Set up your environment:
python3.11 -m venv my_env
. ./my_env/bin/activate
pip install -r requirements.txt
Run the server:
python server.py
With a custom port:
PORT=3002 python server.py
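For orientation, a streamable-HTTP MCP server that honors the PORT variable could be wired up with FastMCP roughly as below. This is a hedged sketch, not necessarily how server.py is implemented; the stubbed search tool and its signature are assumptions.

# Hypothetical sketch of a PORT-aware streamable-HTTP server; server.py may differ.
import os
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("memvid", port=int(os.environ.get("PORT", "3000")))

@mcp.tool()
def search(query: str, top_k: int = 5) -> list[str]:
    """Return the top_k chunks that best match the query."""
    # Placeholder body: the real tool would call memvid's retriever here.
    return []

if __name__ == "__main__":
    # Serve over Streamable HTTP instead of the default stdio transport.
    mcp.run(transport="streamable-http")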
Connect a Client
Once the server is running, you can connect an MCP client to it. Configure the client according to its own documentation. An example configuration is provided in mcp-config.json and looks like this:
{
"mcpServers": {
"memvid": {
"type": "streamable-http",
"url": "http://localhost:3000"
}
}
}
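You can also exercise the server programmatically with the MCP Python SDK's streamable-HTTP client. The sketch below rests on a few assumptions: the endpoint path (shown here as /mcp) and the tool argument names (chunks, query, top_k) may differ in the actual server, so adjust them to match.

# Hypothetical client session; endpoint path and argument names are assumptions.
import asyncio
from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main() -> None:
    async with streamablehttp_client("http://localhost:3000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            await session.call_tool("add_chunks", {"chunks": ["hello from memvid"]})
            result = await session.call_tool("search", {"query": "hello", "top_k": 3})
            print(result.content)

asyncio.run(main())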
Acknowledgements
- Obviously, the modelcontextprotocol and Anthropic teams for the MCP Specification: https://modelcontextprotocol.io/introduction
- HeyFerrante for enabling and sponsoring this project.
Related MCPs
Discover similar Model Context Protocol servers
mcp-server-qdrant
Official Model Context Protocol server for seamless integration with Qdrant vector search engine.
mcp-server-qdrant provides an official implementation of the Model Context Protocol for interfacing with the Qdrant vector search engine. It enables storing and retrieving contextual information, acting as a semantic memory layer for LLM-driven applications. Designed for easy integration, it supports environment-based configuration and extensibility via FastMCP. The server standardizes tool interfaces for managing and querying contextual data using Qdrant.
- ⭐ 1,054
- MCP
- qdrant/mcp-server-qdrant
Mem0 MCP Server
Structured management of coding preferences using Mem0 and Model Context Protocol.
Mem0 MCP Server implements a Model Context Protocol-compliant server for storing, retrieving, and searching coding preferences. It integrates with Mem0 and offers tools for persistent management of code snippets, best practices, and technical documentation. The server exposes an SSE endpoint for clients like Cursor, enabling seamless access and interaction with coding context data.
- ⭐ 506
- MCP
- mem0ai/mem0-mcp
Memory MCP
A Model Context Protocol server for managing LLM conversation memories with intelligent context window caching.
Memory MCP provides a Model Context Protocol (MCP) server for logging, retrieving, and managing memories from large language model (LLM) conversations. It offers features such as context window caching, relevance scoring, and tag-based context retrieval, leveraging MongoDB for persistent storage. The system is designed to efficiently archive, score, and summarize conversational context, supporting external orchestration and advanced memory management tools. This enables seamless handling of conversation history and dynamic context for enhanced LLM applications.
- ⭐ 10
- MCP
- JamesANZ/memory-mcp
Weaviate MCP Server
A server implementation for the Model Context Protocol (MCP) built on Weaviate.
Weaviate MCP Server provides a backend implementation of the Model Context Protocol, enabling interaction with Weaviate for managing, inserting, and querying context objects. The server facilitates object insertion and hybrid search retrieval, supporting context-driven workflows required for LLM orchestration and memory management. It includes tools for building and running a client application, showcasing integration with Weaviate's vector database.
- ⭐ 157
- MCP
- weaviate/mcp-server-weaviate
VikingDB MCP Server
MCP server for managing and searching VikingDB vector databases.
VikingDB MCP Server is an implementation of the Model Context Protocol (MCP) that acts as a bridge between VikingDB, a high-performance vector database by ByteDance, and AI model context management frameworks. It allows users to store, upsert, and search vectorized information efficiently using standardized MCP commands. The server supports various operations on VikingDB collections and indexes, making it suitable for integrating advanced vector search in AI workflows.
- ⭐ 3
- MCP
- KashiwaByte/vikingdb-mcp-server
mcp-pinecone
A Pinecone-backed Model Context Protocol server for semantic search and document management.
mcp-pinecone implements a Model Context Protocol (MCP) server that integrates with Pinecone indexes for use with clients such as Claude Desktop. It provides powerful tools for semantic search, document reading, listing, and processing within a Pinecone vector database. The server supports operations like embedding, chunking, and upserting records, enabling contextual management of large document sets. Designed for ease of installation and interoperability via the MCP standard.
- ⭐ 150
- MCP
- sirmews/mcp-pinecone