memvid-mcp-server

A Streamable HTTP MCP Server for encoding and semantically searching video-based memory.

8 Stars · 3 Forks · 8 Watchers · 1 Issue
memvid-mcp-server provides a Model Context Protocol (MCP) compatible HTTP server that leverages memvid to encode text data into videos. It supports adding text chunks (encoded into video) and performing semantic search over them using standardized MCP actions such as add_chunks and search. The server integrates with MCP clients over streamable HTTP and enables fast context retrieval for AI applications.

Key Features

MCP protocol compliance
Text chunk encoding into video format
Semantic search over stored chunks
Streamable HTTP communication
Customizable search results (top_k)
Integration with MCP-compatible clients
Automatic memory.mp4 generation and reset
Example configuration provided (mcp-config.json)
Support for custom server port
Easy setup and deployment

Use Cases

Storing and retrieving AI model context as video-encoded memory
Integrating fast semantic search into AI pipelines
Serving context to LLMs using MCP standard protocols
Supporting applications that require efficient context lookup for large text datasets
Building streamable context servers for AI agents
Providing quick access to relevant information in conversational AI systems
Developing custom context stores for AI tools
Experimenting with video-based approaches to context management
Facilitating interoperability in multi-model AI environments
Enabling programmatic control of context memory through HTTP

README

memvid-mcp-server

A Streamable-HTTP MCP Server that uses memvid to encode text data into videos that can be quickly looked up with semantic search.

Supported Actions:

  • add_chunks: Adds chunks to the memory video. Note: each call resets memory.mp4; it is unclear whether chunks can be added incrementally.
  • search: Queries for the top-matching chunks. Returns 5 by default; adjustable via the top_k parameter.
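
As an illustration of what these tool invocations look like on the wire, the sketch below builds the JSON-RPC 2.0 `tools/call` requests an MCP client would send for the two actions. The argument names `chunks` and `query` are assumptions (only `top_k` appears in this README), so check the server's actual tool schema before relying on them:

```python
import json

def tool_call(call_id: int, name: str, arguments: dict) -> str:
    # Build a JSON-RPC 2.0 `tools/call` request, the MCP method
    # used to invoke a server-side tool by name with arguments.
    return json.dumps({
        "jsonrpc": "2.0",
        "id": call_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# Hypothetical argument shapes for this server's two tools:
add_req = tool_call(1, "add_chunks", {"chunks": ["first note", "second note"]})
search_req = tool_call(2, "search", {"query": "first", "top_k": 3})
print(search_req)
```

In practice you would not hand-build these payloads; an MCP client library (for example, the streamable-HTTP client in the official MCP SDK) performs the initialize handshake and request framing for you.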

Running

Set up your environment:

bash
python3.11 -m venv my_env
. ./my_env/bin/activate
pip install -r requirements.txt

Run the server:

bash
python server.py

With a custom port:

bash
PORT=3002 python server.py

Connect a Client

Once the server is running, you can connect an MCP client to it; configure the client per its own documentation. An example configuration is provided in mcp-config.json:

json
{
  "mcpServers": {
    "memvid": {
      "type": "streamable-http",
      "url": "http://localhost:3000"
    }
  }
}

Acknowledgements


Repository Owner

ferrants (User)

Repository Details

Language: Python
Default Branch: main
Size: 3 KB
Contributors: 1
License: Other
MCP Verified: Nov 12, 2025

Programming Languages

Python: 100%

Related MCPs

Discover similar Model Context Protocol servers

  • mcp-server-qdrant

    Official Model Context Protocol server for seamless integration with Qdrant vector search engine.

    mcp-server-qdrant provides an official implementation of the Model Context Protocol for interfacing with the Qdrant vector search engine. It enables storing and retrieving contextual information, acting as a semantic memory layer for LLM-driven applications. Designed for easy integration, it supports environment-based configuration and extensibility via FastMCP. The server standardizes tool interfaces for managing and querying contextual data using Qdrant.

    • 1,054
    • MCP
    • qdrant/mcp-server-qdrant
  • Mem0 MCP Server

    Structured management of coding preferences using Mem0 and Model Context Protocol.

    Mem0 MCP Server implements a Model Context Protocol-compliant server for storing, retrieving, and searching coding preferences. It integrates with Mem0 and offers tools for persistent management of code snippets, best practices, and technical documentation. The server exposes an SSE endpoint for clients like Cursor, enabling seamless access and interaction with coding context data.

    • 506
    • MCP
    • mem0ai/mem0-mcp
  • Memory MCP

    A Model Context Protocol server for managing LLM conversation memories with intelligent context window caching.

    Memory MCP provides a Model Context Protocol (MCP) server for logging, retrieving, and managing memories from large language model (LLM) conversations. It offers features such as context window caching, relevance scoring, and tag-based context retrieval, leveraging MongoDB for persistent storage. The system is designed to efficiently archive, score, and summarize conversational context, supporting external orchestration and advanced memory management tools. This enables seamless handling of conversation history and dynamic context for enhanced LLM applications.

    • 10
    • MCP
    • JamesANZ/memory-mcp
  • Weaviate MCP Server

    A server implementation for the Model Context Protocol (MCP) built on Weaviate.

    Weaviate MCP Server provides a backend implementation of the Model Context Protocol, enabling interaction with Weaviate for managing, inserting, and querying context objects. The server facilitates object insertion and hybrid search retrieval, supporting context-driven workflows required for LLM orchestration and memory management. It includes tools for building and running a client application, showcasing integration with Weaviate's vector database.

    • 157
    • MCP
    • weaviate/mcp-server-weaviate
  • VikingDB MCP Server

    MCP server for managing and searching VikingDB vector databases.

    VikingDB MCP Server is an implementation of the Model Context Protocol (MCP) that acts as a bridge between VikingDB, a high-performance vector database by ByteDance, and AI model context management frameworks. It allows users to store, upsert, and search vectorized information efficiently using standardized MCP commands. The server supports various operations on VikingDB collections and indexes, making it suitable for integrating advanced vector search in AI workflows.

    • 3
    • MCP
    • KashiwaByte/vikingdb-mcp-server
  • mcp-pinecone

    A Pinecone-backed Model Context Protocol server for semantic search and document management.

    mcp-pinecone implements a Model Context Protocol (MCP) server that integrates with Pinecone indexes for use with clients such as Claude Desktop. It provides powerful tools for semantic search, document reading, listing, and processing within a Pinecone vector database. The server supports operations like embedding, chunking, and upserting records, enabling contextual management of large document sets. Designed for ease of installation and interoperability via the MCP standard.

    • 150
    • MCP
    • sirmews/mcp-pinecone