MCPs tagged with vector database
-
Agentic Long-Term Memory with Notion Integration
Production-ready agentic long-term memory and Notion integration with Model Context Protocol support.
Agentic Long-Term Memory with Notion Integration gives AI agents long-term memory backed by both vector and graph databases. It offers comprehensive Notion workspace integration along with a production-ready Model Context Protocol (MCP) server supporting HTTP and stdio transports. The tool handles context management, tool discovery, and function chaining for complex agentic workflows.
- ⭐ 4
- MCP
- ankitmalik84/Agentic_Longterm_Memory
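Because the server speaks standard MCP over HTTP, any MCP client library can discover its memory and Notion tools. A minimal sketch using the Python MCP SDK's streamable HTTP client follows; the endpoint URL is an assumption, and the actual tool set is whatever the server advertises.

```python
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main() -> None:
    # Endpoint URL is an assumption; point this at wherever the server is running.
    async with streamablehttp_client("http://localhost:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Tool discovery: list the memory and Notion operations the server exposes.
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")

asyncio.run(main())
```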
-
Trieve
All-in-one solution for search, recommendations, and RAG.
Trieve offers a platform for semantic search, recommendations, and retrieval-augmented generation (RAG). It supports dense vector search, typo-tolerant neural search, and sub-sentence highlighting, and integrates with a variety of embedding models. Trieve can be self-hosted and provides APIs for context management with LLMs, including Bring Your Own Model support and managed RAG endpoints. Full documentation and SDKs are available for streamlined integration.
- ⭐ 2,555
- MCP
- devflowinc/trieve
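As a rough illustration of the API-first design, the snippet below sketches a semantic search request against a Trieve deployment. The base URL, endpoint path, headers, and field names are assumptions to verify against the official Trieve API documentation.

```python
import requests

TRIEVE_API = "https://api.trieve.ai/api"  # or your self-hosted base URL (assumed)

# Placeholder credentials and dataset ID; header and field names are assumptions.
resp = requests.post(
    f"{TRIEVE_API}/chunk/search",
    headers={
        "Authorization": "tr-************",
        "TR-Dataset": "your-dataset-id",
        "Content-Type": "application/json",
    },
    json={"query": "how do I rotate my API keys?", "search_type": "semantic"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # matching chunks with relevance scores
```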
-
MCP Local RAG
Privacy-first local semantic document search server for MCP clients.
MCP Local RAG is a privacy-preserving, local document search server designed for use with Model Context Protocol (MCP) clients such as Cursor, Codex, and Claude Code. It enables users to ingest and semantically search local documents without using external APIs or cloud services. All processing, including embedding generation and vector storage, is performed on the user's machine. The tool supports document ingestion, semantic search, file management, file deletion, and system status reporting through MCP.
- ⭐ 10
- MCP
- shinpr/mcp-local-rag
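A minimal sketch of driving the server from the Python MCP SDK over stdio. The launch command, tool name, and argument names are placeholders; list the tools first to see what the server actually exposes.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Placeholder launch command -- use the one documented in the mcp-local-rag README.
params = StdioServerParameters(command="npx", args=["-y", "mcp-local-rag"])

async def main() -> None:
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("available tools:", [t.name for t in tools.tools])
            # Hypothetical semantic-search tool; name and arguments are assumptions.
            result = await session.call_tool(
                "search_documents", arguments={"query": "quarterly revenue by region"}
            )
            for block in result.content:
                if getattr(block, "text", None):
                    print(block.text)

asyncio.run(main())
```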
-
Sourcerer MCP
Semantic code search & navigation MCP server for efficient AI agent context retrieval.
Sourcerer MCP provides a Model Context Protocol (MCP) server that enables AI agents to perform semantic code search and navigation. By indexing codebases at the function, class, and chunk level, it allows agents to retrieve only the necessary code snippets, greatly reducing token consumption. The tool integrates with Tree-sitter for language parsing and OpenAI for generating code embeddings, supporting advanced contextual code understanding without full file ingestion.
- ⭐ 95
- MCP
- st3v3nmw/sourcerer-mcp
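The same stdio pattern applies here; the sketch below assumes a `sourcerer-mcp` binary on PATH, an OpenAI key for embeddings, and a semantic code-search tool, all of which are assumptions to confirm against the project's README.

```python
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch command, env var, and tool/argument names are illustrative assumptions.
params = StdioServerParameters(
    command="sourcerer-mcp",
    args=[],
    env={"OPENAI_API_KEY": os.environ.get("OPENAI_API_KEY", "")},  # embeddings provider key
)

async def main() -> None:
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Returns only matching chunks rather than whole files,
            # which is what keeps the agent's token usage low.
            result = await session.call_tool(
                "semantic_search", arguments={"query": "where are JWT tokens validated?"}
            )
            for block in result.content:
                if getattr(block, "text", None):
                    print(block.text)

asyncio.run(main())
```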
-
MCP Server for Milvus
Bridge Milvus vector database with AI apps using Model Context Protocol (MCP).
MCP Server for Milvus enables seamless integration between the Milvus vector database and large language model (LLM) applications via the Model Context Protocol. It exposes Milvus functionality to external LLM-powered tools over both stdio and Server-Sent Events (SSE) transports and is compatible with MCP-enabled clients such as Claude Desktop and Cursor, giving them direct access to relevant vector data. Configuration is flexible through environment variables or command-line arguments.
- ⭐ 196
- MCP
- zilliztech/mcp-server-milvus
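Below is a sketch of launching the server over stdio from the Python MCP SDK and discovering the Milvus operations it exposes as tools. The launch command and the `--milvus-uri` flag are assumptions drawn from the entry's description of CLI and environment configuration; the README gives the exact invocation, and an SSE mode is available for remote clients.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch command and flag name are assumptions; configuration can also be
# supplied via environment variables per the project's documentation.
params = StdioServerParameters(
    command="uv",
    args=["run", "mcp-server-milvus", "--milvus-uri", "http://localhost:19530"],
)

async def main() -> None:
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Each Milvus operation (search, query, collection management, ...)
            # shows up here as an MCP tool the client can call.
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")

asyncio.run(main())
```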
-
mcp-pinecone
A Pinecone-backed Model Context Protocol server for semantic search and document management.
mcp-pinecone implements a Model Context Protocol (MCP) server that integrates with Pinecone indexes for use with clients such as Claude Desktop. It provides tools for semantic search, document reading, listing, and processing within a Pinecone vector database. The server supports operations such as embedding, chunking, and upserting records, enabling contextual management of large document sets, and is designed for easy installation and interoperability via the MCP standard.
- ⭐ 150
- MCP
- sirmews/mcp-pinecone
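A minimal sketch, assuming a `uvx mcp-pinecone` launch and a semantic-search tool; the flag names, tool name, and arguments are assumptions to check against the repository's README.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Index name and API key are placeholders; flag names are assumptions.
params = StdioServerParameters(
    command="uvx",
    args=["mcp-pinecone", "--index-name", "my-index", "--api-key", "pc-************"],
)

async def main() -> None:
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Hypothetical semantic-search tool over the Pinecone index.
            result = await session.call_tool(
                "semantic-search", arguments={"query": "onboarding checklist", "top_k": 5}
            )
            for block in result.content:
                if getattr(block, "text", None):
                    print(block.text)

asyncio.run(main())
```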
-
VikingDB MCP Server
MCP server for managing and searching VikingDB vector databases.
VikingDB MCP Server is an implementation of the Model Context Protocol (MCP) that bridges VikingDB, ByteDance's high-performance vector database, with MCP-compatible AI clients. It allows users to store, upsert, and search vectorized information efficiently using standardized MCP commands. The server supports various operations on VikingDB collections and indexes, making it suitable for integrating advanced vector search into AI workflows.
- ⭐ 3
- MCP
- KashiwaByte/vikingdb-mcp-server
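An illustrative upsert-then-search round trip over stdio. The launch command, credential variables, and tool and argument names below are assumptions; consult the vikingdb-mcp-server README for the real ones.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Everything server-specific here (command, env vars, tool names) is assumed.
params = StdioServerParameters(
    command="uvx",
    args=["vikingdb-mcp-server"],
    env={"VIKINGDB_AK": "your-access-key", "VIKINGDB_SK": "your-secret-key"},
)

async def main() -> None:
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Hypothetical upsert into a collection, then a vector search against it.
            await session.call_tool(
                "upsert_information",
                arguments={"collection_name": "docs", "information": ["VikingDB stores vectors."]},
            )
            hits = await session.call_tool(
                "search_information",
                arguments={"collection_name": "docs", "query": "what does VikingDB store?"},
            )
            print(hits.content)

asyncio.run(main())
```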
-
Chroma MCP Server
A self-hosted Model Context Protocol (MCP) server for Chroma vector database integration.
Chroma MCP Server implements the Model Context Protocol to allow seamless integration between LLM applications and external data using the Chroma embedding database. It enables AI models to create, manage, and query collections with advanced vector search, full text search, and metadata filtering. The server supports both ephemeral and persistent client types, along with integration for HTTP and cloud-based Chroma instances. Multiple embedding functions, collection management tools, and rich document operations are available for extensible LLM workflows.
- ⭐ 418
- MCP
- chroma-core/chroma-mcp
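A sketch of a persistent local setup driven from the Python MCP SDK. The `--client-type` and `--data-dir` flags and the collection tool names used below are assumptions to verify against the chroma-mcp documentation; ephemeral, HTTP, and cloud client types are selected the same way with different flags.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Flags and tool names are assumptions; a persistent client keeps vectors on disk.
params = StdioServerParameters(
    command="uvx",
    args=["chroma-mcp", "--client-type", "persistent", "--data-dir", "./chroma-data"],
)

async def main() -> None:
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Hypothetical create-collection and query round trip.
            await session.call_tool(
                "chroma_create_collection", arguments={"collection_name": "notes"}
            )
            result = await session.call_tool(
                "chroma_query_documents",
                arguments={"collection_name": "notes", "query_texts": ["release checklist"], "n_results": 3},
            )
            print(result.content)

asyncio.run(main())
```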