OpenZIM MCP Server - Alternatives & Competitors
Transforms ZIM archives into intelligent, structured knowledge engines for LLMs.
OpenZIM MCP Server provides structured, intelligent access to ZIM-format knowledge bases, enabling large language models to efficiently search, navigate, and understand content in offline archives. Dual operation modes support both advanced and simple LLM integrations. It features smart navigation by namespace, context-aware discovery, intelligent search, and relationship mapping to optimize knowledge extraction and utilization.
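Below is a minimal sketch of how an MCP client could connect to the server over stdio and invoke a search tool, using the official MCP Python SDK. The launch command, flags, and tool name are illustrative assumptions rather than details taken from the project; a real client would discover the actual tools via list_tools().

```python
# Minimal sketch using the official MCP Python SDK (pip install mcp).
# The launch command ("openzim-mcp"), the ZIM directory flag, and the tool
# name ("search_zim") are assumptions for illustration only.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="openzim-mcp",              # hypothetical launch command
    args=["--zim-dir", "/data/zim"],    # hypothetical flag and path
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover whatever tools the server actually exposes.
            tools = await session.list_tools()
            print([t.name for t in tools.tools])
            # Hypothetical search call against an offline ZIM archive.
            result = await session.call_tool(
                "search_zim", arguments={"query": "Ada Lovelace", "limit": 5}
            )
            print(result.content)

asyncio.run(main())
```

The same client pattern (stdio launch, initialize, list_tools, call_tool) applies to every server in the list below; later sketches only vary the launch configuration and tool calls.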
Ranked by Relevance
1
OpenStreetMap MCP Server
Enhancing LLMs with geospatial and location-based capabilities via the Model Context Protocol.
OpenStreetMap MCP Server enables large language models to interact with rich geospatial data and location-based services through a standardized protocol. It provides APIs and tools for address geocoding, reverse geocoding, points of interest search, route directions, and neighborhood analysis. The server exposes location-related resources and tools, making it compatible with MCP hosts for seamless LLM integration.
134 30 MCP -
2
Zettelkasten MCP Server
A Zettelkasten-based knowledge management system implementing the Model Context Protocol.
Zettelkasten MCP Server provides an implementation of the Zettelkasten note-taking methodology, enriched with bidirectional linking, semantic relationships, and categorization of notes. It enables creation, exploration, and synthesis of atomic knowledge using MCP for AI-assisted workflows. The system integrates with clients such as Claude and supports markdown, advanced search, and a structured prompt framework for large language models. The dual storage architecture and synchronous operation model ensure flexibility and reliability for managing personal or collaborative knowledge bases.
114 21 MCP -
3
OpenAI WebSearch MCP Server
Intelligent web search with OpenAI reasoning model support, fully MCP-compatible.
OpenAI WebSearch MCP Server provides advanced web search functionality integrated with OpenAI's latest reasoning models, such as gpt-5 and the o3 series. It is fully compatible with the Model Context Protocol, enabling easy integration into AI assistants that require up-to-date information and contextual awareness. It offers flexible configuration options, smart reasoning-effort controls, and location-based search customization, and is suited to environments such as Claude Desktop, Cursor, and automated research workflows.
75 18 MCP -
4
Memory MCP
A Model Context Protocol server for managing LLM conversation memories with intelligent context window caching.
Memory MCP provides a Model Context Protocol (MCP) server for logging, retrieving, and managing memories from large language model (LLM) conversations. It offers features such as context window caching, relevance scoring, and tag-based context retrieval, leveraging MongoDB for persistent storage. The system is designed to efficiently archive, score, and summarize conversational context, supporting external orchestration and advanced memory management tools. This enables seamless handling of conversation history and dynamic context for enhanced LLM applications.
10 6 MCP -
5
Kibela MCP Server
MCP server for seamless LLM integration with Kibela knowledge management.
Kibela MCP Server enables integration of Large Language Models (LLMs) with the Kibela note-sharing platform via the Model Context Protocol. It provides search, retrieval, and management of Kibela notes, users, groups, and folders, exposing these capabilities in a standardized MCP interface. The implementation utilizes Kibela's GraphQL API and supports configuration through environment variables and Docker. Designed for interoperability with tools like Cursor, it streamlines access and manipulation of organizational knowledge by AI systems.
7 6 MCP -
6
ZoomEye MCP Server
Real-time cyberspace asset intelligence for AI assistants via Model Context Protocol.
ZoomEye MCP Server implements the Model Context Protocol (MCP) to provide network asset intelligence to AI assistants and development tools. It enables querying of global internet assets through ZoomEye's cyber asset search engine using structured parameters and dorks. The server includes features like caching, error handling, and compatibility with leading MCP environments, supporting real-time cyber asset data integration for various AI and developer platforms.
50 13 MCP -
7
godoc-mcp
Token-efficient Go documentation server for LLMs using Model Context Protocol.
godoc-mcp is a Model Context Protocol (MCP) server that provides efficient, structured access to Go package documentation for large language models. It enables LLMs to understand Go projects without reading entire source files by supplying essential documentation and source code at varying levels of granularity. The tool supports project navigation, automatic module setup, caching, and works offline for both standard and third-party Go packages.
88 11 MCP -
8
GeoServer MCP Server
Connect LLMs to GeoServer for geospatial data management and AI-driven queries.
GeoServer MCP Server implements the Model Context Protocol, enabling seamless integration between Large Language Models (LLMs) and the GeoServer REST API. It allows AI assistants to interact with, query, and manipulate geospatial data and services through standardized interfaces. The server supports management of workspaces, layers, and spatial queries, as well as rendering geospatial visualizations. Installation is supported via Docker, pip, and integration tools like Smithery, with compatibility for clients such as Claude Desktop and Cursor.
43 9 MCP -
9
MemoryMesh
A knowledge graph server for structured AI memory and context management.
MemoryMesh is a knowledge graph server designed to help AI models maintain structured, consistent memory, especially for interactive storytelling and RPG contexts. It is built on the Model Context Protocol (MCP) and retains core MCP server functionality. Using dynamic, schema-based configuration, the server enables creation and management of nodes and relationships, offering comprehensive tools for data integrity, feedback, and event tracking. MemoryMesh emphasizes flexibility, supporting both predefined and dynamic schemas for guiding AI interactions.
313 43 MCP -
10
RAG Documentation MCP Server
Vector-based documentation search and context augmentation for AI assistants
RAG Documentation MCP Server provides vector-based search and retrieval tools for documentation, enabling large language models to reference relevant context in their responses. It supports managing multiple documentation sources, semantic search, and real-time context delivery. Documentation can be indexed, searched, and managed with queueing and processing features, making it highly suitable for AI-driven assistants. Integration with Claude Desktop and support for Qdrant vector databases is also available.
238 29 MCP -
11
OpenTK Model Context Protocol Server
A standardized interface for LLMs to access Dutch parliamentary data.
OpenTK Model Context Protocol Server provides a bridge between large language models and Dutch parliamentary data using the Model Context Protocol (MCP). It enables AI systems to access, search, and analyze parliamentary documents, debates, and member information from the Tweede Kamer through a unified and structured interface. By leveraging the @modelcontextprotocol/sdk, it ensures consistent context management for model interactions. Built atop the OpenTK project, it delivers streamlined access to extensive open government datasets.
16 3 MCP -
12
GIS MCP Server
Empower AI with advanced geospatial operations via Model Context Protocol.
GIS MCP Server provides a Model Context Protocol (MCP) server implementation that enables Large Language Models to access and perform sophisticated GIS operations. It bridges AI assistants with Python geospatial libraries such as Shapely, GeoPandas, PyProj, Rasterio, and PySAL. The server supports a wide range of spatial analysis, coordinate transformations, raster and vector data processing, and geospatial intelligence tasks. By integrating with MCP-compatible clients, it enhances AI tools with precise and extensible spatial capabilities.
70 21 MCP -
13
Graphlit MCP Server
Integrate and unify knowledge sources for RAG-ready AI context with the Graphlit MCP Server.
Graphlit MCP Server provides a Model Context Protocol interface, enabling seamless integration between MCP clients and the Graphlit platform. It supports ingestion from a wide array of sources such as Slack, Discord, Google Drive, email, Jira, and GitHub, turning them into a searchable, RAG-ready knowledge base. Built-in tools cover document and media extraction, web crawling, and web search, as well as advanced retrieval and publishing functionality. The server facilitates easy configuration, sophisticated data operations, and automated notifications for diverse workflows.
369 49 MCP -
14
Cross-LLM MCP Server
Unified MCP server for accessing and combining multiple LLM APIs.
Cross-LLM MCP Server is a Model Context Protocol (MCP) server enabling seamless access to a range of Large Language Model APIs including ChatGPT, Claude, DeepSeek, Gemini, Grok, Kimi, Perplexity, and Mistral. It provides a unified interface for invoking different LLMs from any MCP-compatible client, allowing users to call and aggregate responses across providers. The server implements eight specialized tools for interacting with these LLMs, each offering configurable options like model selection, temperature, and token limits. Output includes model context details as well as token usage statistics for each response.
9 5 MCP -
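The Cross-LLM entry above describes per-provider tools with configurable model selection, temperature, and token limits. The sketch below shows what one such call could look like through the MCP Python SDK; the launch command, tool name, and argument keys are assumptions for illustration, and the model identifier is a placeholder.

```python
# Hedged sketch: invoking one of the per-provider tools via the MCP Python SDK.
# "cross-llm-mcp", "ask_claude", and the argument keys are assumed names;
# check the server's list_tools() output for the real interface.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(command="cross-llm-mcp")  # assumed command

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "ask_claude",  # assumed tool name
                arguments={
                    "prompt": "Summarize the MCP specification in two sentences.",
                    "model": "<model-id>",   # placeholder model identifier
                    "temperature": 0.2,
                    "max_tokens": 300,
                },
            )
            print(result.content)

asyncio.run(main())
```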
15
MCP Local RAG
Privacy-first local semantic document search server for MCP clients.
MCP Local RAG is a privacy-preserving, local document search server designed for use with Model Context Protocol (MCP) clients such as Cursor, Codex, and Claude Code. It enables users to ingest and semantically search local documents without using external APIs or cloud services. All processing, including embedding generation and vector storage, is performed on the user's machine. The tool supports document ingestion, semantic search, file management, file deletion, and system status reporting through MCP.
10 3 MCP -
16
TrackMage MCP Server
Shipment and logistics tracking MCP server with multi-carrier support.
TrackMage MCP Server implements the Model Context Protocol (MCP) to provide shipment tracking, logistics management, and API integration for over 1,600 carriers worldwide. It allows integration with major LLMs, supports resources such as workspaces, shipments, orders, carriers, and tracking statuses, and offers tools to create, update, and monitor shipments and orders. The server supports OAuth-based authentication, flexible configuration via environment variables, and can be deployed locally for customized logistics operations.
1 4 MCP -
17
MCP-Geo
Geocoding and reverse geocoding MCP server for LLMs.
MCP-Geo provides geocoding and reverse geocoding capabilities to AI models using the Model Context Protocol, powered by the GeoPy library. It offers tools such as address lookup, reverse lookup from coordinates, distance calculations, and batch processing of locations, all accessible via standard MCP tool interfaces. Safety features like rate limiting and robust error handling ensure reliable and compliant use of geocoding services. The server is compatible with environments like Claude Desktop and can be easily configured elsewhere.
28 4 MCP -
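A hedged sketch of forward and reverse geocoding calls against MCP-Geo follows; the launch command and the tool and argument names are assumptions made for illustration.

```python
# Hedged sketch: forward and reverse geocoding through MCP-Geo.
# Launch command and tool/argument names are assumed, not documented here.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(command="uv", args=["run", "mcp-geo"])  # assumed

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            geocoded = await session.call_tool(
                "geocode_location",
                arguments={"location": "Brandenburg Gate, Berlin"},
            )
            reverse = await session.call_tool(
                "reverse_geocode",
                arguments={"latitude": 52.5163, "longitude": 13.3777},
            )
            print(geocoded.content, reverse.content)

asyncio.run(main())
```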
18
Ragie Model Context Protocol Server
Seamless knowledge base retrieval via Model Context Protocol for enhanced AI context.
Ragie Model Context Protocol Server enables AI models to access and retrieve information from a Ragie-managed knowledge base using the standardized Model Context Protocol (MCP). It provides a retrieve tool with customizable query options and supports integration with tools like Cursor and Claude Desktop. Users can configure API keys, specify partitions, and override tool descriptions. Designed for rapid setup via npx and flexible for project-specific or global usage.
81 18 MCP -
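The Ragie entry notes rapid setup via npx with API keys and partitions configured by the user. A hedged launch sketch is below; the npm package name, partition flag, and environment variable are assumptions, so consult the project's README for the real values.

```python
# Hedged sketch of launching the Ragie server over stdio via npx.
# Package name, flag, and environment variable are assumed placeholders.
from mcp import StdioServerParameters

ragie_server = StdioServerParameters(
    command="npx",
    args=["-y", "@ragieai/mcp-server", "--partition", "default"],  # assumed
    env={"RAGIE_API_KEY": "YOUR_API_KEY"},                         # assumed
)
```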
19
MCP Zotero
Model Context Protocol server for seamless Zotero integration with AI tools.
MCP Zotero provides a Model Context Protocol server enabling AI models such as Claude to access and interact with Zotero libraries. Users can securely link their Zotero accounts and perform actions including listing collections, retrieving papers, searching the library, and getting details about specific items. Integration is designed for both standalone operation and as an extension for tools like Claude Desktop.
137 17 MCP -
20
OSP Marketing Tools for LLMs
Comprehensive marketing content creation and optimization tools for LLMs using MCP.
OSP Marketing Tools for LLMs offers a suite of marketing content creation and optimization utilities designed to operate with Large Language Models that support the Model Context Protocol (MCP). Built on Open Strategy Partners’ proprietary methodologies, it provides structured workflows for product value mapping, metadata generation, content editing, technical writing, and SEO guidance. The suite includes features for persona development, value case documentation, semantic editing, and technical writing best practices, enabling consistent and high-quality marketing outputs. Designed to integrate seamlessly with MCP-compatible LLM clients, it streamlines complex marketing processes and empowers efficient collaboration across technical and non-technical teams.
252 40 MCP -
21
In Memoria
Persistent memory and instant context for AI coding assistants, integrated via MCP.
In Memoria is an MCP server that enables AI coding assistants such as Claude or Copilot to retain, recall, and provide context about codebases across sessions. It learns patterns, architecture, and conventions from user code, offering persistent intelligence that eliminates repetitive explanations and generic suggestions. Through the Model Context Protocol, it allows AI tools to perform semantic search, smart file routing, and track project-specific decisions efficiently.
94 17 MCP -
22
Membase-MCP Server
Decentralized memory layer server for AI agents using the Model Context Protocol.
Membase-MCP Server provides decentralized and persistent storage of conversation history and agent knowledge for AI agents using Unibase and the Model Context Protocol. It supports secure, traceable storage and retrieval of messages to ensure agent continuity and personalization within interactions. The server offers integration with Claude, Windsurf, Cursor, and Cline, allowing dynamic context management such as switching conversations and saving or retrieving messages. The server leverages the Unibase DA network for verifiable storage and agent data interoperability.
15 4 MCP -
23
Octagon Deep Research MCP
AI-powered, enterprise-grade deep research server for MCP clients.
Octagon Deep Research MCP provides specialized AI-driven research and analysis via seamless integration with MCP-enabled applications. It offers comprehensive multi-source data aggregation, advanced analysis tools, and generates in-depth reports across various domains. The solution emphasizes high performance with no rate limits, enterprise-grade speed, and universal compatibility for teams needing thorough research capabilities.
70 12 MCP -
24
Context7 MCP
Up-to-date code docs for every AI prompt.
Context7 MCP delivers current, version-specific documentation and code examples directly into large language model prompts. By integrating with model workflows, it ensures responses are accurate and based on the latest source material, reducing outdated and hallucinated code. Users can fetch relevant API documentation and examples by simply adding a directive to their prompts. This allows for more reliable, context-rich answers tailored to real-world programming scenarios.
36,881 1,825 MCP -
25
Trieve
All-in-one solution for search, recommendations, and RAG.
Trieve offers a platform for semantic search, recommendations, and retrieval-augmented generation (RAG). It supports dense vector search, typo-tolerant neural search, sub-sentence highlighting, and integrates with a variety of embedding models. Trieve can be self-hosted and features APIs for context management with LLMs, including Bring Your Own Model and managed RAG endpoints. Full documentation and SDKs are available for streamlined integration.
2,555 229 MCP -
26
MCP Server for Google Tag Manager
Remote MCP server enabling Google Tag Manager integration with AI clients.
MCP Server for Google Tag Manager enables remote MCP connections with built-in Google OAuth, creating an interface to the Google Tag Manager API. It facilitates secure authentication and streamlined access for AI tools like Claude Desktop and Cursor AI. Developers can quickly configure their MCP clients for seamless integration and manage credentials with ease. Tools and workflows become accessible once authenticated, enhancing contextual interaction and automation through Google Tag Manager.
70 29 MCP -
27
ONES Wiki MCP Server
Spring AI MCP-based service for extracting and transforming ONES Wiki content for AI applications.
ONES Wiki MCP Server provides an MCP-compliant service built on Spring AI MCP for retrieving and converting ONES Wiki content into structured, AI-friendly formats. It supports authentication with the ONES platform, automatic translation of Wiki URLs to API endpoints, and outputs processed content as Markdown. The service can be configured through properties, command-line arguments, or environment variables, and integrates with MCP-compatible clients such as Claude Desktop.
2 2 MCP -
28
RivalSearchMCP
Advanced MCP server for web research, discovery, and trend analysis.
RivalSearchMCP is an advanced Model Context Protocol (MCP) server designed to streamline web research, content discovery, and trend analysis. It offers tools for multi-engine web search, intelligent content retrieval, website analysis, and AI-driven content insights. The platform includes integrated trends analysis, research workflows with progress tracking, and automated generation of LLMs.txt documentation files. Its anti-detection features, real-time content streaming, and flexible data export options make it ideal for complex research and automation workflows.
10 7 MCP -
29
mcp-server-qdrant
Official Model Context Protocol server for seamless integration with Qdrant vector search engine.
mcp-server-qdrant provides an official implementation of the Model Context Protocol for interfacing with the Qdrant vector search engine. It enables storing and retrieving contextual information, acting as a semantic memory layer for LLM-driven applications. Designed for easy integration, it supports environment-based configuration and extensibility via FastMCP. The server standardizes tool interfaces for managing and querying contextual data using Qdrant.
1,054 187 MCP -
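Since the mcp-server-qdrant entry highlights environment-based configuration, here is a hedged sketch of passing that configuration from a Python MCP client. The launcher and the environment variable names are assumptions; verify them against the project's documentation.

```python
# Hedged sketch: environment-based configuration for mcp-server-qdrant.
# The uvx launcher and variable names below are assumed placeholders.
from mcp import StdioServerParameters

qdrant_server = StdioServerParameters(
    command="uvx",
    args=["mcp-server-qdrant"],
    env={
        "QDRANT_URL": "http://localhost:6333",  # assumed variable name
        "COLLECTION_NAME": "my-notes",          # assumed variable name
    },
)
```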
30
@reapi/mcp-openapi
Serve multiple OpenAPI specs for LLM-powered IDE integrations via the Model Context Protocol.
@reapi/mcp-openapi is a Model Context Protocol (MCP) server that loads and serves multiple OpenAPI specifications, making APIs available to LLM-powered IDEs and development tools. It enables Large Language Models to access, interpret, and work directly with OpenAPI docs within code editors such as Cursor. The server supports dereferenced schemas, maintains an API catalog, and offers project-specific or global configuration. Sponsored by ReAPI, it bridges the gap between API specifications and AI-powered developer environments.
71 13 MCP -
31
Open Data Model Context Protocol
Easily connect open data providers to LLMs using a Model Context Protocol server and CLI.
Open Data Model Context Protocol enables seamless integration of open public datasets into Large Language Model (LLM) applications, starting with support for Claude. Through a CLI tool and server, users can access and query public data providers within their LLM clients. It also offers tools and templates for contributors to publish and distribute new open datasets, making data discoverable and actionable for LLM queries.
140 21 MCP -
32
mcp-open-library
Model Context Protocol server for accessing Open Library book and author data.
mcp-open-library provides an implementation of a Model Context Protocol (MCP) server that enables AI assistants and clients to search and retrieve book and author information from the Open Library API. It supports searching by title, author name, and various identifiers, as well as fetching author photos and book covers, and returns structured, machine-readable data suitable for AI model context integration. Installation is offered via Smithery, manual setup, or Docker deployment.
34 9 MCP -
33
LlamaCloud MCP Server
Connect multiple LlamaCloud indexes as tools for your MCP client.
LlamaCloud MCP Server is a TypeScript-based implementation of a Model Context Protocol server that allows users to connect multiple managed indexes from LlamaCloud as separate tools in MCP-compatible clients. Each tool is defined via command-line parameters, enabling flexible and dynamic access to different document indexes. The server automatically generates tool interfaces, each capable of querying its respective LlamaCloud index, with customizable parameters such as index name, description, and result limits. Designed for seamless integration, it works with clients like Claude Desktop, Windsurf, and Cursor.
82 17 MCP -
34
Weaviate MCP Server
A server implementation for the Model Context Protocol (MCP) built on Weaviate.
Weaviate MCP Server provides a backend implementation of the Model Context Protocol, enabling interaction with Weaviate for managing, inserting, and querying context objects. The server facilitates object insertion and hybrid search retrieval, supporting context-driven workflows required for LLM orchestration and memory management. It includes tools for building and running a client application, showcasing integration with Weaviate's vector database.
157 38 MCP -
35
MCP Language Server
Bridge codebase navigation tools to AI models using MCP-enabled language servers.
MCP Language Server implements the Model Context Protocol, allowing MCP-enabled clients, such as LLMs, to interact with language servers for codebase navigation. It exposes standard language server features—like go to definition, references, rename, and diagnostics—over MCP for seamless integration with AI tooling. The server supports multiple languages by serving as a proxy to underlying language servers, including gopls, rust-analyzer, and pyright.
1,256 94 MCP -
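The MCP Language Server entry describes exposing language-server features such as go-to-definition over MCP. Below is a hedged sketch of requesting a symbol definition; the launch flags, tool name, and argument key are assumptions, and list_tools() shows what the server really exposes.

```python
# Hedged sketch: asking the language-server bridge for a symbol definition.
# Launch flags and the "definition"/"symbolName" names are assumed.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="mcp-language-server",                            # assumed command
    args=["--workspace", "/path/to/repo", "--lsp", "gopls"],  # assumed flags
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "definition", arguments={"symbolName": "http.HandlerFunc"}
            )
            print(result.content)

asyncio.run(main())
```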
36
MCP-Typescribe
An MCP server for serving TypeScript API context to language models.
MCP-Typescribe is an open-source implementation of the Model Context Protocol (MCP) focused on providing LLMs with contextual, real-time access to TypeScript API documentation. It parses TypeScript (and other) definitions using TypeDoc-generated JSON and serves this information via a queryable server that supports tools used by AI coding assistants. The solution enables AI agents to dynamically explore, search, and understand unknown APIs, accelerating onboarding and supporting agentic behaviors in code generation.
45 6 MCP -
37
Chroma MCP Server
A self-hosted Model Context Protocol (MCP) server for Chroma vector database integration.
Chroma MCP Server implements the Model Context Protocol to allow seamless integration between LLM applications and external data using the Chroma embedding database. It enables AI models to create, manage, and query collections with advanced vector search, full text search, and metadata filtering. The server supports both ephemeral and persistent client types, along with integration for HTTP and cloud-based Chroma instances. Multiple embedding functions, collection management tools, and rich document operations are available for extensible LLM workflows.
418 78 MCP -
38
Slite MCP Server
Bridge Slite notes with AI model context using the MCP standard.
Slite MCP Server implements the Model Context Protocol to interface with Slite's API, enabling seamless search, retrieval, and hierarchical browsing of workspace notes. It exposes standardized tools for searching notes, retrieving content by ID, and navigating note structures, making Slite content programmatically accessible to contextual AI pipelines. Built in Node.js and TypeScript, it is configurable and supports authentication via API key.
0 0 MCP -
39
QA Sphere MCP Server
Model Context Protocol server enabling LLMs to interact with QA Sphere test cases
QA Sphere MCP Server provides a Model Context Protocol (MCP) integration for QA Sphere, allowing Large Language Models to interact with, discover, and summarize test cases within the QA Sphere test management system. It enables AI-powered IDEs and MCP clients to reference and manipulate QA Sphere test case data within development workflows. The solution supports quick integration into clients like Claude, Cursor, and 5ire, facilitating seamless collaboration and context sharing for AI-assisted development.
15 6 MCP -
40
Open Feishu MCP Server
A Cloudflare-based remote MCP server with integrated Feishu OAuth authentication.
Open Feishu MCP Server provides a fully functional remote Model Context Protocol (MCP) server with integrated Feishu OAuth support. It enables zero-configuration user authentication and context management via Feishu accounts and is optimized for usability and tool performance. Designed for deployment on Cloudflare Workers, it offers scalable, secure connectivity with multiple clients and supports seamless integration with various tools and platforms.
66 9 MCP -
41
dicom-mcp
A Model Context Protocol server for managing and querying DICOM medical imaging data.
dicom-mcp enables AI assistants and tools to query, read, and transfer data on DICOM servers, such as PACS and VNA systems. It integrates with MCP-compatible clients, offering tooling for searching patient records, retrieving medical reports, and sending image data to analysis endpoints. Configurable via YAML, it streamlines operations on DICOM databases for research and development in medical imaging. It is explicitly designed for interoperability with LLM-based AI workflows.
74 21 MCP -
42
Ebook-MCP
A Model Context Protocol server for conversational e-book interaction and AI integration.
Ebook-MCP acts as a Model Context Protocol (MCP) server enabling seamless interaction between large language model (LLM) applications and electronic books such as EPUB and PDF. It standardizes APIs for AI-powered reading, searching, and managing digital libraries. Through natural language interfaces, it provides smart library management, content navigation, and interactive learning within digital books. Ebook-MCP integrates with modern AI-powered IDEs and supports multi-format digital book processing.
132 23 MCP -
43
Couchbase MCP Server
Enable LLMs to interact directly with Couchbase clusters via the Model Context Protocol.
Couchbase MCP Server provides an MCP-compliant server for connecting Large Language Models to Couchbase clusters. It supports various database operations such as bucket and collection listing, document retrieval, upsert, and deletion, as well as running SQL++ queries and retrieving index information. Designed for easy integration with MCP clients like Claude Desktop, it includes features for secure authentication and query mode configuration. The server can be deployed using a prebuilt PyPI package or directly from source.
24 28 MCP -
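The Couchbase entry mentions running SQL++ queries through MCP tools. A hedged sketch of such a call follows; the PyPI entry point, connection environment variables, and tool name are assumptions for illustration only.

```python
# Hedged sketch: running a SQL++ query through the Couchbase MCP server.
# Entry point, env variable names, and tool name are assumed placeholders.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="uvx",
    args=["couchbase-mcp-server"],                        # assumed entry point
    env={
        "CB_CONNECTION_STRING": "couchbase://localhost",  # assumed
        "CB_USERNAME": "Administrator",                   # assumed
        "CB_PASSWORD": "password",                        # assumed
    },
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "run_sql_plus_plus_query",  # assumed tool name
                arguments={"query": "SELECT name FROM `travel-sample` LIMIT 5"},
            )
            print(result.content)

asyncio.run(main())
```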
44
Quarkus Model Context Protocol Servers
Extensible Java-based servers implementing the Model Context Protocol for context-aware LLM integrations.
Quarkus Model Context Protocol Servers offers a collection of Java-based servers implementing the Model Context Protocol (MCP) to extend the capabilities of language model applications. Built with the Quarkus MCP server framework, it enables integration with JDBC databases, JVM processes, file systems, JavaFX, Kubernetes, containers, and Wolfram Alpha. The project allows easy deployment and extension of context-aware services for AI applications via MCP. Its servers can be run across different environments using jbang and are easily extensible for new capabilities.
176 46 MCP -
45
Filestash
A modular, extensible file manager with robust API and LLM Model Context Protocol (MCP) integration.
Filestash is a versatile file manager that supports a wide range of backends including FTP, SFTP, WebDAV, S3, SMB, and popular cloud storage services such as Dropbox and Google Drive. It features a plugin-driven architecture and workflow engine for customization and automation, and offers built-in viewers for images, music, and video. Filestash provides an API interface and explicit LLM integration via the Model Context Protocol (MCP), enabling advanced file management automation and AI-driven workflows.
13,013 930 MCP -
46
Slack MCP Server
A feature-rich Model Context Protocol server for integrating Slack Workspaces with AI model context management.
Slack MCP Server acts as a Model Context Protocol (MCP) server tailored for Slack Workspaces, providing flexible integration modes including Stealth and OAuth. It enables advanced functionalities such as fetching channel and thread messages, supporting direct and group messages, smart history retrieval, and robust context management for AI workflows. Communication is supported over Stdio, SSE, and HTTP transports, and the platform includes enhanced features like user information embedding and cache support. Designed for enterprise and individual use, it allows seamless context extraction and management without compromising workspace security.
903 136 MCP -
47
MCP Content Summarizer Server
Intelligent multi-format content summarization via MCP interface.
MCP Content Summarizer Server provides intelligent summarization of various content types including text, web pages, PDF documents, and EPUB books using Google's Gemini 1.5 Pro model. Through the Model Context Protocol, it supports customizable, multi-language summaries with options for style and focus. It is designed for integration with applications as an MCP server and offers tools for both summarization and testing. The solution maintains key information while producing concise and context-aware summaries from diverse content sources.
142 23 MCP -
48
MaxMSP-MCP Server
Bridge LLMs with Max patches via Model Context Protocol
MaxMSP-MCP Server enables large language models to understand, explain, and generate Max patches by leveraging the Model Context Protocol. It connects LLM agents with MaxMSP environments, providing access to documentation and patch objects for detailed interaction. Installation includes both a Python server and Max environment integration, facilitating seamless Python-Max communication. The tool supports explaining patches, debugging, and synthesizer creation directly through LLM interfaces.
106 12 MCP -
49
Octocode MCP
Enterprise-grade AI context server for codebase research and analysis.
Octocode MCP is a Model Context Protocol (MCP) server designed to enable AI assistants to search, analyze, and extract insights from millions of GitHub repositories with high security and token efficiency. It offers intelligent orchestration for deep code research, planning, and agentic workflows, streamlining the process of building and understanding complex software projects. The platform features robust tools and commands, such as /research for expert code research, designed to support developers and AI systems with context-rich information.
577 45 MCP -
50
Neo4j MCP Clients & Servers
Seamless natural language and knowledge graph integration for Neo4j via Model Context Protocol.
Neo4j MCP Clients & Servers provide standardized interfaces that enable large language models and AI assistants to interact with Neo4j databases and cloud services using natural language through the Model Context Protocol (MCP). It includes multiple servers for translating natural language to Cypher queries, managing graph memory, handling Neo4j Aura cloud services, and supporting interactive data modeling. Multiple transport modes such as STDIO, HTTP, and SSE offer flexibility for various deployments including cloud and local environments.
797 209 MCP
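The Neo4j entry describes servers that translate natural language into Cypher and manage graph memory. As a closing example, here is a hedged sketch of a Cypher read issued through one of those servers; the package name, environment variables, and tool name are assumptions, and a client would normally discover the exact tools via list_tools().

```python
# Hedged sketch: issuing a Cypher read through a Neo4j MCP server.
# Package name, env variable names, and tool name are assumed placeholders.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="uvx",
    args=["mcp-neo4j-cypher"],                    # assumed package name
    env={
        "NEO4J_URI": "bolt://localhost:7687",     # assumed variable name
        "NEO4J_USERNAME": "neo4j",                # assumed variable name
        "NEO4J_PASSWORD": "password",             # assumed variable name
    },
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "read_neo4j_cypher",  # assumed tool name
                arguments={"query": "MATCH (p:Person) RETURN p.name LIMIT 5"},
            )
            print(result.content)

asyncio.run(main())
```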