Wren Engine
Semantic Engine for MCP Clients and AI Agents
Wren Engine is the Semantic Engine for MCP Clients and AI Agents. The Wren AI GenBI AI Agent is built on top of Wren Engine.
🔌 Supported Data Sources
- BigQuery
- Google Cloud Storage
- Local Files
- MS SQL Server
- Minio
- MySQL Server
- Oracle Server
- PostgreSQL Server
- Amazon S3
- Snowflake
- Trino
😫 The Challenge Today
At the enterprise level, the stakes - and the complexity - are much higher. Businesses run on structured data stored in cloud warehouses, relational databases, and secure filesystems. From BI dashboards to CRM updates and compliance workflows, AI must not only execute commands but also understand and retrieve the right data, with precision and in context.
While many community and official MCP servers already support connections to major databases like PostgreSQL, MySQL, SQL Server, and more, there's a problem: raw access to data isn't enough.
Enterprises need:
- Accurate semantic understanding of their data models
- Trusted calculations and aggregations in reporting
- Clarity on business terms, like "active customer," "net revenue," or "churn rate"
- User-based permissions and access control
Natural language alone isn't enough to drive complex workflows across enterprise data systems. You need a layer that interprets intent, maps it to the correct data, applies calculations accurately, and ensures security.
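As a purely illustrative sketch of what such a layer captures, the snippet below models one business term on top of a physical table. The field names are hypothetical and are not the exact MDL schema; they only show the kind of metadata a semantic layer carries.

```python
# Illustrative only: a hypothetical semantic definition for "active customer".
# Field names are invented for this sketch and do not reflect the exact MDL schema.
active_customer_model = {
    "name": "active_customer",
    "description": "A customer with at least one completed order in the last 90 days",
    "base_table": "public.customers",          # physical table the term maps to
    "columns": [
        {"name": "customer_id", "type": "integer"},
        {"name": "last_order_at", "type": "timestamp"},
    ],
    # The business rule lives with the definition, so every agent and report
    # applies the same calculation instead of re-deriving it in ad-hoc SQL.
    "filter": "last_order_at >= now() - interval '90 days'",
    "access": {"allowed_roles": ["analyst", "finance"]},  # governance metadata
}
```

Because the definition, the calculation, and the access rules travel together, an agent asked about "active customers" resolves to the same governed query every time.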
🎯 Our Mission
Wren Engine is on a mission to power the future of MCP clients and AI agents through the Model Context Protocol (MCP) — a new open standard that connects LLMs with tools, databases, and enterprise systems.
As part of the MCP ecosystem, Wren Engine provides a semantic engine, powered by a next-generation semantic layer, that enables AI agents to access business data with accuracy, context, and governance.
By building the semantic layer directly into MCP clients such as Claude, Cline, and Cursor, Wren Engine empowers AI agents with precise business context and ensures accurate data interactions across diverse enterprise environments.
We believe the future of enterprise AI lies in context-aware, composable systems. That’s why Wren Engine is designed to be:
- 🔌 Embeddable into any MCP client or AI agentic workflow
- 🔄 Interoperable with modern data stacks (PostgreSQL, MySQL, Snowflake, etc.)
- 🧠 Semantic-first, enabling AI to “understand” your data model and business logic
- 🔐 Governance-ready, respecting roles, access controls, and definitions
With Wren Engine, you can scale AI adoption across teams — not just with better automation, but with better understanding.
Check out our full article.
🚀 Get Started with MCP
https://github.com/user-attachments/assets/dab9b50f-70d7-4eb3-8fc8-2ab55dc7d2ec
👉 Blog Post Tutorial: Powering AI-driven workflows with Wren Engine and Zapier via the Model Context Protocol (MCP)
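If you would rather explore the MCP server programmatically than through a desktop client, the sketch below uses the official MCP Python SDK to launch a server over stdio and list the tools it exposes. The launch command and arguments are placeholders (assumptions), not the documented entry point; substitute whatever command the mcp-server module's own instructions give you.

```python
# A minimal sketch using the MCP Python SDK's stdio client.
# The command/args below are placeholders; use the actual launch command
# documented for Wren Engine's mcp-server module.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="uv",                      # placeholder launcher
    args=["run", "wren-mcp-server"],   # placeholder entry point
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()            # MCP handshake
            tools = await session.list_tools()    # discover what the server exposes
            for tool in tools.tools:
                print(tool.name, "-", tool.description)

if __name__ == "__main__":
    asyncio.run(main())
```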
🤔 Concepts
- Powering Semantic SQL for AI Agents with Apache DataFusion
- Quick start with Wren Engine
- What is semantics?
- What is Modeling Definition Language (MDL)?
- Benefits of Wren Engine with LLMs
🚧 Project Status
Wren Engine is currently in beta. The project team is actively developing it and aims to release new versions at least biweekly.
🛠️ Developer Guides
The project consists of 4 main modules:
- ibis-server: the web server of Wren Engine, powered by FastAPI and Ibis (see the request sketch after this list)
- wren-core: the semantic core written in Rust powered by Apache DataFusion
- wren-core-py: the Python binding for wren-core
- mcp-server: the MCP server of Wren Engine powered by MCP Python SDK
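To show how these pieces fit together, the sketch below sends SQL to the ibis-server along with a base64-encoded MDL manifest and connection details, assuming a PostgreSQL source. The base URL, endpoint path, payload field names, and connection fields are assumptions for illustration; consult the ibis-server documentation for the exact API contract.

```python
# Hypothetical sketch of querying the ibis-server REST API.
# The URL, endpoint path, and payload field names are assumptions for illustration;
# check the ibis-server docs for the exact contract.
import base64
import json

import requests

mdl = {"catalog": "my_catalog", "schema": "public", "models": []}  # minimal manifest
manifest_str = base64.b64encode(json.dumps(mdl).encode()).decode()

payload = {
    "manifestStr": manifest_str,                # semantic model the SQL is resolved against
    "sql": "SELECT * FROM my_table LIMIT 10",
    "connectionInfo": {                         # assumed PostgreSQL connection fields
        "host": "localhost",
        "port": 5432,
        "database": "analytics",
        "user": "wren",
        "password": "secret",
    },
}

resp = requests.post("http://localhost:8000/v3/connector/postgres/query", json=payload)
resp.raise_for_status()
print(resp.json())
```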
⭐️ Community
- Join our Discord server to give us feedback!
- If you run into any issues, please visit GitHub Issues.
Related MCPs
Discover similar Model Context Protocol servers
Snowflake Cortex AI Model Context Protocol (MCP) Server
Tooling and orchestration for Snowflake Cortex AI via Model Context Protocol.
Provides an MCP server that brings Snowflake Cortex AI, object management, and SQL orchestration to the MCP ecosystem. Enables Cortex Search, Analyst, and Agent services for structured and unstructured data querying, along with automated SQL execution and object management. Designed for integration with MCP clients to streamline AI-powered data workflows and context-sensitive operations within Snowflake environments.
- ⭐ 176
- MCP
- Snowflake-Labs/mcp
MXCP
Enterprise-Grade Model Context Protocol Framework for AI Applications
MXCP is an enterprise-ready framework that implements the Model Context Protocol (MCP) for building secure, production-grade AI application servers. It introduces a structured methodology focused on data modeling, robust service design, policy enforcement, and comprehensive testing, integrated with strong security and audit capabilities. The framework enables rapid development and deployment of AI tools, supporting both SQL and Python environments, with built-in telemetry and drift detection for reliability and compliance.
- ⭐ 49
- MCP
- raw-labs/mxcp
Snowflake MCP Server
MCP server enabling secure and structured Snowflake database interaction with AI tools.
Snowflake MCP Server provides a Model Context Protocol-conformant interface to interact programmatically with Snowflake databases. It exposes SQL execution, schema exploration, and insight aggregation as standardized resources and tools accessible by AI assistants. The server offers read/write capabilities, structured resource summaries, and insight memoization suitable for contextual AI workflows. Integration is supported with popular AI platforms such as Claude Desktop via Smithery or UVX configurations.
- ⭐ 170
- MCP
- isaacwasserman/mcp-snowflake-server
Sourcerer MCP
Semantic code search & navigation MCP server for efficient AI agent context retrieval.
Sourcerer MCP provides a Model Context Protocol (MCP) server that enables AI agents to perform semantic code search and navigation. By indexing codebases at the function, class, and chunk level, it allows agents to retrieve only the necessary code snippets, greatly reducing token consumption. The tool integrates with Tree-sitter for language parsing and OpenAI for generating code embeddings, supporting advanced contextual code understanding without full file ingestion.
- ⭐ 95
- MCP
- st3v3nmw/sourcerer-mcp
ApeRAG
Hybrid RAG platform with MCP integration for intelligent knowledge management
ApeRAG is a production-ready Retrieval-Augmented Generation (RAG) platform that integrates graph-based, vector, and full-text search capabilities. It enables the construction of knowledge graphs and supports MCP (Model Context Protocol), allowing AI assistants direct interaction with knowledge bases. Features include advanced document parsing, multimodal processing, intelligent agent workflows, and enterprise management tools. Deployment is streamlined via Docker and Kubernetes, with extensive support for customization and scalability.
- ⭐ 920
- MCP
- apecloud/ApeRAG
Alkemi MCP Server
Integrate Alkemi Data sources with MCP Clients for seamless, standardized data querying.
Alkemi MCP Server provides a STDIO wrapper for connecting Alkemi data sources—including Snowflake, Google BigQuery, and Databricks—with MCP Clients using the Model Context Protocol. It facilitates context sharing, database metadata management, and query generation through a standardized protocol endpoint. Shared MCP Servers allow teams to maintain consistent, high-quality data querying capabilities without needing to replicate schemas or query knowledge for each agent. Out-of-the-box integration with Claude Desktop and robust debugging tools are also included.
- ⭐ 2
- MCP
- alkemi-ai/alkemi-mcp