Wren Engine

Semantic Engine for MCP Clients and AI Agents

Stars: 496 · Forks: 138 · Watchers: 496 · Issues: 50
Wren Engine is a semantic engine designed for Model Context Protocol (MCP) clients and AI agents. It enables precise access to enterprise data by providing semantic understanding, trusted aggregations, and business term clarity across numerous data sources. With a focus on accurate context, calculation, and governance, it equips AI systems to interact safely and meaningfully with complex enterprise data. As part of the MCP ecosystem, it connects LLMs with databases and enterprise tools through standardized protocols.

Key Features

Semantic understanding of data models
Trusted calculations and aggregations
Business term clarity and definitions
User-based permissions and access control
Connection to multiple data sources (e.g., BigQuery, Snowflake, S3)
Contextual data retrieval for AI agents
Support for standardized Model Context Protocol
Secure and governed data access
Integration with LLMs and AI workflows
Open standard interoperability for enterprise tools

Use Cases

Natural language BI querying
Enterprise reporting and dashboards
AI-driven CRM updates
Compliance and audit workflows
Automated data retrieval for AI agents
Data access control in multi-user environments
Standardizing business metrics and terms
Integration of LLMs with enterprise data
Context-aware workflow automation
Connecting diverse data sources under unified semantics

README

Wren Engine is the Semantic Engine for MCP Clients and AI Agents. The Wren AI GenBI AI Agent is built on top of Wren Engine.

🔌 Supported Data Sources

😫 Challenge Today

At the enterprise level, the stakes - and the complexity - are much higher. Businesses run on structured data stored in cloud warehouses, relational databases, and secure filesystems. From BI dashboards to CRM updates and compliance workflows, AI must not only execute commands but also understand and retrieve the right data, with precision and in context.

While many community and official MCP servers already support connections to major databases like PostgreSQL, MySQL, SQL Server, and more, there's a problem: raw access to data isn't enough.

Enterprises need:

  • Accurate semantic understanding of their data models
  • Trusted calculations and aggregations in reporting
  • Clarity on business terms, like "active customer," "net revenue," or "churn rate"
  • User-based permissions and access control

Natural language alone isn't enough to drive complex workflows across enterprise data systems. You need a layer that interprets intent, maps it to the correct data, applies calculations accurately, and ensures security.
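To make this concrete, the sketch below shows the kind of thing such a semantic layer encodes: shared business definitions ("active customer", "net revenue") and an access policy, expressed here as a plain Python dict. It is purely illustrative; the field names and policy syntax are assumptions for this example, not Wren Engine's actual modeling schema.

    # Illustrative only: a toy semantic-model manifest as a Python dict.
    # Field names are assumptions, not Wren Engine's actual modeling schema.
    semantic_model = {
        "models": [
            {
                "name": "customers",
                "table": "warehouse.crm.customers",
                "columns": [
                    {"name": "customer_id", "type": "string"},
                    {"name": "last_order_at", "type": "timestamp"},
                    # Governed definition: "active customer" = ordered in the last 90 days.
                    {
                        "name": "is_active_customer",
                        "type": "boolean",
                        "expression": "last_order_at >= now() - interval '90' day",
                    },
                ],
            },
            {
                "name": "orders",
                "table": "warehouse.sales.orders",
                "columns": [
                    {"name": "order_id", "type": "string"},
                    {"name": "customer_id", "type": "string"},
                    # "Net revenue" is defined once, so every agent aggregates it the same way.
                    {
                        "name": "net_revenue",
                        "type": "decimal",
                        "expression": "gross_amount - discounts - refunds",
                    },
                ],
            },
        ],
        # Row-level policy: analysts only see rows for their own region.
        "access_policies": [
            {"role": "analyst", "model": "orders", "filter": "region = @user.region"},
        ],
    }

Defining these terms once in a shared model is what lets an AI agent's request for "net revenue by month" resolve to the same calculation the finance dashboard uses.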

🎯 Our Mission

Wren Engine is on a mission to power the future of MCP clients and AI agents through the Model Context Protocol (MCP) — a new open standard that connects LLMs with tools, databases, and enterprise systems.

As part of the MCP ecosystem, Wren Engine provides a semantic engine, powered by a next-generation semantic layer, that enables AI agents to access business data with accuracy, context, and governance.

By building the semantic layer directly into MCP clients such as Claude, Cline, and Cursor, Wren Engine empowers AI agents with precise business context and ensures accurate data interactions across diverse enterprise environments.

We believe the future of enterprise AI lies in context-aware, composable systems. That’s why Wren Engine is designed to be:

  • 🔌 Embeddable into any MCP client or AI agentic workflow
  • 🔄 Interoperable with modern data stacks (PostgreSQL, MySQL, Snowflake, etc.)
  • 🧠 Semantic-first, enabling AI to “understand” your data model and business logic
  • 🔐 Governance-ready, respecting roles, access controls, and definitions

With Wren Engine, you can scale AI adoption across teams — not just with better automation, but with better understanding.

Check out our full article: 🤩 Our Mission - Fueling the Next Wave of AI Agents: Building the Foundation for Future MCP Clients and Enterprise Data Access

🚀 Get Started with MCP

MCP Server README

https://github.com/user-attachments/assets/dab9b50f-70d7-4eb3-8fc8-2ab55dc7d2ec

👉 Blog Post Tutorial: Powering AI-driven workflows with Wren Engine and Zapier via the Model Context Protocol (MCP)
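As a rough illustration of how an MCP client talks to an MCP server, here is a minimal sketch using the official MCP Python SDK. The server launch command and module name below are placeholders, not the real invocation; substitute whatever the MCP Server README above tells you to run.

    # Minimal MCP client sketch using the official MCP Python SDK.
    # The server launch command below is a placeholder, not the real invocation.
    import asyncio
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    server_params = StdioServerParameters(
        command="python",
        args=["-m", "wren_mcp_server"],  # hypothetical module name
    )

    async def main() -> None:
        async with stdio_client(server_params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                # Ask the server which tools it exposes (schema inspection,
                # governed query execution, and so on).
                tools = await session.list_tools()
                for tool in tools.tools:
                    print(tool.name, "-", tool.description)

    asyncio.run(main())

MCP clients such as Claude Desktop perform the equivalent of this handshake for you; in practice you only point them at the server command in their configuration.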

🤔 Concepts

🚧 Project Status

Wren Engine is currently in beta. The project team is actively developing it and aims to release new versions at least biweekly.

🛠️ Developer Guides

The project consists of 4 main modules:

  1. ibis-server: the Web server of Wren Engine powered by FastAPI and Ibis
  2. wren-core: the semantic core written in Rust powered by Apache DataFusion
  3. wren-core-py: the Python binding for wren-core
  4. mcp-server: the MCP server of Wren Engine powered by MCP Python SDK
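For orientation, the sketch below shows one way these modules fit together from a client's point of view: the semantic model and a SQL query are posted to the ibis-server, which rewrites the query through wren-core before executing it against the target data source. The endpoint path, payload fields, and connection details are assumptions for illustration only; consult the ibis-server documentation for the actual API contract.

    # Assumed endpoint and payload shape; check the ibis-server docs for the real contract.
    import base64
    import json
    import requests

    manifest = {"models": []}  # your semantic model definition goes here

    payload = {
        "sql": "SELECT COUNT(*) AS active_customers FROM customers WHERE is_active_customer",
        # The manifest travels with the query so the engine can rewrite it
        # against the governed semantic model before execution.
        "manifestStr": base64.b64encode(json.dumps(manifest).encode()).decode(),
        "connectionInfo": {
            "host": "localhost",
            "port": 5432,
            "database": "demo",
            "user": "wren",
            "password": "secret",
        },
    }

    resp = requests.post("http://localhost:8000/v2/connector/postgres/query", json=payload)
    resp.raise_for_status()
    print(resp.json())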

⭐️ Community

Star History

Repository Owner

Canner

Organization

Repository Details

Language: Java
Default Branch: main
Size: 23,537 KB
Contributors: 20
License: Apache License 2.0
MCP Verified: Nov 12, 2025

Programming Languages

Java: 59.47%
Python: 20.57%
Rust: 18.17%
ANTLR: 1.17%
Shell: 0.25%
Jupyter Notebook: 0.21%
Dockerfile: 0.09%
Just: 0.08%

Topics

agent agentic-ai ai business-intelligence data data-analysis data-analytics data-lake data-warehouse hacktoberfest llm mcp mcp-server semantic semantic-layer sql


Related MCPs

Discover similar Model Context Protocol servers

  • Snowflake Cortex AI Model Context Protocol (MCP) Server

    Tooling and orchestration for Snowflake Cortex AI via Model Context Protocol.

    Provides an MCP server that brings Snowflake Cortex AI, object management, and SQL orchestration to the MCP ecosystem. Enables Cortex Search, Analyst, and Agent services for structured and unstructured data querying, along with automated SQL execution and object management. Designed for integration with MCP clients to streamline AI-powered data workflows and context-sensitive operations within Snowflake environments.

    • 176
    • MCP
    • Snowflake-Labs/mcp
  • MXCP

    Enterprise-Grade Model Context Protocol Framework for AI Applications

    MXCP is an enterprise-ready framework that implements the Model Context Protocol (MCP) for building secure, production-grade AI application servers. It introduces a structured methodology focused on data modeling, robust service design, policy enforcement, and comprehensive testing, integrated with strong security and audit capabilities. The framework enables rapid development and deployment of AI tools, supporting both SQL and Python environments, with built-in telemetry and drift detection for reliability and compliance.

    • 49
    • MCP
    • raw-labs/mxcp
  • Snowflake MCP Server

    MCP server enabling secure and structured Snowflake database interaction with AI tools.

    Snowflake MCP Server provides a Model Context Protocol-conformant interface to interact programmatically with Snowflake databases. It exposes SQL execution, schema exploration, and insight aggregation as standardized resources and tools accessible by AI assistants. The server offers read/write capabilities, structured resource summaries, and insight memoization suitable for contextual AI workflows. Integration is supported with popular AI platforms such as Claude Desktop via Smithery or UVX configurations.

    • 170
    • MCP
    • isaacwasserman/mcp-snowflake-server
  • Sourcerer MCP

    Semantic code search & navigation MCP server for efficient AI agent context retrieval.

    Sourcerer MCP provides a Model Context Protocol (MCP) server that enables AI agents to perform semantic code search and navigation. By indexing codebases at the function, class, and chunk level, it allows agents to retrieve only the necessary code snippets, greatly reducing token consumption. The tool integrates with Tree-sitter for language parsing and OpenAI for generating code embeddings, supporting advanced contextual code understanding without full file ingestion.

    • 95
    • MCP
    • st3v3nmw/sourcerer-mcp
  • ApeRAG

    Hybrid RAG platform with MCP integration for intelligent knowledge management

    ApeRAG is a production-ready Retrieval-Augmented Generation (RAG) platform that integrates graph-based, vector, and full-text search capabilities. It enables the construction of knowledge graphs and supports MCP (Model Context Protocol), allowing AI assistants direct interaction with knowledge bases. Features include advanced document parsing, multimodal processing, intelligent agent workflows, and enterprise management tools. Deployment is streamlined via Docker and Kubernetes, with extensive support for customization and scalability.

    • 920
    • MCP
    • apecloud/ApeRAG
  • Alkemi MCP Server

    Integrate Alkemi Data sources with MCP Clients for seamless, standardized data querying.

    Alkemi MCP Server provides a STDIO wrapper for connecting Alkemi data sources—including Snowflake, Google BigQuery, and Databricks—with MCP Clients using the Model Context Protocol. It facilitates context sharing, database metadata management, and query generation through a standardized protocol endpoint. Shared MCP Servers allow teams to maintain consistent, high-quality data querying capabilities without needing to replicate schemas or query knowledge for each agent. Out-of-the-box integration with Claude Desktop and robust debugging tools are also included.

    • 2
    • MCP
    • alkemi-ai/alkemi-mcp