Databricks Genie MCP Server


Bridge natural language queries to Databricks Genie via Model Context Protocol.

Databricks Genie MCP Server enables interaction between large language models and the Databricks Genie API using the Model Context Protocol. It allows users to ask natural language questions, start and manage conversations, and run SQL queries in Genie spaces. The server returns structured results, supports follow-up questions, and can be run either directly or via Docker. Designed for use with Claude Desktop, it streamlines conversational analytics within Databricks workspaces.

Key Features

Connects LLMs to Databricks Genie API via MCP
Lists and fetches Genie space information
Starts new and continues existing Genie conversations
Executes SQL queries and retrieves structured results
Integrates with Claude Desktop
Docker support for deployment
Supports authentication with Databricks personal access token
Manual configuration of Genie space IDs
Can be tested with the MCP Inspector
Environment variable configuration for secure credentials

Use Cases

Conversational querying of Databricks data using natural language
Embedding Genie workspace context in AI assistant workflows
Automating analytics requests through LLMs
Interactive data exploration with Claude Desktop
Retrieving tabular results from SQL queries via LLMs
Managing multi-turn conversational analytics sessions
Teaching or demonstrating conversational data analysis within Databricks
Building custom AI assistants for enterprise data platforms
Secure integration of workspace authentication for AI tools
Enhancing LLM-driven business intelligence processes

README

Databricks Genie MCP Server

A Model Context Protocol (MCP) server that connects to the Databricks Genie API, allowing LLMs to ask natural language questions, run SQL queries, and interact with Databricks conversational agents.

✨ Features

  • List Genie spaces available in your Databricks workspace (currently a manual list; see "Manually Adding Genie Space IDs" below)
  • Fetch metadata (title, description) of a specific Genie space
  • Start new Genie conversations with natural language questions
  • Ask follow-up questions in ongoing Genie conversations
  • Retrieve SQL and result tables in structured format

🧱 Prerequisites

  • Python 3.7+
  • Databricks workspace with:
    • Personal access token
    • Genie API enabled
    • Permissions to access Genie spaces and run queries

⚙️ Setup

  1. Clone this repository

  2. Create and activate a virtual environment (recommended):

 python -m venv .venv
 source .venv/bin/activate

  3. Install dependencies:

 pip install -r requirements.txt

  4. Create a .env file in the root directory with the following variables:

DATABRICKS_HOST=your-databricks-instance.cloud.databricks.com # do not include https://
DATABRICKS_TOKEN=your-personal-access-token
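At startup the server reads these values from the environment. A minimal sketch of how that might look in main.py, assuming the environment has been populated from .env (e.g. by python-dotenv); the variable names match the .env file above:

```python
import os

# Read Databricks credentials from the environment (populated from .env,
# e.g. by python-dotenv). Names match the .env file above.
host = os.environ.get("DATABRICKS_HOST", "your-databricks-instance.cloud.databricks.com")
token = os.environ.get("DATABRICKS_TOKEN", "")

# The scheme is added here, which is why the .env value must not include it.
base_url = f"https://{host}/api/2.0/genie"
```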

📌 Manually Adding Genie Space IDs

Note:
At this time, the Databricks Genie API does not appear to provide a public endpoint for listing all available space IDs and titles. As a workaround, you need to manually add the Genie space IDs and their titles in the get_genie_space_id() function in main.py.
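A hedged sketch of what that workaround can look like; the dictionary name, space IDs, and titles below are placeholders for illustration, not the repository's actual code:

```python
# Hypothetical registry of Genie spaces; replace the placeholder IDs and
# titles with the spaces from your own Databricks workspace.
GENIE_SPACES = {
    "01ef1234abcd5678": "Sales Analytics",
    "01ef9876dcba4321": "Supply Chain Insights",
}

def get_genie_space_id() -> list[dict]:
    """Return the manually registered Genie space IDs and titles."""
    return [
        {"space_id": space_id, "title": title}
        for space_id, title in GENIE_SPACES.items()
    ]
```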

🧪 Test the Server

You can test the MCP server using the inspector (optional but recommended):

npx @modelcontextprotocol/inspector python main.py

OR

You can build and run the Docker container to test the server.

💬 Use with Claude Desktop

Download Claude Desktop

Install Your MCP Server: From your project directory, run:

mcp install main.py
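mcp install registers the server in Claude Desktop's configuration file (claude_desktop_config.json). The resulting entry looks roughly like this; the server name, command, and paths are illustrative and will differ on your machine:

```json
{
  "mcpServers": {
    "genie": {
      "command": "python",
      "args": ["/absolute/path/to/main.py"],
      "env": {
        "DATABRICKS_HOST": "your-databricks-instance.cloud.databricks.com",
        "DATABRICKS_TOKEN": "your-personal-access-token"
      }
    }
  }
}
```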

Once the server is installed, connect in Claude:

  1. Open Claude Desktop

  2. Click Resources → Add Resource

  3. Select your Genie MCP Server

  4. Start chatting with your data using natural language! 🎯

🧾 Obtaining Databricks Credentials

Host

Your Databricks instance URL (e.g., your-instance.cloud.databricks.com); do not include https://

Token

  1. Go to your Databricks workspace

  2. Click your username (top right) → User Settings

  3. Under the Developer tab, click Manage under "Access tokens"

  4. Generate a new token and copy it

🚀 Running the Server

python main.py

This will start the Genie MCP server over the stdio transport for LLM interaction.

🧰 Available MCP Tools

The following MCP tools are available:

  • get_genie_space_id(): List available Genie space IDs and titles
  • get_space_info(space_id: str): Retrieve the title and description of a Genie space
  • ask_genie(space_id: str, question: str): Start a new Genie conversation and get results
  • follow_up(space_id: str, conversation_id: str, question: str): Continue an existing Genie conversation

🛠️ Troubleshooting

Common Issues

  • Invalid host: Ensure the host does not include https://

  • Token error: Make sure your personal access token is valid and has access to Genie

  • Timeout: Check if the Genie space is accessible and not idle/expired

  • No data returned: Ensure your query is valid for the selected space

🔐 Security Considerations

  • Keep your .env file secure and never commit it to version control

  • Use minimal scope tokens with expiration whenever possible

  • Avoid exposing this server in public-facing environments unless authenticated



Repository Details

Language: Python
Default Branch: main
Size: 13 KB
Contributors: 1
License: MIT License
MCP Verified: Nov 12, 2025

Programming Languages

Python: 96.05%
Dockerfile: 3.95%


