MCP Toolbox for Databases

Open source MCP server for secure and efficient Gen AI database integrations.

11,412 Stars · 988 Forks · 11,412 Watchers · 183 Issues
MCP Toolbox for Databases is an open source server that implements the Model Context Protocol (MCP) for database interactions in Gen AI workflows. It manages core complexities such as connection pooling, authentication, and tool integration, enabling developers to create and deploy database tools with ease and enhanced security. The toolbox supports streamlined connections between development environments and databases, offering observability, context-aware code generation, and automation features. Its design emphasizes rapid integration, reusable tools, and compatibility with AI assistants.

Key Features

Implements Model Context Protocol (MCP)
Open source MCP-compliant database server
Simplified tool integration and deployment
Connection pooling for efficient resource management
Integrated authentication for secure data access
Support for OpenTelemetry-based observability
Natural language querying of databases
Automated database management and schema operations
Context-aware code and test generation
Reusable tools across multiple agents and frameworks

Use Cases

Enabling natural language access to databases from IDEs
Empowering AI assistants to perform database queries and management tasks
Streamlining secure database connectivity for generative AI workflows
Automating routine database maintenance by AI agents
Generating application code and tests based on real-time schemas
Observing and tracing database interactions with built-in metrics
Integrating database tools into multiple agent platforms
Reducing development overhead and configuration time
Delegating complex database operations to AI assistants
Deploying customizable, context-aware database tools in production environments

README

MCP Toolbox for Databases


[!NOTE] MCP Toolbox for Databases is currently in beta, and may see breaking changes until the first stable release (v1.0).

MCP Toolbox for Databases is an open source MCP server for databases. It enables you to develop tools easier, faster, and more securely by handling the complexities such as connection pooling, authentication, and more.

This README provides a brief overview. For comprehensive details, see the full documentation.

[!NOTE] This solution was originally named “Gen AI Toolbox for Databases” as its initial development predated MCP, but was renamed to align with recently added MCP compatibility.

Why Toolbox?

Toolbox helps you build Gen AI tools that let your agents access data in your database. Toolbox provides:

  • Simplified development: Integrate tools into your agent in less than 10 lines of code, reuse tools between multiple agents or frameworks, and deploy new versions of tools more easily (a minimal sketch follows this list).
  • Better performance: Best practices such as connection pooling, authentication, and more.
  • Enhanced security: Integrated auth for more secure access to your data.
  • End-to-end observability: Out of the box metrics and tracing with built-in support for OpenTelemetry.
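
To make the "less than 10 lines of code" point concrete, here is a minimal sketch using the Python toolbox-core SDK. It is illustrative only: the server URL is the local default, and "my-toolset" is a placeholder for a toolset defined in your own tools.yaml.

python
import asyncio
from toolbox_core import ToolboxClient

async def main():
    # Connect to a running Toolbox server (default port 5000).
    async with ToolboxClient("http://127.0.0.1:5000") as client:
        # "my-toolset" is a placeholder toolset name from your tools.yaml.
        tools = await client.load_toolset("my-toolset")
        # Hand `tools` to the agent framework of your choice.
        print(f"Loaded {len(tools)} tools")

asyncio.run(main())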

⚡ Supercharge Your Workflow with an AI Database Assistant ⚡

Stop context-switching and let your AI assistant become a true co-developer. By connecting your IDE to your databases with MCP Toolbox, you can delegate complex and time-consuming database tasks, allowing you to build faster and focus on what matters. This isn't just about code completion; it's about giving your AI the context it needs to handle the entire development lifecycle.

Here’s how it will save you time:

  • Query in Plain English: Interact with your data using natural language right from your IDE. Ask complex questions like, "How many orders were delivered in 2024, and what items were in them?" without writing any SQL.
  • Automate Database Management: Simply describe your data needs, and let the AI assistant manage your database for you. It can handle generating queries, creating tables, adding indexes, and more.
  • Generate Context-Aware Code: Empower your AI assistant to generate application code and tests with a deep understanding of your real-time database schema. This accelerates the development cycle by ensuring the generated code is directly usable.
  • Slash Development Overhead: Radically reduce the time spent on manual setup and boilerplate. MCP Toolbox helps streamline lengthy database configurations, repetitive code, and error-prone schema migrations.

Learn how to connect your AI tools (IDEs) to Toolbox using MCP.

General Architecture

Toolbox sits between your application's orchestration framework and your database, providing a control plane that is used to modify, distribute, or invoke tools. It simplifies the management of your tools by providing you with a centralized location to store and update tools, allowing you to share tools between agents and applications and update those tools without necessarily redeploying your application.
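
To illustrate the control-plane idea, the hedged sketch below has two independent agents point at the same Toolbox server and load the same toolset; changing the tools.yaml on that server changes what both of them receive, with no application redeploy. The URL and the toolset name "shared-tools" are placeholders.

python
import asyncio
from toolbox_core import ToolboxClient

TOOLBOX_URL = "http://127.0.0.1:5000"  # placeholder: your central Toolbox server

async def build_agent(agent_name: str):
    # Each agent loads the same centrally managed toolset.
    async with ToolboxClient(TOOLBOX_URL) as client:
        tools = await client.load_toolset("shared-tools")  # placeholder toolset name
        print(f"{agent_name}: loaded {len(tools)} shared tools")
        # ...pass `tools` to this agent's framework here...

asyncio.run(build_agent("support-agent"))
asyncio.run(build_agent("analytics-agent"))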

Getting Started

Installing the server

For the latest version, check the releases page and use the following instructions for your OS and CPU architecture.

To install Toolbox as a binary on Linux (AMD64):

sh
# see releases page for other versions
export VERSION=0.19.1
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/linux/amd64/toolbox
chmod +x toolbox

To install Toolbox as a binary on macOS (Apple Silicon):

sh
# see releases page for other versions
export VERSION=0.19.1
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/arm64/toolbox
chmod +x toolbox

To install Toolbox as a binary on macOS (Intel):

sh
# see releases page for other versions
export VERSION=0.19.1
curl -L -o toolbox https://storage.googleapis.com/genai-toolbox/v$VERSION/darwin/amd64/toolbox
chmod +x toolbox

To install Toolbox as a binary on Windows (AMD64):

powershell
# see releases page for other versions
$VERSION = "0.19.1"
Invoke-WebRequest -Uri "https://storage.googleapis.com/genai-toolbox/v$VERSION/windows/amd64/toolbox.exe" -OutFile "toolbox.exe"

To install Toolbox as a container image:

sh
# see releases page for other versions
export VERSION=0.19.1
docker pull us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION

To install Toolbox using Homebrew on macOS or Linux:

sh
brew install mcp-toolbox

To install from source, ensure you have the latest version of Go installed, and then run the following command:

sh
go install github.com/googleapis/genai-toolbox@v0.19.1

To install Gemini CLI Extensions for MCP Toolbox, run the following command:

sh
gemini extensions install https://github.com/gemini-cli-extensions/mcp-toolbox

Running the server

Configure a tools.yaml to define your tools, and then execute toolbox to start the server:

To run Toolbox from binary:

sh
./toolbox --tools-file "tools.yaml"

ⓘ Note
Toolbox enables dynamic reloading by default. To disable, use the --disable-reload flag.

To run the server after pulling the container image:

sh
export VERSION=0.19.1 # Use the version of the image you pulled
docker run -p 5000:5000 \
-v $(pwd)/tools.yaml:/app/tools.yaml \
us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$VERSION \
--tools-file "/app/tools.yaml"

ⓘ Note
The -v flag mounts your local tools.yaml into the container, and -p maps the container's port 5000 to your host's port 5000.

To run the server directly from source, navigate to the project root directory and run:

sh
go run .

ⓘ Note
This command runs the project from source and is more suitable for development and testing. It does not compile a binary into your $GOPATH. If you want to compile a binary instead, refer to the Developer Documentation.

If you installed Toolbox using Homebrew, the toolbox binary is available in your system path. You can start the server with the same command:

sh
toolbox --tools-file "tools.yaml"

Interact with your custom tools using natural language. Check gemini-cli-extensions/mcp-toolbox for more information.

You can use toolbox help for a full list of flags! To stop the server, send a terminate signal (ctrl+c on most platforms).

For more detailed documentation on deploying to different environments, check out the resources in the How-to section.

Integrating your application

Once your server is up and running, you can load the tools into your application. The Client SDK examples below cover Python (Core, LangChain, and LlamaIndex SDKs), JavaScript/TypeScript (Core SDK, with LangChain.js and Genkit variants), and Go (Go SDK, with LangChainGo, Genkit Go, Go GenAI, OpenAI Go, and ADK Go variants):

  1. Install the Toolbox Core SDK (Python):

    bash
    pip install toolbox-core
    
  2. Load tools:

    python
    from toolbox_core import ToolboxClient
    
    # update the url to point to your server
    async with ToolboxClient("http://127.0.0.1:5000") as client:
    
        # these tools can be passed to your application!
        tools = await client.load_toolset("toolset_name")
    

For more detailed instructions on using the Toolbox Core SDK, see the project's README.
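
Loaded tools can also be invoked on their own, outside any framework. The snippet below is a hedged sketch based on the toolbox-core SDK: it assumes a tool named "my-tool" exists in your tools.yaml and that a loaded tool is an awaitable callable that takes its parameters as keyword arguments.

python
import asyncio
from toolbox_core import ToolboxClient

async def main():
    async with ToolboxClient("http://127.0.0.1:5000") as client:
        # Load a single tool by the name given in tools.yaml ("my-tool" is a placeholder).
        tool = await client.load_tool("my-tool")
        # Invoke it with keyword arguments matching the tool's declared parameters.
        result = await tool(some_param="some value")
        print(result)

asyncio.run(main())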

  1. Install the Toolbox LangChain SDK (Python):

    bash
    pip install toolbox-langchain
    
  2. Load tools:

    python
    from toolbox_langchain import ToolboxClient
    
    # update the url to point to your server
    async with ToolboxClient("http://127.0.0.1:5000") as client:
    
        # these tools can be passed to your application!
        tools = client.load_toolset()
    

    For more detailed instructions on using the Toolbox LangChain SDK, see the project's README.

  1. Install the Toolbox LlamaIndex SDK (Python):

    bash
    pip install toolbox-llamaindex
    
  2. Load tools:

    python
    from toolbox_llamaindex import ToolboxClient
    
    # update the url to point to your server
    async with ToolboxClient("http://127.0.0.1:5000") as client:
    
        # these tools can be passed to your application!
        tools = client.load_toolset()
    

    For more detailed instructions on using the Toolbox LlamaIndex SDK, see the project's README.

  1. Install the Toolbox Core SDK (JavaScript/TypeScript):

    bash
    npm install @toolbox-sdk/core
    
  2. Load tools:

    javascript
    import { ToolboxClient } from '@toolbox-sdk/core';
    
    // update the url to point to your server
    const URL = 'http://127.0.0.1:5000';
    let client = new ToolboxClient(URL);
    
    // these tools can be passed to your application!
    const tools = await client.loadToolset('toolsetName');
    

    For more detailed instructions on using the Toolbox Core SDK, see the project's README.

  1. Install the Toolbox Core SDK (JavaScript, for use with LangChain.js):

    bash
    npm install @toolbox-sdk/core
    
  2. Load tools:

    javascript
    import { ToolboxClient } from '@toolbox-sdk/core';
    import { tool } from '@langchain/core/tools';
    
    // update the url to point to your server
    const URL = 'http://127.0.0.1:5000';
    let client = new ToolboxClient(URL);
    
    // these tools can be passed to your application!
    const toolboxTools = await client.loadToolset('toolsetName');
    
    // Define the basics of the tool: name, description, schema and core logic
    const getTool = (toolboxTool) => tool(toolboxTool, {
        name: toolboxTool.getName(),
        description: toolboxTool.getDescription(),
        schema: toolboxTool.getParamSchema()
    });
    
    // Use these tools in your Langchain/Langraph applications
    const tools = toolboxTools.map(getTool);
    
  1. Install the Toolbox Core SDK (JavaScript, for use with Genkit):

    bash
    npm install @toolbox-sdk/core
    
  2. Load tools:

    javascript
    import { ToolboxClient } from '@toolbox-sdk/core';
    import { googleAI } from '@genkit-ai/googleai';
    import { genkit } from 'genkit';
    
    // Initialise genkit
    const ai = genkit({
        plugins: [
            googleAI({
                apiKey: process.env.GEMINI_API_KEY || process.env.GOOGLE_API_KEY
            })
        ],
        model: googleAI.model('gemini-2.0-flash'),
    });
    
    // update the url to point to your server
    const URL = 'http://127.0.0.1:5000';
    let client = new ToolboxClient(URL);
    
    // these tools can be passed to your application!
    const toolboxTools = await client.loadToolset('toolsetName');
    
    // Define the basics of the tool: name, description, schema and core logic
    const getTool = (toolboxTool) => ai.defineTool({
        name: toolboxTool.getName(),
        description: toolboxTool.getDescription(),
        schema: toolboxTool.getParamSchema()
    }, toolboxTool)
    
    // Use these tools in your Genkit applications
    const tools = toolboxTools.map(getTool);
    
  1. Install the Toolbox Go SDK (framework-agnostic core):

    bash
    go get github.com/googleapis/mcp-toolbox-sdk-go
    
  2. Load tools:

    go
    package main
    
    import (
      "github.com/googleapis/mcp-toolbox-sdk-go/core"
      "context"
    )
    
    func main() {
      // Make sure to add the error checks
      // update the url to point to your server
      URL := "http://127.0.0.1:5000";
      ctx := context.Background()
    
      client, err := core.NewToolboxClient(URL)
    
      // Framework agnostic tools
      tools, err := client.LoadToolset("toolsetName", ctx)
    }
    

    For more detailed instructions on using the Toolbox Go SDK, see the project's README.

  1. Install the Toolbox Go SDK (for use with LangChainGo):

    bash
    go get github.com/googleapis/mcp-toolbox-sdk-go
    
  2. Load tools:

    go
    package main
    
    import (
      "context"
      "encoding/json"
    
      "github.com/googleapis/mcp-toolbox-sdk-go/core"
      "github.com/tmc/langchaingo/llms"
    )
    
    func main() {
      // Make sure to add the error checks
      // update the url to point to your server
      URL := "http://127.0.0.1:5000"
      ctx := context.Background()
    
      client, err := core.NewToolboxClient(URL)
    
      // Framework agnostic tool
      tool, err := client.LoadTool("toolName", ctx)
    
      // Fetch the tool's input schema
      inputschema, err := tool.InputSchema()
    
      var paramsSchema map[string]any
      _ = json.Unmarshal(inputschema, &paramsSchema)
    
      // Use this tool with LangChainGo
      langChainTool := llms.Tool{
        Type: "function",
        Function: &llms.FunctionDefinition{
          Name:        tool.Name(),
          Description: tool.Description(),
          Parameters:  paramsSchema,
        },
      }
    }
    
    
  1. Install the Toolbox Go SDK (for use with Genkit Go):

    bash
    go get github.com/googleapis/mcp-toolbox-sdk-go
    
  2. Load tools:

    go
    package main
    import (
      "context"
      "log"
    
      "github.com/firebase/genkit/go/genkit"
      "github.com/googleapis/mcp-toolbox-sdk-go/core"
      "github.com/googleapis/mcp-toolbox-sdk-go/tbgenkit"
    )
    
    func main() {
      // Make sure to add the error checks
      // Update the url to point to your server
      URL := "http://127.0.0.1:5000"
      ctx := context.Background()
      g := genkit.Init(ctx)
    
      client, err := core.NewToolboxClient(URL)
    
      // Framework agnostic tool
      tool, err := client.LoadTool("toolName", ctx)
    
      // Convert the tool using the tbgenkit package
      // Use this tool with Genkit Go
      genkitTool, err := tbgenkit.ToGenkitTool(tool, g)
      if err != nil {
        log.Fatalf("Failed to convert tool: %v\n", err)
      }
      log.Printf("Successfully converted tool: %s", genkitTool.Name())
    }
    
  1. Install the Toolbox Go SDK (for use with the Go GenAI SDK):

    bash
    go get github.com/googleapis/mcp-toolbox-sdk-go
    
  2. Load tools:

    go
    package main
    
    import (
      "context"
      "encoding/json"
    
      "github.com/googleapis/mcp-toolbox-sdk-go/core"
      "google.golang.org/genai"
    )
    
    func main() {
      // Make sure to add the error checks
      // Update the url to point to your server
      URL := "http://127.0.0.1:5000"
      ctx := context.Background()
    
      client, err := core.NewToolboxClient(URL)
    
      // Framework agnostic tool
      tool, err := client.LoadTool("toolName", ctx)
    
      // Fetch the tool's input schema
      inputschema, err := tool.InputSchema()
    
      var schema *genai.Schema
      _ = json.Unmarshal(inputschema, &schema)
    
      funcDeclaration := &genai.FunctionDeclaration{
        Name:        tool.Name(),
        Description: tool.Description(),
        Parameters:  schema,
      }
    
      // Use this tool with Go GenAI
      genAITool := &genai.Tool{
        FunctionDeclarations: []*genai.FunctionDeclaration{funcDeclaration},
      }
    }
    
  1. Install the Toolbox Go SDK (for use with the OpenAI Go SDK):

    bash
    go get github.com/googleapis/mcp-toolbox-sdk-go
    
  2. Load tools:

    go
    package main
    
    import (
      "context"
      "encoding/json"
    
      "github.com/googleapis/mcp-toolbox-sdk-go/core"
      openai "github.com/openai/openai-go"
    )
    
    func main() {
      // Make sure to add the error checks
      // Update the url to point to your server
      URL := "http://127.0.0.1:5000"
      ctx := context.Background()
    
      client, err := core.NewToolboxClient(URL)
    
      // Framework agnostic tool
      tool, err := client.LoadTool("toolName", ctx)
    
      // Fetch the tool's input schema
      inputschema, err := tool.InputSchema()
    
      var paramsSchema openai.FunctionParameters
      _ = json.Unmarshal(inputschema, &paramsSchema)
    
      // Use this tool with OpenAI Go
      openAITool := openai.ChatCompletionToolParam{
        Function: openai.FunctionDefinitionParam{
          Name:        tool.Name(),
          Description: openai.String(tool.Description()),
          Parameters:  paramsSchema,
        },
      }
    
    }
    
  1. Install the Toolbox Go SDK (for use with ADK Go):

    bash
    go get github.com/googleapis/mcp-toolbox-sdk-go
    
  2. Load tools:

    go
    package main
    
    import (
      "context"
      "log"

      "github.com/googleapis/mcp-toolbox-sdk-go/tbadk"
    )

    func main() {
      // Make sure to add the error checks
      // Update the url to point to your server
      URL := "http://127.0.0.1:5000"
      ctx := context.Background()
      client, err := tbadk.NewToolboxClient(URL)
      if err != nil {
        log.Fatalln("Could not start Toolbox Client", err)
      }

      // Use this tool with ADK Go
      tool, err := client.LoadTool("toolName", ctx)
      if err != nil {
        log.Fatalln("Could not load Toolbox Tool", err)
      }
      _ = tool // pass the loaded tool to your ADK Go agent
    }
    

    For more detailed instructions on using the Toolbox Go SDK, see the project's README.

Using Toolbox with Gemini CLI Extensions

Gemini CLI extensions provide tools for interacting with your data sources directly from the command line. The extensions below are built on top of Toolbox and let you work with your data sources through predefined or custom tools using natural language. Follow each link for detailed usage instructions.

Custom tools defined in your tools.yaml can be used with Gemini CLI through the gemini-cli-extensions/mcp-toolbox extension; other extensions provide prebuilt tools for specific data sources.

Configuration

The primary way to configure Toolbox is through the tools.yaml file. If you have multiple files, you can tell Toolbox which one to load with the --tools-file tools.yaml flag.

You can find more detailed reference documentation for all resource types in the Resources section of the documentation.

Sources

The sources section of your tools.yaml defines what data sources your Toolbox should have access to. Most tools will have at least one source to execute against.

yaml
sources:
  my-pg-source:
    kind: postgres
    host: 127.0.0.1
    port: 5432
    database: toolbox_db
    user: toolbox_user
    password: my-password

For more details on configuring different types of sources, see the Sources documentation.

Tools

The tools section of a tools.yaml defines the actions an agent can take: what kind of tool it is, which source(s) it affects, what parameters it uses, etc.

yaml
tools:
  search-hotels-by-name:
    kind: postgres-sql
    source: my-pg-source
    description: Search for hotels based on name.
    parameters:
      - name: name
        type: string
        description: The name of the hotel.
    statement: SELECT * FROM hotels WHERE name ILIKE '%' || $1 || '%';

For more details on configuring different types of tools, see the Tools documentation.
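
To show how a configured tool is consumed, the hedged sketch below invokes the search-hotels-by-name tool defined above using the Python toolbox-core SDK. It assumes the server is running locally and that loaded tools take their parameters as keyword arguments; "Hilton" is just an example search string.

python
import asyncio
from toolbox_core import ToolboxClient

async def main():
    async with ToolboxClient("http://127.0.0.1:5000") as client:
        # The name matches the tool key in the tools.yaml above.
        tool = await client.load_tool("search-hotels-by-name")
        # The "name" argument is bound to $1 in the tool's SQL statement.
        result = await tool(name="Hilton")
        print(result)

asyncio.run(main())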

Toolsets

The toolsets section of your tools.yaml allows you to define groups of tools that you want to be able to load together. This can be useful for defining different groups based on agent or application.

yaml
toolsets:
    my_first_toolset:
        - my_first_tool
        - my_second_tool
    my_second_toolset:
        - my_second_tool
        - my_third_tool

You can load toolsets by name:

python
# This will load all tools
all_tools = client.load_toolset()

# This will only load the tools listed in 'my_second_toolset'
my_second_toolset = client.load_toolset("my_second_toolset")

Prompts

The prompts section of a tools.yaml defines prompts that can be used for interactions with LLMs.

yaml
prompts:
  code_review:
    description: "Asks the LLM to analyze code quality and suggest improvements."
    messages:
      - content: "Please review the following code for quality, correctness, and potential improvements: \n\n{{.code}}"
    arguments:
      - name: "code"
        description: "The code to review"

For more details on configuring prompts, see the Prompts documentation.

Versioning

This project uses semantic versioning (MAJOR.MINOR.PATCH). Since the project is in a pre-release stage (version 0.x.y), we follow the standard conventions for initial development:

Pre-1.0.0 Versioning

While the major version is 0, the public API should be considered unstable. The version will be incremented as follows:

  • 0.MINOR.PATCH: The MINOR version is incremented when we add new functionality or make breaking, incompatible API changes.
  • 0.MINOR.PATCH: The PATCH version is incremented for backward-compatible bug fixes.

Post-1.0.0 Versioning

Once the project reaches a stable 1.0.0 release, the versioning will follow the more common convention:

  • MAJOR.MINOR.PATCH: Incremented for incompatible API changes.
  • MAJOR.MINOR.PATCH: Incremented for new, backward-compatible functionality.
  • MAJOR.MINOR.PATCH: Incremented for backward-compatible bug fixes.

The public API that this applies to is the CLI associated with Toolbox, the interactions with official SDKs, and the definitions in the tools.yaml file.

Contributing

Contributions are welcome. Please see the CONTRIBUTING guide to get started.

Please note that this project is released with a Contributor Code of Conduct. By participating in this project you agree to abide by its terms. See Contributor Code of Conduct for more information.

Community

Join our Discord community to connect with our developers!


Repository Owner

googleapis (Organization)

Repository Details

Language: Go
Default Branch: main
Size: 486,630 KB
Contributors: 30
License: Apache License 2.0
MCP Verified: Nov 12, 2025

Programming Languages

Go: 97.13%
JavaScript: 1.74%
Shell: 0.42%
CSS: 0.39%
HTML: 0.29%
Dockerfile: 0.03%
SCSS: 0%

Topics

databases genai llms mcp


Related MCPs

Discover similar Model Context Protocol servers

  • Multi-Database MCP Server (by Legion AI)

    Unified multi-database access and AI interaction server with MCP integration.

    Multi-Database MCP Server enables seamless access and querying of diverse databases via a unified API, with native support for the Model Context Protocol (MCP). It supports popular databases such as PostgreSQL, MySQL, SQL Server, and more, and is built for integration with AI assistants and agents. Leveraging the MCP Python SDK, it exposes databases as resources, tools, and prompts for intelligent, context-aware interactions, while delivering zero-configuration schema discovery and secure credential management.

    • 76
    • MCP
    • TheRaLabs/legion-mcp
  • YDB MCP

    MCP server for AI-powered natural language database operations on YDB.

    YDB MCP acts as a Model Context Protocol server enabling YDB databases to be accessed via any LLM supporting MCP. It allows AI-driven and natural language interaction with YDB instances by bridging database operations with language model interfaces. Flexible deployment through uvx, pipx, or pip is supported, along with multiple authentication methods. The integration empowers users to manage YDB databases conversationally through standardized protocols.

    • 24
    • MCP
    • ydb-platform/ydb-mcp
  • MCP 数据库工具 (MCP Database Utilities)

    A secure bridge enabling AI systems safe, read-only access to multiple databases via unified configuration.

    MCP Database Utilities provides a secure, standardized service for AI systems to access and analyze databases like SQLite, MySQL, and PostgreSQL using a unified YAML-based configuration. It enforces strict read-only operations, local processing, and credential protection to ensure data privacy and integrity. The tool is suitable for entities focused on data privacy and minimizes risks by isolating database connections and masking sensitive data. Designed for easy integration, it supports multiple installation options and advanced capabilities such as schema analysis and table browsing.

    • 85
    • MCP
    • donghao1393/mcp-dbutils
  • greptimedb-mcp-server

    A Model Context Protocol (MCP) server for secure, structured AI access to GreptimeDB.

    greptimedb-mcp-server implements a Model Context Protocol (MCP) server for GreptimeDB, enabling AI assistants to securely explore and analyze database contents. It provides controlled operations such as listing tables, reading data, and executing SQL queries, ensuring responsible access. The server offers integration with Claude Desktop and supports prompt management for structured AI interactions.

    • 23
    • MCP
    • GreptimeTeam/greptimedb-mcp-server
  • MCP libSQL by xexr

    Secure, protocol-compliant libSQL database server for MCP-enabled clients.

    MCP libSQL by xexr provides a Model Context Protocol (MCP) server designed for secure database access and management via libSQL. It enables database operations—such as querying, table management, and schema inspection—through standardized MCP tools, ensuring compatibility with clients like Claude Desktop and Cursor. The project emphasizes robust security validation, audit logging, and comprehensive error handling. Users benefit from production-ready deployment, extensive test coverage, and streamlined integration with MCP-compatible platforms.

    • 16
    • MCP
    • Xexr/mcp-libsql
  • XiYan MCP Server

    A server enabling natural language queries to SQL databases via the Model Context Protocol.

    XiYan MCP Server is a Model Context Protocol (MCP) compliant server that allows users to query SQL databases such as MySQL and PostgreSQL using natural language. It leverages the XiYanSQL model, providing state-of-the-art text-to-SQL translation and supports both general LLMs and local deployment for enhanced security. The server lists available database tables as resources and can read table contents, making it simple to integrate with different applications.

    • 218
    • MCP
    • XGenerationLab/xiyan_mcp_server