OpenLink MCP Server for ODBC
MCP-compatible ODBC server enabling LLMs to access diverse databases.
README
This document covers the setup and use of a generic ODBC server for the Model Context Protocol (MCP), referred to as an mcp-odbc server. It has been developed to provide Large Language Models (LLMs) with transparent access to ODBC-accessible data sources via a Data Source Name (DSN) configured for a specific ODBC Connector (also called an ODBC Driver).
Server Implementation
This MCP Server for ODBC is a small TypeScript layer built on top of node-odbc. It routes calls to the host system's local ODBC Driver Manager via Node.js (specifically using `npx` for TypeScript execution).
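The project's `src/main.ts` is the authoritative implementation; the following is only a minimal sketch of the pattern it sits on, assuming the TypeScript MCP SDK and node-odbc, with a hypothetical `run_sql` tool used purely for illustration (the actual tools exposed by the server are listed under Usage below).

```typescript
// sketch.ts: minimal illustration only, not the project's actual source.
// An MCP tool whose handler forwards SQL to the local ODBC Driver Manager via node-odbc.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";
import odbc from "odbc";
import "dotenv/config";

const server = new McpServer({ name: "mcp-odbc-sketch", version: "0.1.0" });

// "run_sql" is a hypothetical tool name; the real server registers the tools listed below.
server.tool(
  "run_sql",
  "Execute a SQL statement against an ODBC DSN and return the rows as JSON.",
  {
    query: z.string(),
    dsn: z.string().default(process.env.ODBC_DSN ?? "Local Virtuoso"),
  },
  async ({ query, dsn }) => {
    // The Driver Manager resolves the DSN using the INI file named by ODBCINI.
    const connection = await odbc.connect(
      `DSN=${dsn};UID=${process.env.ODBC_USER};PWD=${process.env.ODBC_PASSWORD}`
    );
    try {
      const rows = await connection.query(query);
      return { content: [{ type: "text", text: JSON.stringify(rows) }] };
    } finally {
      await connection.close();
    }
  }
);

// Serve over stdio so MCP clients (Claude Desktop, Cline, Cursor, the Inspector) can spawn it.
await server.connect(new StdioServerTransport());
```

Each tool handler in this pattern simply builds an ODBC connection string from the environment bindings described below and hands the query text to the local Driver Manager.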
Operating Environment Setup & Prerequisites
While the examples that follow are oriented toward the Virtuoso ODBC Connector, this guide will also work with other ODBC Connectors. We strongly encourage code contributions and submissions of usage demos related to other database management systems (DBMS) for incorporation into this project.
Key System Components
- Check the `node.js` version. If it's not `21.1.0` or higher, upgrade or install explicitly using:
  ```sh
  nvm install v21.1.0
  ```
- Install MCP components using:
  ```sh
  npm install @modelcontextprotocol/sdk zod tsx odbc dotenv
  ```
- Set the `nvm` version using:
  ```sh
  nvm alias default 21.1.0
  ```
Installation
- Clone the repository:
  ```sh
  git clone https://github.com/OpenLinkSoftware/mcp-odbc-server.git
  ```
- Change directory:
  ```sh
  cd mcp-odbc-server
  ```
- Initialize the project:
  ```sh
  npm init -y
  ```
- Install the required packages:
  ```sh
  npm install @modelcontextprotocol/sdk zod tsx odbc dotenv
  ```
unixODBC Runtime Environment Checks
- Check the installation configuration (i.e., the location of key INI files) by running:
  ```sh
  odbcinst -j
  ```
- List available data source names (DSNs) by running:
  ```sh
  odbcinst -q -s
  ```
Environment Variables
As a good security practice, you should use the `.env` file situated in the same directory as the mcp-odbc-server to set bindings for the ODBC Data Source Name (`ODBC_DSN`), the User (`ODBC_USER`), the Password (`ODBC_PASSWORD`), the ODBC INI file (`ODBCINI`), and, if you want to use the OpenLink AI Layer (OPAL) via ODBC, the target Large Language Model (LLM) API Key (`API_KEY`).
```sh
API_KEY=sk-xxx
ODBC_DSN=Local Virtuoso
ODBC_USER=dba
ODBC_PASSWORD=dba
ODBCINI=/Library/ODBC/odbc.ini
```
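Before wiring the server into an MCP client, it can be useful to confirm that these bindings resolve to a working connection. The following throwaway script is only a sketch (it is not part of the repository; the file name and table-listing check are assumptions) and can be run with `npx tsx check-dsn.ts`:

```typescript
// check-dsn.ts: hypothetical connectivity check, not part of the repository.
import "dotenv/config";
import odbc from "odbc";

const dsn = process.env.ODBC_DSN ?? "Local Virtuoso";
const user = process.env.ODBC_USER ?? "dba";
const password = process.env.ODBC_PASSWORD ?? "dba";

// The ODBC Driver Manager resolves the DSN via the INI file named in ODBCINI.
const connection = await odbc.connect(`DSN=${dsn};UID=${user};PWD=${password}`);

// Ask the driver for its table catalog as a cheap round-trip test.
const tables = await connection.tables(null, null, null, "TABLE");
console.log(`Connected to "${dsn}"; catalog reports ${tables.length} tables.`);

await connection.close();
```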
Usage
Tools
After successful installation, the following tools will be available to MCP client applications.
Overview
| name | description |
|---|---|
| get_schemas | List database schemas accessible to the connected database management system (DBMS). |
| get_tables | List tables associated with a selected database schema. |
| describe_table | Provide the description of a table associated with a designated database schema. This includes information about column names, data types, null handling, autoincrement, primary key, and foreign keys. |
| filter_table_names | List tables associated with a selected database schema, based on a substring pattern from the `q` input field. |
| query_database | Execute a SQL query and return results in JSON Lines (JSONL) format. |
| execute_query | Execute a SQL query and return results in JSON Lines (JSONL) format. |
| execute_query_md | Execute a SQL query and return results in Markdown table format. |
| spasql_query | Execute a SPASQL query and return results. |
| sparql_query | Execute a SPARQL query and return results. |
| virtuoso_support_ai | Interact with the Virtuoso Support Assistant/Agent, a Virtuoso-specific feature for interacting with LLMs. |
Detailed Description
- `get_schemas`
  - Retrieve and return a list of all schema names from the connected database.
  - Input parameters:
    - `user` (string, optional): Database username. Defaults to `"demo"`.
    - `password` (string, optional): Database password. Defaults to `"demo"`.
    - `dsn` (string, optional): ODBC data source name. Defaults to `"Local Virtuoso"`.
  - Returns a JSON string array of schema names.

- `get_tables`
  - Retrieve and return a list containing information about tables in a specified schema. If no schema is provided, uses the connection's default schema.
  - Input parameters:
    - `schema` (string, optional): Database schema to filter tables. Defaults to the connection default.
    - `user` (string, optional): Database username. Defaults to `"demo"`.
    - `password` (string, optional): Database password. Defaults to `"demo"`.
    - `dsn` (string, optional): ODBC data source name. Defaults to `"Local Virtuoso"`.
  - Returns a JSON string containing table information (e.g., `TABLE_CAT`, `TABLE_SCHEM`, `TABLE_NAME`, `TABLE_TYPE`).

- `filter_table_names`
  - Filters and returns information about tables whose names contain a specific substring.
  - Input parameters:
    - `q` (string, required): The substring to search for within table names.
    - `schema` (string, optional): Database schema to filter tables. Defaults to the connection default.
    - `user` (string, optional): Database username. Defaults to `"demo"`.
    - `password` (string, optional): Database password. Defaults to `"demo"`.
    - `dsn` (string, optional): ODBC data source name. Defaults to `"Local Virtuoso"`.
  - Returns a JSON string containing information for matching tables.

- `describe_table`
  - Retrieve and return detailed information about the columns of a specific table.
  - Input parameters:
    - `schema` (string, required): The database schema name containing the table.
    - `table` (string, required): The name of the table to describe.
    - `user` (string, optional): Database username. Defaults to `"demo"`.
    - `password` (string, optional): Database password. Defaults to `"demo"`.
    - `dsn` (string, optional): ODBC data source name. Defaults to `"Local Virtuoso"`.
  - Returns a JSON string describing the table's columns (e.g., `COLUMN_NAME`, `TYPE_NAME`, `COLUMN_SIZE`, `IS_NULLABLE`).

- `query_database`
  - Execute a standard SQL query and return the results in JSON format.
  - Input parameters:
    - `query` (string, required): The SQL query string to execute.
    - `user` (string, optional): Database username. Defaults to `"demo"`.
    - `password` (string, optional): Database password. Defaults to `"demo"`.
    - `dsn` (string, optional): ODBC data source name. Defaults to `"Local Virtuoso"`.
  - Returns query results as a JSON string.

- `query_database_md`
  - Execute a standard SQL query and return the results formatted as a Markdown table.
  - Input parameters:
    - `query` (string, required): The SQL query string to execute.
    - `user` (string, optional): Database username. Defaults to `"demo"`.
    - `password` (string, optional): Database password. Defaults to `"demo"`.
    - `dsn` (string, optional): ODBC data source name. Defaults to `"Local Virtuoso"`.
  - Returns query results as a Markdown table string.

- `query_database_jsonl`
  - Execute a standard SQL query and return the results in JSON Lines (JSONL) format (one JSON object per line).
  - Input parameters:
    - `query` (string, required): The SQL query string to execute.
    - `user` (string, optional): Database username. Defaults to `"demo"`.
    - `password` (string, optional): Database password. Defaults to `"demo"`.
    - `dsn` (string, optional): ODBC data source name. Defaults to `"Local Virtuoso"`.
  - Returns query results as a JSONL string.

- `spasql_query`
  - Execute a SPASQL (SQL/SPARQL hybrid) query and return results. This is a Virtuoso-specific feature.
  - Input parameters:
    - `query` (string, required): The SPASQL query string.
    - `max_rows` (number, optional): Maximum number of rows to return. Defaults to `20`.
    - `timeout` (number, optional): Query timeout in milliseconds. Defaults to `30000`, i.e., 30 seconds.
    - `user` (string, optional): Database username. Defaults to `"demo"`.
    - `password` (string, optional): Database password. Defaults to `"demo"`.
    - `dsn` (string, optional): ODBC data source name. Defaults to `"Local Virtuoso"`.
  - Returns the result from the underlying stored procedure call (e.g., `Demo.demo.execute_spasql_query`).

- `sparql_query`
  - Execute a SPARQL query and return results. This is a Virtuoso-specific feature.
  - Input parameters:
    - `query` (string, required): The SPARQL query string.
    - `format` (string, optional): Desired result format. Defaults to `'json'`.
    - `timeout` (number, optional): Query timeout in milliseconds. Defaults to `30000`, i.e., 30 seconds.
    - `user` (string, optional): Database username. Defaults to `"demo"`.
    - `password` (string, optional): Database password. Defaults to `"demo"`.
    - `dsn` (string, optional): ODBC data source name. Defaults to `"Local Virtuoso"`.
  - Returns the result from the underlying function call (e.g., `"UB".dba."sparqlQuery"`).

- `virtuoso_support_ai`
  - Utilizes a Virtuoso-specific AI Assistant function, passing a prompt and optional API key. This is a Virtuoso-specific feature.
  - Input parameters:
    - `prompt` (string, required): The prompt text for the AI function.
    - `api_key` (string, optional): API key for the AI service. Defaults to `"none"`.
    - `user` (string, optional): Database username. Defaults to `"demo"`.
    - `password` (string, optional): Database password. Defaults to `"demo"`.
    - `dsn` (string, optional): ODBC data source name. Defaults to `"Local Virtuoso"`.
  - Returns the result from the AI Support Assistant function call (e.g., `DEMO.DBA.OAI_VIRTUOSO_SUPPORT_AI`).
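These tools can also be exercised programmatically, without an LLM, from any MCP client. The sketch below assumes the TypeScript MCP SDK's stdio client, placeholder paths, and the `query_database` tool as documented above; adjust names and paths to your installation.

```typescript
// call-tool.ts: sketch of invoking the server's tools from a TypeScript MCP client.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server over stdio, pointing the ODBC Driver Manager at the usual INI file.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["tsx", "/path/to/mcp-odbc-server/src/main.ts"],
  env: { ...(process.env as Record<string, string>), ODBCINI: "/Library/ODBC/odbc.ini" },
});

const client = new Client({ name: "odbc-tool-demo", version: "0.1.0" });
await client.connect(transport);

// List the registered tools (get_schemas, get_tables, describe_table, ...).
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// Run a query through query_database; user, password, and dsn fall back to their defaults.
const result = await client.callTool({
  name: "query_database",
  arguments: { query: "SELECT TOP 5 * FROM Demo..Customers" },
});
console.log(result.content);

await client.close();
```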
Basic Installation Testing & Troubleshooting
MCP Inspector Tool
Canonical MCP Inspector Tool Edition
- Start the inspector from the mcp-server directory/folder using the following command:
  ```sh
  ODBCINI=/Library/ODBC/odbc.ini npx -y @modelcontextprotocol/inspector npx tsx ./src/main.ts
  ```
- Click on the "Connect" button, then click on the "Tools" tab to get started.
OpenLink MCP Inspector Tool Edition
This is a fork of the canonical edition that includes a JSON handling bug fix related to use with this MCP Server.
- Run:
  ```sh
  git clone git@github.com:OpenLinkSoftware/inspector.git
  cd inspector
  ```
- Run:
  ```sh
  npm run start
  ```
- Provide the following value in the `Arguments` input field of the MCP Inspector's UI at http://localhost:6274:
  ```sh
  tsx /path/to/mcp-odbc-server/src/main.ts
  ```
- Click on the `Connect` button to initialize your session with the designated MCP Server.
Apple Silicon (ARM64) Compatibility Issues with the MCP ODBC Server
Node x86_64 vs arm64 Conflict Issue
The x86_64 edition of `node` may be installed rather than the arm64 edition, while the ODBC bridge and MCP server are arm64-based components.
You can solve this problem by performing the following steps:
- Uninstall the x86_64 edition of `node` by running:
  ```sh
  nvm uninstall 21.1.0
  ```
- Confirm your current shell is in arm64 mode by running:
  ```sh
  arch
  ```
  If that returns `x86_64`, then run the following command to change the active mode:
  ```sh
  arch arm64
  ```
- Install the arm64 edition of `node` by running:
  ```sh
  nvm install 21.1.0
  ```
Node to ODBC Bridge Layer Incompatibility
When attempting to use a Model Context Protocol (MCP) ODBC Server on Apple Silicon machines, you may encounter architecture mismatch errors. These occur because the Node.js ODBC native module (odbc.node) is compiled for ARM64 architecture, but the x86_64-based edition of the unixODBC runtime is being loaded.
Typical error message:

```
Error: dlopen(...odbc.node, 0x0001): tried: '...odbc.node' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64e' or 'arm64'))
```
You can solve this problem by performing the following steps:
- Verify your Node.js is running in ARM64 mode:
  ```bash
  node -p "process.arch"
  # Should output: arm64
  ```
- Install unixODBC for ARM64:
  ```bash
  # Verify Homebrew is running in ARM64 mode
  which brew
  # Should point to /opt/homebrew/bin/brew

  # Remove existing unixODBC
  brew uninstall --force unixodbc

  # Install ARM64 version
  arch -arm64 brew install unixodbc
  ```
- Rebuild the Node.js ODBC module for ARM64:
  ```bash
  # Navigate to your project
  cd /path/to/mcp-odbc-server

  # Remove existing module
  rm -rf node_modules/odbc

  # Set architecture environment variable
  export npm_config_arch=arm64

  # Reinstall with force build
  npm install odbc --build-from-source
  ```
- Verify the module is now ARM64:
  ```bash
  file node_modules/odbc/lib/bindings/napi-v8/odbc.node
  # Should show "arm64" instead of "x86_64"
  ```
Key Points
- Both unixODBC and the Node.js ODBC module must be ARM64-compatible.
- Using environment variables (`export npm_config_arch=arm64`) is more reliable than `npm config` commands.
- Always verify the architecture with the `file` command or `node -p "process.arch"`.
- When using Homebrew on Apple Silicon, commands can be prefixed with `arch -arm64` to force use of ARM64 binaries.
MCP Application Usage
Claude Desktop Configuration
The path for this config file is: `~{username}/Library/Application Support/Claude/claude_desktop_config.json`.
```json
{
  "mcpServers": {
    "ODBC": {
      "command": "/path/to/.nvm/versions/node/v21.1.0/bin/node",
      "args": [
        "/path/to/mcp-odbc-server/node_modules/.bin/tsx",
        "/path/to/mcp-odbc-server/src/main.ts"
      ],
      "env": {
        "ODBCINI": "/Library/ODBC/odbc.ini",
        "NODE_VERSION": "v21.1.0",
        "PATH": "~/.nvm/versions/node/v21.1.0/bin:${PATH}"
      },
      "disabled": false,
      "autoApprove": []
    }
  }
}
```
Claude Desktop Usage
- Start the application.
- Apply the configuration (from above) via the Settings | Developer user interface.
- Ensure you have a working ODBC connection to a Data Source Name (DSN).
- Present a prompt requesting query execution, e.g.:
  `Execute the following query: SELECT TOP 5 * from Demo..Customers`
Cline (Visual Studio Extension) Configuration
The path for this config file is: `~{username}/Library/Application Support/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json`.
```json
{
  "mcpServers": {
    "ODBC": {
      "command": "/path/to/.nvm/versions/node/v21.1.0/bin/node",
      "args": [
        "/path/to/mcp-odbc-server/node_modules/.bin/tsx",
        "/path/to/mcp-odbc-server/src/main.ts"
      ],
      "env": {
        "ODBCINI": "/Library/ODBC/odbc.ini",
        "NODE_VERSION": "v21.1.0",
        "PATH": "/path/to/.nvm/versions/node/v21.1.0/bin:${PATH}"
      },
      "disabled": false,
      "autoApprove": []
    }
  }
}
```
Cline (Visual Studio Extension) Usage
- Use Shift+Command+`P` to open the Command Palette.
- Type in: `Cline`.
- Select: `Cline View`, which opens the Cline UI in the VSCode sidebar.
- Use the four-squares icon to access the UI for installing and configuring MCP servers.
- Apply the Cline Config (from above).
- Return to the extension's main UI and start a new task requesting processing of the following prompt:
  "Execute the following query: SELECT TOP 5 * from Demo..Customers"
Cursor Configuration
Use the settings gear to open the configuration menu, which includes the MCP menu item for registering and configuring MCP servers.
Cursor Usage
- Use the Command+`I` or Control+`I` key combination to open the Chat Interface.
- Select `Agent` from the drop-down at the bottom left of the UI, where the default is `Ask`.
- Enter your prompt, qualifying the use of the mcp-server for ODBC using the pattern: `@odbc {rest-of-prompt}`.
- Click on "Accept" to execute the prompt.
Related MCPs
Discover similar Model Context Protocol servers
MCP Server ODBC via SQLAlchemy
A lightweight FastAPI server enabling model context protocol access to ODBC-compatible databases via SQLAlchemy.
MCP Server ODBC via SQLAlchemy provides a lightweight, FastAPI-based server that implements the Model Context Protocol (MCP) for accessing ODBC-compatible databases. It bridges AI tools and database systems by exposing standardized endpoints for fetching schema, tables, table descriptions, and executing queries or stored procedures. With support for Virtuoso, PostgreSQL, MySQL, and SQLite, it allows seamless, structured, and context-aware database access for context-driven applications.
- ⭐ 19
- MCP
- OpenLinkSoftware/mcp-sqlalchemy-server
Multi-Database MCP Server (by Legion AI)
Unified multi-database access and AI interaction server with MCP integration.
Multi-Database MCP Server enables seamless access and querying of diverse databases via a unified API, with native support for the Model Context Protocol (MCP). It supports popular databases such as PostgreSQL, MySQL, SQL Server, and more, and is built for integration with AI assistants and agents. Leveraging the MCP Python SDK, it exposes databases as resources, tools, and prompts for intelligent, context-aware interactions, while delivering zero-configuration schema discovery and secure credential management.
- ⭐ 76
- MCP
- TheRaLabs/legion-mcp
MCP libSQL by xexr
Secure, protocol-compliant libSQL database server for MCP-enabled clients.
MCP libSQL by xexr provides a Model Context Protocol (MCP) server designed for secure database access and management via libSQL. It enables database operations—such as querying, table management, and schema inspection—through standardized MCP tools, ensuring compatibility with clients like Claude Desktop and Cursor. The project emphasizes robust security validation, audit logging, and comprehensive error handling. Users benefit from production-ready deployment, extensive test coverage, and streamlined integration with MCP-compatible platforms.
- ⭐ 16
- MCP
- Xexr/mcp-libsql
XiYan MCP Server
A server enabling natural language queries to SQL databases via the Model Context Protocol.
XiYan MCP Server is a Model Context Protocol (MCP) compliant server that allows users to query SQL databases such as MySQL and PostgreSQL using natural language. It leverages the XiYanSQL model, providing state-of-the-art text-to-SQL translation and supports both general LLMs and local deployment for enhanced security. The server lists available database tables as resources and can read table contents, making it simple to integrate with different applications.
- ⭐ 218
- MCP
- XGenerationLab/xiyan_mcp_server
Supabase MCP Server
Connect Supabase projects to AI assistants using the Model Context Protocol.
Supabase MCP Server enables direct, secure integration between Supabase projects and AI assistants such as Cursor, Claude, and Windsurf. Leveraging the Model Context Protocol, it provides standardized endpoints for external LLMs to perform tasks like managing tables, fetching configurations, and querying data on Supabase. The server supports OAuth 2.1 Dynamic Client Registration and offers easy setup with feature groups and popular client installers for local, cloud, and self-hosted environments.
- ⭐ 2,263
- MCP
- supabase-community/supabase-mcp
YDB MCP
MCP server for AI-powered natural language database operations on YDB.
YDB MCP acts as a Model Context Protocol server enabling YDB databases to be accessed via any LLM supporting MCP. It allows AI-driven and natural language interaction with YDB instances by bridging database operations with language model interfaces. Flexible deployment through uvx, pipx, or pip is supported, along with multiple authentication methods. The integration empowers users to manage YDB databases conversationally through standardized protocols.
- ⭐ 24
- MCP
- ydb-platform/ydb-mcp