PGMCP
Natural language PostgreSQL interface via the Model Context Protocol.
PGMCP - PostgreSQL Model Context Protocol Server
PGMCP connects AI assistants to any PostgreSQL database through natural language queries. Ask questions in plain English and get structured SQL results with automatic streaming and robust error handling.
Works with: Cursor, Claude Desktop, VS Code extensions, and any MCP-compatible client
Quick Start
PGMCP connects to your existing PostgreSQL database and makes it accessible to AI assistants through natural language queries.
Prerequisites
- PostgreSQL database (existing database with your schema)
- OpenAI API key (optional, for AI-powered SQL generation)
Basic Usage
# Set up environment variables
export DATABASE_URL="postgres://user:password@localhost:5432/your-existing-db"
export OPENAI_API_KEY="your-api-key" # Optional
# Run server (using pre-compiled binary)
./pgmcp-server
# Test with client in another terminal
./pgmcp-client -ask "What tables do I have?" -format table
./pgmcp-client -ask "Who is the customer that has placed the most orders?" -format table
./pgmcp-client -search "john" -format table
Here is how it works:
                     👤 User / AI Assistant
                               │
                               │ "Who are the top customers?"
                               ▼
┌──────────────────────────────────────────────────────────────┐
│                        Any MCP Client                        │
│                                                              │
│  PGMCP CLI  │  Cursor  │  Claude Desktop  │  VS Code  │ ...  │
│  JSON/CSV   │  Chat    │  AI Assistant    │  Editor   │      │
└──────────────────────────────────────────────────────────────┘
                               │
                               │ Streamable HTTP / MCP Protocol
                               ▼
┌──────────────────────────────────────────────────────────────┐
│                         PGMCP Server                         │
│                                                              │
│  🔒 Security         🧠 AI Engine         🌊 Streaming       │
│  • Input Valid      • Schema Cache      • Auto-Pagination    │
│  • Audit Log        • OpenAI API        • Memory Management  │
│  • SQL Guard        • Error Recovery    • Connection Pool    │
└──────────────────────────────────────────────────────────────┘
                               │
                               │ Read-Only SQL Queries
                               ▼
┌──────────────────────────────────────────────────────────────┐
│                   Your PostgreSQL Database                   │
│                                                              │
│         Any Schema: E-commerce, Analytics, CRM, etc.         │
│             Tables • Views • Indexes • Functions             │
└──────────────────────────────────────────────────────────────┘
External AI Services:
OpenAI API • Anthropic • Local LLMs (Ollama, etc.)
Key Benefits:
✅ Works with ANY PostgreSQL database (no assumptions about schema)
✅ No schema modifications required
✅ Read-only access (100% safe)
✅ Automatic streaming for large results
✅ Intelligent query understanding (singular vs plural)
✅ Robust error handling (graceful AI failure recovery)
✅ PostgreSQL case sensitivity support (mixed-case tables)
✅ Production-ready security and performance
✅ Universal database compatibility
✅ Multiple output formats (table, JSON, CSV)
✅ Free-text search across all columns
✅ Authentication support
✅ Comprehensive testing suite
Features
- Natural Language to SQL: Ask questions in plain English
- Automatic Streaming: Handles large result sets automatically
- Safe Read-Only Access: Prevents any write operations
- Text Search: Search across all text columns
- Multiple Output Formats: Table, JSON, and CSV
- PostgreSQL Case Sensitivity: Handles mixed-case table names correctly
- Universal Compatibility: Works with any PostgreSQL database
Environment Variables
Required:
- DATABASE_URL: PostgreSQL connection string to your existing database

Optional:
- OPENAI_API_KEY: OpenAI API key for AI-powered SQL generation
- OPENAI_MODEL: Model to use (default: "gpt-4o-mini")
- HTTP_ADDR: Server address (default: ":8080")
- HTTP_PATH: MCP endpoint path (default: "/mcp")
- AUTH_BEARER: Bearer token for authentication
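The defaults above can be read at startup with a few lines of Go. This is only an illustrative sketch, assuming a hypothetical getenv helper; it is not PGMCP's actual startup code:

package main

import (
    "log"
    "os"
)

// getenv returns the value of key, or fallback when the variable is unset.
func getenv(key, fallback string) string {
    if v := os.Getenv(key); v != "" {
        return v
    }
    return fallback
}

func main() {
    if os.Getenv("DATABASE_URL") == "" {
        log.Fatal("DATABASE_URL is required")
    }
    model := getenv("OPENAI_MODEL", "gpt-4o-mini")
    addr := getenv("HTTP_ADDR", ":8080")
    path := getenv("HTTP_PATH", "/mcp")
    log.Printf("would serve MCP at %s%s using model %s", addr, path, model)
}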
Installation
Download Pre-compiled Binaries
- Go to GitHub Releases
- Download the binary for your platform (Linux, macOS, Windows)
- Extract and run:
# Example for macOS/Linux
tar xzf pgmcp_*.tar.gz
cd pgmcp_*
./pgmcp-server
Alternative Options
# Homebrew (macOS/Linux) - Available after first release
brew tap subnetmarco/homebrew-tap
brew install pgmcp
# Build from source
go build -o pgmcp-server ./server
go build -o pgmcp-client ./client
Docker/Kubernetes
# Docker
docker run -e DATABASE_URL="postgres://user:pass@host:5432/db" \
-p 8080:8080 ghcr.io/subnetmarco/pgmcp:latest
# Kubernetes (see examples/ directory for full manifests)
kubectl create secret generic pgmcp-secret \
--from-literal=database-url="postgres://user:pass@host:5432/db"
kubectl apply -f examples/k8s/
Quick Start with the Example Database
# Set up database (optional - works with any existing PostgreSQL database)
export DATABASE_URL="postgres://user:password@localhost:5432/mydb"
psql $DATABASE_URL < schema.sql
# Run server
export OPENAI_API_KEY="your-api-key"
./pgmcp-server
# Test with client
./pgmcp-client -ask "Who is the user that places the most orders?" -format table
./pgmcp-client -ask "Show me the top 40 most reviewed items in the marketplace" -format table
Usage Examples
# Ask questions in natural language
./pgmcp-client -ask "What are the top 5 customers?" -format table
./pgmcp-client -ask "How many orders were placed today?" -format json
# Search across all text fields
./pgmcp-client -search "john" -format table
# Multiple questions at once
./pgmcp-client -ask "Show tables" -ask "Count users" -format table
# Different output formats
./pgmcp-client -ask "Export all data" -format csv -max-rows 1000
Example Database
The project includes two schemas:
- schema.sql: Full Amazon-like marketplace with 5,000+ records
- schema_minimal.sql: Minimal test schema with a mixed-case "Categories" table
Key features:
- Mixed-case table names ("Categories") for testing case sensitivity (see the sketch below)
- Composite primary keys (order_items) for testing AI assumptions
- Realistic relationships and data types
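To see why the mixed-case name matters: PostgreSQL folds unquoted identifiers to lowercase, so "Categories" must be double-quoted in SQL. A minimal sketch using the pgx driver against the example schema (illustrative only, not PGMCP's code):

package main

import (
    "context"
    "fmt"
    "log"
    "os"

    "github.com/jackc/pgx/v5"
)

func main() {
    ctx := context.Background()
    conn, err := pgx.Connect(ctx, os.Getenv("DATABASE_URL"))
    if err != nil {
        log.Fatal(err)
    }
    defer conn.Close(ctx)

    // SELECT count(*) FROM Categories would fail: the unquoted name folds to "categories".
    // The mixed-case table created by schema_minimal.sql must be double-quoted.
    var n int
    if err := conn.QueryRow(ctx, `SELECT count(*) FROM "Categories"`).Scan(&n); err != nil {
        log.Fatal(err)
    }
    fmt.Printf("\"Categories\" has %d rows\n", n)
}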
Use your own database:
export DATABASE_URL="postgres://user:pass@host:5432/your_db"
./pgmcp-server
./pgmcp-client -ask "What tables do I have?"
AI Error Handling
When AI generates incorrect SQL, PGMCP handles it gracefully:
{
  "error": "Column not found in generated query",
  "suggestion": "Try rephrasing your question or ask about specific tables",
  "original_sql": "SELECT non_existent_column FROM table..."
}
Instead of crashing, the system provides helpful feedback and continues operating.
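A rough sketch of how such an error can be turned into the structured response shown above (the type and function names here are illustrative, not PGMCP's internal API):

package example

import "encoding/json"

// QueryError mirrors the JSON shape shown above.
type QueryError struct {
    Error       string `json:"error"`
    Suggestion  string `json:"suggestion"`
    OriginalSQL string `json:"original_sql"`
}

// feedback converts a database error for AI-generated SQL into user-facing
// feedback instead of failing the whole request.
func feedback(execErr error, generatedSQL string) ([]byte, error) {
    return json.MarshalIndent(QueryError{
        Error:       execErr.Error(),
        Suggestion:  "Try rephrasing your question or ask about specific tables",
        OriginalSQL: generatedSQL,
    }, "", "  ")
}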
MCP Integration
Cursor Integration
# Start server
export DATABASE_URL="postgres://user:pass@localhost:5432/your_db"
./pgmcp-server
Add to Cursor settings:
{
  "mcp.servers": {
    "pgmcp": {
      "transport": {
        "type": "http",
        "url": "http://localhost:8080/mcp"
      }
    }
  }
}
Claude Desktop Integration
Edit ~/.config/claude-desktop/claude_desktop_config.json:
{
  "mcpServers": {
    "pgmcp": {
      "transport": {
        "type": "http",
        "url": "http://localhost:8080/mcp"
      }
    }
  }
}
API Tools
- ask: Natural language questions → SQL queries with automatic streaming (see the request sketch below)
- search: Free-text search across all database text columns
- stream: Advanced streaming for very large result sets with pagination
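For orientation, this is roughly the JSON-RPC message an MCP client posts to call the ask tool over the HTTP transport. It is a sketch only: a real client performs the MCP initialize handshake first, and the argument name "question" is an assumption rather than PGMCP's documented schema:

package main

import (
    "bytes"
    "encoding/json"
    "fmt"
    "io"
    "net/http"
)

func main() {
    // JSON-RPC 2.0 body for an MCP tools/call invocation of the "ask" tool.
    body, _ := json.Marshal(map[string]any{
        "jsonrpc": "2.0",
        "id":      1,
        "method":  "tools/call",
        "params": map[string]any{
            "name":      "ask",
            "arguments": map[string]any{"question": "What are the top 5 customers?"}, // assumed field name
        },
    })

    req, _ := http.NewRequest("POST", "http://localhost:8080/mcp", bytes.NewReader(body))
    req.Header.Set("Content-Type", "application/json")
    req.Header.Set("Accept", "application/json, text/event-stream")
    // req.Header.Set("Authorization", "Bearer <token>") // when AUTH_BEARER is set

    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        panic(err)
    }
    defer resp.Body.Close()
    out, _ := io.ReadAll(resp.Body)
    fmt.Println(string(out))
}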
Safety Features
- Read-Only Enforcement: Blocks write operations (INSERT, UPDATE, DELETE, etc.)
- Query Timeouts: Prevents long-running queries
- Input Validation: Sanitizes and validates all user input
- Transaction Isolation: All queries run in read-only transactions (see the sketch below)
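A minimal sketch of how read-only enforcement and query timeouts are typically combined with the pgx driver (illustrative; not necessarily PGMCP's exact implementation):

package example

import (
    "context"
    "time"

    "github.com/jackc/pgx/v5"
    "github.com/jackc/pgx/v5/pgxpool"
)

// countRows runs untrusted SQL inside a read-only transaction with a deadline,
// so writes are rejected by PostgreSQL itself and slow queries are cancelled.
func countRows(pool *pgxpool.Pool, sql string) (int, error) {
    ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
    defer cancel()

    // Any INSERT/UPDATE/DELETE/DDL fails with "cannot execute ... in a read-only transaction".
    tx, err := pool.BeginTx(ctx, pgx.TxOptions{AccessMode: pgx.ReadOnly})
    if err != nil {
        return 0, err
    }
    defer tx.Rollback(ctx) // never commit; nothing to persist

    rows, err := tx.Query(ctx, sql)
    if err != nil {
        return 0, err
    }
    defer rows.Close()

    n := 0
    for rows.Next() {
        n++
    }
    return n, rows.Err()
}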
Testing
# Unit tests
go test ./server -v
# Integration tests (requires PostgreSQL)
go test ./server -tags=integration -v
License
Apache 2.0 - See LICENSE file for details.
Related Projects
- Model Context Protocol - The underlying protocol specification
- MCP Go SDK - Go implementation of MCP
PGMCP makes your PostgreSQL database accessible to AI assistants through natural language while maintaining security through read-only access controls.