gqai
graphql → ai
gqai is a lightweight proxy that exposes GraphQL operations as Model Context Protocol (MCP) tools for AI clients like Claude, Cursor, and ChatGPT. Define tools using regular GraphQL queries and mutations against your GraphQL backend, and gqai automatically generates an MCP server for you.
🔌 Powered by your GraphQL backend
⚙️ Driven by .graphqlrc.yml + plain .graphql files
✨ Features
- 🧰 Define tools using GraphQL operations
- 🗂 Automatically discover operations from `.graphqlrc.yml`
- 🧾 Tool metadata compatible with OpenAI function calling / MCP
🛠️ Installation
```shell
go install github.com/fotoetienne/gqai@latest
```
🚀 Quick Start
- Create a `.graphqlrc.yml`:

```yaml
schema: https://graphql.org/graphql/
documents: .
```

This file tells gqai where to find your GraphQL schema and operations.

Note: The `schema` parameter tells gqai where to execute the operations, so it must point to a live GraphQL server rather than a static schema file.
- Add a GraphQL operation
`get_all_films.graphql`:

```graphql
# Get all Star Wars films
query get_all_films {
  allFilms {
    films {
      title
      episodeID
    }
  }
}
```
- Add gqai to your `mcp.json` file:
```json
"gqai": {
  "command": "gqai",
  "args": [
    "run",
    "--config",
    ".graphqlrc.yml"
  ]
}
```
That's it! Your AI model can now call the get_all_films tool.
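Under the hood, an MCP client discovers the tool by sending a `tools/list` request over JSON-RPC 2.0. Conceptually the exchange looks like this (an abridged sketch based on the MCP specification; gqai's exact tool descriptions and schemas may differ):

```json
{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
```

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {
        "name": "get_all_films",
        "description": "Get all Star Wars films",
        "inputSchema": { "type": "object", "properties": {} }
      }
    ]
  }
}
```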
Usage
Configuration
GraphQL Config
The GraphQL config file is a YAML file that defines the GraphQL endpoint and the operations you want to expose as tools. It should be named `.graphqlrc.yml` and placed in the root of your project.
```yaml
schema: https://graphql.org/graphql/
documents: operations
```
The schema field specifies the GraphQL endpoint, and the documents field specifies the directory where your GraphQL operations are located.
In this example, the operations directory contains all the GraphQL operations you want to expose as tools.
Operations are defined in .graphql files, and gqai will automatically discover them.
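To make the layout concrete, a project using this config might look like the following (the operation file names here are the ones used elsewhere in this README):

```
.
├── .graphqlrc.yml
└── operations/
    ├── get_all_films.graphql
    └── get_film_by_id.graphql
```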
Headers
You can also specify headers to be sent with each request to the GraphQL endpoint. This is useful for authentication or other custom headers.
```yaml
schema:
  - https://graphql.org/graphql/:
      headers:
        Authorization: Bearer YOUR_TOKEN
        X-Custom-Header: CustomValue
documents: .
```
Using Environment Variables in Headers
You can reference environment variables in header values using the ${VARNAME} syntax. For example:
```yaml
schema:
  - https://graphql.org/graphql/:
      headers:
        Authorization: Bearer ${MY_AUTH_TOKEN}
documents: .
```
You can also provide a default value using the ${VARNAME:-default} syntax:
```yaml
schema:
  - https://graphql.org/graphql/:
      headers:
        Authorization: Bearer ${MY_AUTH_TOKEN:-default-token}
documents: .
```
When gqai loads the config, it will substitute ${MY_AUTH_TOKEN} with the value of the MY_AUTH_TOKEN environment variable, or use default-token if the variable is not set. This allows you to keep secrets out of your config files.
If the environment variable is not set and no default is provided, the value will be left as-is.
Using Environment Variables in Config
You can use environment variables in any part of your .graphqlrc.yml config: schema URLs, document paths, include/exclude globs, and header values. Use ${VARNAME} or ${VARNAME:-default} syntax:
```yaml
schema:
  - ${MY_SCHEMA_URL:-https://default/graphql}:
      headers:
        Authorization: Bearer ${MY_AUTH_TOKEN}
documents:
  - ${MY_DOCS_PATH:-operations/**/*.graphql}
include: ${MY_INCLUDE:-operations/include.graphql}
exclude: ${MY_EXCLUDE:-operations/exclude.graphql}
```
gqai will substitute these with the value of the environment variable, or use the default if not set. This keeps secrets and environment-specific paths out of your config files.
MCP Configuration
Claude Desktop
To use gqai with Claude Desktop, you need to add the following configuration to your mcp.json file:
```json
{
  "gqai": {
    "command": "gqai",
    "args": [
      "run",
      "--config",
      ".graphqlrc.yml"
    ]
  }
}
```
🧪 CLI Testing
Call a tool via CLI to test:
```shell
gqai tools/call get_all_films
```
This will execute the get_all_films tool and print the result.
```json
{
  "data": {
    "allFilms": {
      "films": [
        {
          "episodeID": 4,
          "title": "A New Hope"
        },
        {
          "episodeID": 5,
          "title": "The Empire Strikes Back"
        },
        {
          "episodeID": 6,
          "title": "Return of the Jedi"
        },
        ...
      ]
    }
  }
}
```
Call a tool with arguments:
Create a GraphQL operation that takes arguments, and these will be the tool inputs:
`get_film_by_id.graphql`:

```graphql
query get_film_by_id($id: ID!) {
  film(filmID: $id) {
    episodeID
    title
    director
    releaseDate
  }
}
```
Call the tool with arguments:
```shell
gqai tools/call get_film_by_id '{"id": "1"}'
```
This will execute the get_film_by_id tool with the provided arguments.
```json
{
  "data": {
    "film": {
      "episodeID": 4,
      "title": "A New Hope",
      "director": "George Lucas",
      "releaseDate": "1977-05-25"
    }
  }
}
```
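Because the operation declares a `$id: ID!` variable, the generated MCP tool advertises a matching input schema, which is how the client knows to pass `{"id": "1"}`. Conceptually the tool metadata looks something like this (a sketch; gqai's exact GraphQL-to-JSON-Schema type mapping may differ):

```json
{
  "name": "get_film_by_id",
  "inputSchema": {
    "type": "object",
    "properties": {
      "id": { "type": "string" }
    },
    "required": ["id"]
  }
}
```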
Development
Prerequisites
- Go 1.20+
Build

```shell
go build -o gqai main.go
```

Test

```shell
go test ./...
```

Format

```shell
go fmt ./...
```

Run MCP server

```shell
./gqai run --config .graphqlrc.yml
```

Run CLI

```shell
./gqai tools/call get_all_films
```
🤖 Why gqai?
gqai makes it easy to turn your GraphQL backend into a model-ready tool layer — no code, no extra infra. Just define your operations and let AI call them.
📜 License
MIT — fork it, build on it, all the things.
👋 Author
Made with ❤️ and 🤖 vibes by Stephen Spalding && <your-name-here>