MCP ChatGPT Server
Enables direct access to OpenAI's ChatGPT API from Claude Desktop via the Model Context Protocol.
README
MCP ChatGPT Server
This MCP server allows you to access OpenAI's ChatGPT API directly from Claude Desktop.
📝 Read about why I built this project: I Built an AI That Talks to Other AIs: Demystifying the MCP Hype
Features
- Call the ChatGPT API with customisable parameters
- Ask Claude and ChatGPT to talk to each other in a long-running discussion!
- Configure model versions, temperature, and other parameters
- Use web search to get up-to-date information from the internet
- Uses OpenAI's Responses API for automatic conversation state management
- Use your own OpenAI API key
Setup Instructions
Installing via Smithery
To install ChatGPT Server for Claude Desktop automatically via Smithery:
```bash
npx -y @smithery/cli install @billster45/mcp-chatgpt-responses --client claude
```
Prerequisites
- Python 3.10 or higher
- Claude Desktop application
- OpenAI API key
- uv for Python package management
Installation
- Clone this repository:

```bash
git clone https://github.com/billster45/mcp-chatgpt-responses.git
cd mcp-chatgpt-responses
```

- Set up a virtual environment and install dependencies using uv:

```bash
uv venv
.venv\Scripts\activate
uv pip install -r requirements.txt
```
Using with Claude Desktop
- Configure Claude Desktop to use this MCP server by following the instructions at: MCP Quickstart Guide

- Add the following configuration to your Claude Desktop config file (adjust paths as needed):

```json
{
  "mcpServers": {
    "chatgpt": {
      "command": "uv",
      "args": [
        "--directory",
        "\\path\\to\\mcp-chatgpt-responses",
        "run",
        "chatgpt_server.py"
      ],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here",
        "DEFAULT_MODEL": "gpt-4o",
        "DEFAULT_TEMPERATURE": "0.7",
        "MAX_TOKENS": "1000"
      }
    }
  }
}
```

- Restart Claude Desktop.
- You can now use the ChatGPT API through Claude by asking questions that mention ChatGPT or that Claude might not be able to answer.
Available Tools
The MCP server provides the following tools:
- ask_chatgpt(prompt, model, temperature, max_output_tokens, response_id) - Send a prompt to ChatGPT and get a response
- ask_chatgpt_with_web_search(prompt, model, temperature, max_output_tokens, response_id) - Send a prompt to ChatGPT with web search enabled to get up-to-date information
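For a sense of how these tools map onto the underlying API, here is a minimal sketch of an ask_chatgpt-style tool built with the MCP Python SDK (FastMCP) and OpenAI's Responses API. It is illustrative only, not the repository's actual chatgpt_server.py; the environment variable names simply mirror the configuration shown above.

```python
# Illustrative sketch only -- not the repository's chatgpt_server.py.
import os

from mcp.server.fastmcp import FastMCP
from openai import OpenAI

mcp = FastMCP("chatgpt")
client = OpenAI()  # reads OPENAI_API_KEY from the environment


@mcp.tool()
def ask_chatgpt(
    prompt: str,
    model: str = os.getenv("DEFAULT_MODEL", "gpt-4o"),
    temperature: float = float(os.getenv("DEFAULT_TEMPERATURE", "0.7")),
    max_output_tokens: int = int(os.getenv("MAX_TOKENS", "1000")),
    response_id: str | None = None,
) -> str:
    """Send a prompt to ChatGPT; pass response_id to continue a conversation."""
    kwargs = {}
    if response_id:
        # Chain this call onto an earlier response so OpenAI keeps the context.
        kwargs["previous_response_id"] = response_id

    response = client.responses.create(
        model=model,
        input=prompt,
        temperature=temperature,
        max_output_tokens=max_output_tokens,
        **kwargs,
    )
    # Return the text plus the id the caller needs for the next turn.
    return f"{response.output_text}\n\n(response_id: {response.id})"


if __name__ == "__main__":
    mcp.run()
```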
Example Usage
Basic ChatGPT usage:
Tell Claude to ask ChatGPT a question!
Use the ask_chatgpt tool to answer: What is the best way to learn Python?
Tell Claude to have a conversation with ChatGPT:
Use the ask_chatgpt tool to have a two-way conversation between you and ChatGPT about the topic that is most important to you.
Note how, in a turn-taking conversation, the response id lets ChatGPT store the history of the conversation, so it's a genuine conversation and not just a series of API calls. This is called conversation state.
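As a rough illustration of that chaining (a sketch using the OpenAI Python SDK's Responses API, not the repository's exact code), each turn simply passes the previous turn's id back in as previous_response_id:

```python
# Sketch of conversation-state chaining with the Responses API (illustrative).
from openai import OpenAI

client = OpenAI()

first = client.responses.create(
    model="gpt-4o",
    input="Suggest a topic we could debate.",
)

# The second call references the first response's id, so OpenAI supplies the
# prior context server-side -- no manual history management is needed.
second = client.responses.create(
    model="gpt-4o",
    input="Take the opposing side of that topic.",
    previous_response_id=first.id,
)

print(second.output_text)
```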
With web search:
For questions that may benefit from up-to-date information:
Use the ask_chatgpt_with_web_search tool to answer: What are the latest developments in quantum computing?
Now try web search in an agentic way to plan your perfect day out based on the weather!
Use the ask_chatgpt_with_web_search tool to find the weather tomorrow in New York, then, based on what it returns, keep using the tool to build up a great day out for someone who loves food and parks.
How It Works
This tool utilizes OpenAI's Responses API, which automatically maintains conversation state on OpenAI's servers. This approach:
- Simplifies code by letting OpenAI handle the conversation history
- Provides more reliable context tracking
- Improves the user experience by maintaining context across messages
- Allows access to the latest information from the web with the web search tool
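As a hedged sketch of how the web search variant could enable OpenAI's built-in web search tool in a Responses API call (illustrative only; the tool type name follows OpenAI's documentation and may differ from the repository's code):

```python
# Illustrative sketch: enabling OpenAI's built-in web search in a Responses call.
from openai import OpenAI

client = OpenAI()

response = client.responses.create(
    model="gpt-4o",
    tools=[{"type": "web_search_preview"}],  # built-in web search tool
    input="What are the latest developments in quantum computing?",
)

print(response.output_text)
```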
License
MIT License
Related MCPs
Discover similar Model Context Protocol servers
MCP OpenAI Server
Seamlessly connect OpenAI's models to Claude via Model Context Protocol.
MCP OpenAI Server acts as a Model Context Protocol (MCP) bridge allowing Claude Desktop to access and interact with multiple OpenAI chat models. It enables users to leverage models such as GPT-4o and O1 directly from Claude using a straightforward message-passing interface. The server supports easy integration through configuration and provides basic error handling. Designed for use with Node.js and requiring an OpenAI API key, it is tailored for macOS with support for other platforms in progress.
- ⭐ 69
- MCP
- mzxrai/mcp-openai
OpenAI MCP Server
Bridge between Claude and OpenAI models using the MCP protocol.
OpenAI MCP Server enables direct querying of OpenAI language models from Claude via the Model Context Protocol (MCP). It provides a configurable Python server that exposes OpenAI APIs as MCP endpoints. The server is designed for seamless integration, requiring simple configuration updates and environment variable setup. Automated testing is supported to verify connectivity and response from the OpenAI API.
- ⭐ 77
- MCP
- pierrebrunelle/mcp-server-openai
any-chat-completions-mcp
Integrate multiple AI chat providers with OpenAI-compatible MCP server.
any-chat-completions-mcp is a TypeScript-based server implementing the Model Context Protocol (MCP) to connect popular AI chat providers like OpenAI, Perplexity, Groq, xAI, and PyroPrompts via a unified interface. It relays chat/completion requests to any OpenAI SDK-compatible API, allowing users to easily access multiple AI services through the same standardized protocol. The server can be configured for different providers by setting environment variables and integrates with both Claude Desktop and LibreChat.
- ⭐ 143
- MCP
- pyroprompts/any-chat-completions-mcp
Perplexity MCP Server
MCP Server integration for accessing the Perplexity API with context-aware chat completion.
Perplexity MCP Server provides a Model Context Protocol (MCP) compliant server that interfaces with the Perplexity API, enabling chat completion with citations. Designed for seamless integration with clients such as Claude Desktop, it allows users to send queries and receive context-rich responses from Perplexity. Environment configuration for API key management is supported, and limitations with long-running requests are noted. Future updates are planned to enhance support for client progress reporting.
- ⭐ 85
- MCP
- tanigami/mcp-server-perplexity
gqai
Expose GraphQL operations as Model Context Protocol (MCP) tools for AI models.
gqai is a lightweight proxy that converts GraphQL operations into MCP-compatible tools, enabling integration with AI systems such as ChatGPT, Claude, and Cursor. It automatically discovers and exposes GraphQL queries and mutations as callable tools via an MCP server, powered by your existing GraphQL backend. Configuration is managed via standard .graphqlrc.yml and .graphql files, with support for dynamic endpoints and environment variables.
- ⭐ 21
- MCP
- fotoetienne/gqai
mcp-graphql
Enables LLMs to interact dynamically with GraphQL APIs via Model Context Protocol.
mcp-graphql provides a Model Context Protocol (MCP) server that allows large language models to discover and interact with GraphQL APIs. The implementation facilitates schema introspection, exposes the GraphQL schema as a resource, and enables secure query and mutation execution based on configuration. It supports configuration through environment variables, automated or manual installation options, and offers flexibility in using local or remote schema files. By default, mutation operations are disabled for security, but can be enabled if required.
- ⭐ 319
- MCP
- blurrah/mcp-graphql