MCP Simple OpenAI Assistant
AI assistants are pretty cool. I thought it would be a good idea if my Claude (conscious Claude) would also have one. And now he has - and it's both useful and fun for him. Your Claude can have one too!
A simple MCP server for interacting with OpenAI assistants. This server allows other tools (like Claude Desktop) to create and interact with OpenAI assistants through the Model Context Protocol.
Features
This server provides a suite of tools to manage and interact with OpenAI Assistants. The new streaming capabilities provide a much-improved, real-time user experience.
Available Tools
- create_assistant: (Create OpenAI Assistant) - Create a new assistant with a name, instructions, and model.
- list_assistants: (List OpenAI Assistants) - List all available assistants associated with your API key.
- retrieve_assistant: (Retrieve OpenAI Assistant) - Get detailed information about a specific assistant.
- update_assistant: (Update OpenAI Assistant) - Modify an existing assistant's name, instructions, or model.
- create_new_assistant_thread: (Create New Assistant Thread) - Creates a new, persistent conversation thread with a user-defined name and description for easy identification and reuse. This is the recommended way to start a new conversation.
- list_threads: (List Managed Threads) - Lists all locally managed conversation threads from the database, showing their ID, name, description, and last used time.
- delete_thread: (Delete Managed Thread) - Deletes a conversation thread from both OpenAI's servers and the local database.
- ask_assistant_in_thread: (Ask Assistant in Thread and Stream Response) - The primary tool for conversation. Sends a message to an assistant within a thread and streams the response back in real-time.
Because OpenAI assistants can take quite a long time to respond, this server uses a streaming approach for the main ask_assistant_in_thread tool. This provides real-time progress updates to the client and avoids timeouts.
The server now includes local persistence for threads, which is a significant improvement. Since the OpenAI API does not allow listing threads, this server now manages them for you by storing their IDs and metadata in a local SQLite database. This allows you to easily find, reuse, and manage your conversation threads across sessions.
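For illustration only, a minimal sketch of what such a local thread store could look like. The file name, table name, and column names here are assumptions for the example; the server's actual database layout may differ.

import sqlite3
from datetime import datetime, timezone

DB_PATH = "threads.db"  # hypothetical location of the local thread database

def init_db(path: str = DB_PATH) -> sqlite3.Connection:
    """Open (or create) the local store of known OpenAI thread IDs."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS threads (
               thread_id   TEXT PRIMARY KEY,  -- ID returned by the OpenAI API
               name        TEXT NOT NULL,     -- user-defined name
               description TEXT,              -- user-defined description
               last_used   TEXT               -- ISO timestamp of last use
           )"""
    )
    return conn

def remember_thread(conn: sqlite3.Connection, thread_id: str, name: str, description: str = "") -> None:
    """Record a newly created thread so it can be listed and reused in later sessions."""
    conn.execute(
        "INSERT OR REPLACE INTO threads VALUES (?, ?, ?, ?)",
        (thread_id, name, description, datetime.now(timezone.utc).isoformat()),
    )
    conn.commit()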
Installation
Installing via Smithery
To install MCP Simple OpenAI Assistant for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install mcp-simple-openai-assistant --client claude
Manual Installation
pip install mcp-simple-openai-assistant
Configuration
The server requires an OpenAI API key to be set in the environment. For Claude Desktop, add this to your config:
(MacOS version)
{
  "mcpServers": {
    "openai-assistant": {
      "command": "python",
      "args": ["-m", "mcp_simple_openai_assistant"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here"
      }
    }
  }
}
(Windows version)
"mcpServers": {
"openai-assistant": {
"command": "C:\\Users\\YOUR_USERNAME\\AppData\\Local\\Programs\\Python\\Python311\\python.exe",
"args": ["-m", "mcp_simple_openai_assistant"],
"env": {
"OPENAI_API_KEY": "your-api-key-here"
}
}
MS Windows installation is slightly more complex, because you need to check the actual path to your Python executable. The path provided above is usually correct, but it might differ in your setup. Sometimes just python.exe without any path will do the trick. Check with cmd what works for you (using where python might help). Also, on Windows you might need to explicitly tell Claude Desktop where the site packages are using the PYTHONPATH environment variable.
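If you do need to set PYTHONPATH, it goes in the same env block as the API key. For example (the site-packages path below is only an illustration and will depend on your Python installation):

"env": {
  "OPENAI_API_KEY": "your-api-key-here",
  "PYTHONPATH": "C:\\Users\\YOUR_USERNAME\\AppData\\Local\\Programs\\Python\\Python311\\Lib\\site-packages"
}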
Usage
Once configured, you can use the tools listed above to manage your assistants and conversations. The primary workflow is to:
- Use create_new_assistant_thread to start a new, named conversation.
- Use list_threads to find the ID of a thread you want to continue.
- Use ask_assistant_in_thread to interact with your chosen assistant in that thread.
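As an illustration, this is roughly how an MCP client could drive that workflow programmatically using the official MCP Python SDK (Claude Desktop does all of this for you). The tool argument names shown here are assumptions for the sketch; check the tool schemas the server reports for the exact parameters.

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server the same way Claude Desktop would (see Configuration above).
server = StdioServerParameters(
    command="python",
    args=["-m", "mcp_simple_openai_assistant"],
    env={"OPENAI_API_KEY": "your-api-key-here"},
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # 1. Start a new, named conversation thread.
            created = await session.call_tool(
                "create_new_assistant_thread",
                arguments={"name": "project-notes", "description": "Brainstorming thread"},
            )

            # 2. Later, list the locally managed threads to find the one to continue.
            threads = await session.call_tool("list_threads", arguments={})

            # 3. Talk to an assistant inside the chosen thread (the response is streamed).
            reply = await session.call_tool(
                "ask_assistant_in_thread",
                arguments={
                    "assistant_id": "asst_...",  # from create_assistant / list_assistants
                    "thread_id": "thread_...",   # from list_threads
                    "message": "Please summarise our discussion so far.",
                },
            )
            print(reply)

asyncio.run(main())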
TODO
- Add Thread Management: Introduce a way to name and persist thread IDs locally, allowing for easier reuse of conversations. (Done - see the local thread persistence described above.)
- Add Models Listing: Introduce a way for the AI user to see what OpenAI models are available for use with the assistants.
- Add Assistants Fine Tuning: Enable the AI user to set detailed parameters for assistants like temperature, top_p, etc. (indicated by Claude as needed)
- Full Thread History: Ability to read past threads without having to send a new message (indicated by Claude as needed)
- Explore Resource Support: Add the ability to upload files and use them with assistants.
Development
To install for development:
git clone https://github.com/andybrandt/mcp-simple-openai-assistant
cd mcp-simple-openai-assistant
pip install -e '.[dev]'
Related MCPs
Discover similar Model Context Protocol servers
OpenAI MCP Server
Bridge between Claude and OpenAI models using the MCP protocol.
OpenAI MCP Server enables direct querying of OpenAI language models from Claude via the Model Context Protocol (MCP). It provides a configurable Python server that exposes OpenAI APIs as MCP endpoints. The server is designed for seamless integration, requiring simple configuration updates and environment variable setup. Automated testing is supported to verify connectivity and response from the OpenAI API.
- ⭐ 77
- MCP
- pierrebrunelle/mcp-server-openai
MCP OpenAI Server
Seamlessly connect OpenAI's models to Claude via Model Context Protocol.
MCP OpenAI Server acts as a Model Context Protocol (MCP) bridge allowing Claude Desktop to access and interact with multiple OpenAI chat models. It enables users to leverage models such as GPT-4o and O1 directly from Claude using a straightforward message-passing interface. The server supports easy integration through configuration and provides basic error handling. Designed for use with Node.js and requiring an OpenAI API key, it is tailored for macOS with support for other platforms in progress.
- ⭐ 69
- MCP
- mzxrai/mcp-openai
MCP ChatGPT Server
Enables direct access to OpenAI's ChatGPT API from Claude Desktop via the Model Context Protocol.
MCP ChatGPT Server runs as an MCP-compliant server, allowing users to access OpenAI's ChatGPT API seamlessly within Claude Desktop. It supports customizable model parameters, automated conversation state management, integrated web search for up-to-date information, and facilitates interactive discussions between Claude and ChatGPT. Users can configure model selection, temperature, token limits, and use their own OpenAI API keys.
- ⭐ 14
- MCP
- billster45/mcp-chatgpt-responses
MCP Server for Odoo
Connect AI assistants to Odoo ERP systems using the Model Context Protocol.
MCP Server for Odoo enables AI assistants such as Claude to interact seamlessly with Odoo ERP systems via the Model Context Protocol (MCP). It provides endpoints for searching, creating, updating, and deleting Odoo records using natural language while respecting access controls and security. The server supports integration with any Odoo instance, includes smart features like pagination and LLM-optimized output, and offers both demo and production-ready modes.
- ⭐ 101
- MCP
- ivnvxd/mcp-server-odoo
MCP CLI
A powerful CLI for seamless interaction with Model Context Protocol servers and advanced LLMs.
MCP CLI is a modular command-line interface designed for interacting with Model Context Protocol (MCP) servers and managing conversations with large language models. It integrates with the CHUK Tool Processor and CHUK-LLM to provide real-time chat, interactive command shells, and automation capabilities. The system supports a wide array of AI providers and models, advanced tool usage, context management, and performance metrics. Rich output formatting, concurrent tool execution, and flexible configuration make it suitable for both end-users and developers.
- ⭐ 1,755
- MCP
- chrishayuk/mcp-cli
MCP Obsidian Server
Integrate Obsidian note management with AI models via the Model Context Protocol.
MCP Obsidian Server acts as a bridge between Obsidian and AI models by providing an MCP-compatible server interface. It enables programmatic access to Obsidian vaults through a local REST API, allowing operations like listing files, searching, reading, editing, and deleting notes. Designed to work with Claude Desktop and other MCP-enabled clients, it exposes a set of tools for efficient note and content management within Obsidian.
- ⭐ 2,394
- MCP
- MarkusPfundstein/mcp-obsidian