Mattermost MCP Host
A Mattermost integration that connects to Model Context Protocol (MCP) servers, leveraging a LangGraph-based AI agent to provide an intelligent interface for interacting with users and executing tools directly within Mattermost.
Demo
1. GitHub agent in a support channel - searches existing issues and PRs, and creates a new issue if none is found
2. Search the internet and post to a channel using the Mattermost-MCP-server
Scroll down for the full demo on YouTube.
Features
- 🤖 LangGraph Agent Integration: Uses a LangGraph agent to understand user requests and orchestrate responses.
- 🔌 MCP Server Integration: Connects to multiple MCP servers defined in `mcp-servers.json`.
- 🛠️ Dynamic Tool Loading: Automatically discovers tools from connected MCP servers and makes them available to the AI agent by converting MCP tools into LangChain structured tools (see the sketch after this list).
- 💬 Thread-Aware Conversations: Maintains conversational context within Mattermost threads for coherent interactions.
- 🔄 Intelligent Tool Use: The AI agent decides when to use available tools (including chaining multiple calls) to fulfill user requests.
- 🔍 MCP Capability Discovery: Lets users list available servers, tools, resources, and prompts via direct commands.
- #️⃣ Direct Command Interface: Interact directly with MCP servers using a command prefix (default: `#`).
- ⚙️ Configurable LLM Backend: Supports multiple AI providers (Azure OpenAI by default; also OpenAI, Anthropic Claude, and Google Gemini) via environment variables.
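For illustration, the conversion behind the dynamic tool loading feature might look roughly like the following. This is a minimal sketch, not the repository's actual code: the helper name `to_langchain_tool` is hypothetical, and it assumes an already-connected `ClientSession` from the Python MCP SDK.

```python
# Hypothetical sketch: wrap one discovered MCP tool as a LangChain structured tool.
from langchain_core.tools import StructuredTool
from mcp import ClientSession


def to_langchain_tool(session: ClientSession, name: str, description: str) -> StructuredTool:
    async def call(**kwargs):
        # Forward the structured arguments to the MCP server and return its output.
        result = await session.call_tool(name, arguments=kwargs)
        return result.content

    # A full implementation would also translate the MCP tool's JSON input
    # schema into an args_schema so the agent knows the expected parameters.
    return StructuredTool.from_function(
        coroutine=call, name=name, description=description
    )
```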
Overview
The integration works as follows:
- Mattermost Connection (`mattermost_client.py`): Connects to the Mattermost server via the API and WebSocket to listen for messages in a specified channel.
- MCP Connections (`mcp_client.py`): Establishes connections (primarily `stdio`) to each MCP server defined in `src/mattermost_mcp_host/mcp-servers.json` and discovers the tools available on each server.
- Agent Initialization (`agent/llm_agent.py`): A `LangGraphAgent` is created, configured with the chosen LLM provider and the dynamically loaded tools from all connected MCP servers (sketched below).
- Message Handling (`main.py`):
  - If a message starts with the command prefix (`#`), it is parsed as a direct command to list servers/tools or call a specific tool via the corresponding `MCPClient`.
  - Otherwise, the message (along with thread history) is passed to the `LangGraphAgent`.
- Agent Execution: The agent processes the request, potentially calling one or more MCP tools via the `MCPClient` instances, and generates a response.
- Response Delivery: The final response from the agent or command execution is posted back to the appropriate Mattermost channel/thread.
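As a rough sketch of the agent-initialization step (not the repository's actual code), the dynamically loaded tools could be handed to a prebuilt LangGraph ReAct agent like so; the function and variable names here are illustrative:

```python
# Hypothetical sketch: build a LangGraph agent over tools gathered from MCP servers.
from langchain_openai import AzureChatOpenAI
from langgraph.prebuilt import create_react_agent


def build_agent(mcp_tools):
    # Endpoint, API key, and API version are read from the environment
    # (AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_KEY, ...), as configured in .env.
    model = AzureChatOpenAI(azure_deployment="gpt-4o")
    # The prebuilt ReAct agent decides when to call tools, including
    # chaining multiple tool calls before producing a final response.
    return create_react_agent(model, mcp_tools)
```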
Setup
1. Clone the repository:
```bash
git clone <repository-url>
cd mattermost-mcp-host
```
2. Install (using uv, recommended):
```bash
# Install uv if you don't have it yet
# curl -LsSf https://astral.sh/uv/install.sh | sh

# Activate the virtual environment
source .venv/bin/activate

# Install the package with uv
uv sync

# To install dev dependencies
uv sync --dev --all-extras
```
3. Configure the environment (`.env` file): Copy `.env.example` and fill in the values, or create a `.env` file in the project root (or set environment variables):
```env
# Mattermost details
MATTERMOST_URL=http://your-mattermost-url
MATTERMOST_TOKEN=your-bot-token  # Needs permissions to post, read channels, etc.
MATTERMOST_TEAM_NAME=your-team-name
MATTERMOST_CHANNEL_NAME=your-channel-name  # Channel for the bot to listen in
# MATTERMOST_CHANNEL_ID=  # Optional: auto-detected if the name is provided

# LLM configuration (Azure OpenAI is the default)
DEFAULT_PROVIDER=azure
AZURE_OPENAI_ENDPOINT=your-azure-endpoint
AZURE_OPENAI_API_KEY=your-azure-api-key
AZURE_OPENAI_DEPLOYMENT=your-deployment-name  # e.g., gpt-4o
# AZURE_OPENAI_API_VERSION=  # Optional, defaults provided

# Optional: other providers (install with the `[all]` extra)
# OPENAI_API_KEY=...
# ANTHROPIC_API_KEY=...
# GOOGLE_API_KEY=...

# Command prefix
COMMAND_PREFIX=#
```
See `.env.example` for more options.
4. Configure MCP servers: Edit `src/mattermost_mcp_host/mcp-servers.json` to define the MCP servers you want to connect to (see `src/mattermost_mcp_host/mcp-servers-example.json`, and the example after these steps). Depending on the server configuration, you may need `npx`, `uvx`, or `docker` installed on your system and in your PATH.
5. Start the integration:
```bash
mattermost-mcp-host
```
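For reference, a minimal `mcp-servers.json` might look like the example below. The exact schema is defined by the project, so treat this as a sketch and rely on `mcp-servers-example.json` as the authoritative format; the two entries shown use common public MCP servers:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    },
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}
```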
Prerequisites
- Python 3.13.1+
- uv package manager
- Mattermost server instance
- Mattermost bot account with an API token
- Access to an LLM API (Azure OpenAI by default)
Optional
- One or more MCP servers configured in `mcp-servers.json`
- Tavily web search requires `TAVILY_API_KEY` in the `.env` file
Usage in Mattermost
Once the integration is running and connected:
- Direct Chat: Simply chat in the configured channel or with the bot. The AI agent will respond, using tools as needed, and maintains context within message threads.
- Direct Commands: Use the command prefix (default `#`) for specific actions:
  - `#help` - Display help information.
  - `#servers` - List configured and connected MCP servers.
  - `#<server_name> tools` - List available tools for `<server_name>`.
  - `#<server_name> call <tool_name> <json_arguments>` - Call `<tool_name>` on `<server_name>` with arguments provided as a JSON string.
    - Example: `#my-server call echo '{"message": "Hello MCP!"}'`
  - `#<server_name> resources` - List available resources for `<server_name>`.
  - `#<server_name> prompts` - List available prompts for `<server_name>`.
Mattermost Setup
- Create a bot account
  - Go to Integrations > Bot Accounts > Add Bot Account
  - Give it a name and description
  - Save the access token in the `.env` file
- Required bot permissions
  - post_all
  - create_post
  - read_channel
  - create_direct_channel
  - read_user
- Add the bot to a team/channel
  - Invite the bot to your team
  - Add the bot to the desired channels
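Before starting the integration, you can sanity-check the bot token against the Mattermost REST API (this is a generic Mattermost API call, not part of the project itself):

```bash
# Returns the bot's user object if the token is valid and has read access.
curl -H "Authorization: Bearer $MATTERMOST_TOKEN" "$MATTERMOST_URL/api/v4/users/me"
```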
Troubleshooting
- Connection issues
  - Verify the Mattermost server is running
  - Check bot token permissions
  - Ensure correct team/channel names
- AI provider issues
  - Validate API keys
  - Check API quotas and limits
  - Verify network access to API endpoints
- MCP server issues
  - Check server logs
  - Verify server configurations
  - Ensure required dependencies are installed and environment variables are defined
Demos
Create an issue via chat using the GitHub MCP server (on YouTube)
Contributing
Please feel free to open a PR.
License
This project is licensed under the MIT License - see the LICENSE file for details.