Mattermost MCP Host

Connects Mattermost to MCP servers, enabling AI agent-powered tool orchestration within chat.

27 Stars · 16 Forks · 27 Watchers · 6 Issues
Mattermost MCP Host integrates the Model Context Protocol (MCP) with Mattermost, utilizing a LangGraph-based AI agent to facilitate seamless user interaction and dynamic tool execution directly in chat. It supports integration with multiple MCP servers, automatically discovers available tools, and allows users to issue commands or natural language requests. Conversational context is maintained in threads, and users can interact directly with MCP servers to manage resources and capabilities.

Key Features

Connects to multiple MCP servers for tool and resource access
Integrates LangGraph AI agent for intelligent request handling
Automatic tool discovery and dynamic loading from MCP servers
Maintains conversation thread context within Mattermost
Supports both direct command and natural language interaction
Lists available servers, tools, resources, and prompts on request
Executes chained tool calls as needed by the AI agent
Posts agent responses and tool outputs back to Mattermost
Configurable via environment variables and JSON files
Direct control through command prefix usage

Use Cases

Automated issue and pull request management in GitHub via Mattermost
Connecting team chat with diverse MCP-powered AI tools
Discovering and executing custom tools from multiple servers
Providing intelligent, context-aware chatbot support within Mattermost
Managing resources and capabilities across MCP servers via chat
Extending organizational workflows with custom AI capabilities
Interacting with and chaining multiple tools in response to user queries
Hosting dynamic AI agents that respond based on conversation history
Facilitating direct command-driven access to server-side automation
Enabling seamless integration of enterprise AI tools in team communication

README

Mattermost MCP Host

A Mattermost integration that connects to Model Context Protocol (MCP) servers, leveraging a LangGraph-based AI agent to provide an intelligent interface for interacting with users and executing tools directly within Mattermost.


Demo

1. GitHub agent in a support channel: searches existing issues and PRs and creates a new issue if none is found


2. Searching the internet and posting the results to a channel using Mattermost-MCP-server


Scroll down for the full demo on YouTube.

Features

  • 🤖 LangGraph Agent Integration: Uses a LangGraph agent to understand user requests and orchestrate responses.
  • 🔌 MCP Server Integration: Connects to multiple MCP servers defined in mcp-servers.json.
  • 🛠️ Dynamic Tool Loading: Automatically discovers tools from connected MCP servers and makes them available to the AI agent, converting MCP tools to LangChain structured tools.
  • 💬 Thread-Aware Conversations: Maintains conversational context within Mattermost threads for coherent interactions.
  • 🔄 Intelligent Tool Use: The AI agent can decide when to use available tools (including chaining multiple calls) to fulfill user requests.
  • 🔍 MCP Capability Discovery: Allows users to list available servers, tools, resources, and prompts via direct commands.
  • #️⃣ Direct Command Interface: Interact directly with MCP servers using a command prefix (default: #).
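
The thread-aware behaviour above boils down to keeping per-thread message history. A minimal sketch of that idea in plain Python; the names (`record_message`, `context_for`) are illustrative, not the project's actual API:

```python
from collections import defaultdict

# Per-thread history: thread root id -> list of (role, text) messages.
# Illustrative only; the real integration manages history via LangGraph.
thread_history: defaultdict[str, list[tuple[str, str]]] = defaultdict(list)

def record_message(thread_id: str, role: str, text: str) -> None:
    """Append one message to the history of its thread."""
    thread_history[thread_id].append((role, text))

def context_for(thread_id: str) -> list[tuple[str, str]]:
    """Everything said in this thread so far, passed to the agent as context."""
    return list(thread_history[thread_id])
```

Keying on the thread root id is what lets separate Mattermost threads hold independent conversations with the bot.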

Overview

The integration works as follows:

  1. Mattermost Connection (mattermost_client.py): Connects to the Mattermost server via API and WebSocket to listen for messages in a specified channel.
  2. MCP Connections (mcp_client.py): Establishes connections (primarily stdio) to each MCP server defined in src/mattermost_mcp_host/mcp-servers.json. It discovers available tools on each server.
  3. Agent Initialization (agent/llm_agent.py): A LangGraphAgent is created, configured with the chosen LLM provider and the dynamically loaded tools from all connected MCP servers.
  4. Message Handling (main.py):
    • If a message starts with the command prefix (#), it's parsed as a direct command to list servers/tools or call a specific tool via the corresponding MCPClient.
    • Otherwise, the message (along with thread history) is passed to the LangGraphAgent.
  5. Agent Execution: The agent processes the request, potentially calling one or more MCP tools via the MCPClient instances, and generates a response.
  6. Response Delivery: The final response from the agent or command execution is posted back to the appropriate Mattermost channel/thread.
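
The routing in step 4 can be sketched in a few lines of Python; `handle_command` and `run_agent` here are hypothetical stand-ins for the project's real MCP-command and LangGraph-agent paths, so only the dispatch decision is shown:

```python
# Minimal sketch of the message-routing logic described above (assumed names).
COMMAND_PREFIX = "#"

def handle_command(text: str) -> str:
    # Stand-in for the direct-command path (list servers/tools, call a tool).
    return f"command: {text}"

def run_agent(text: str, thread_history: list[str]) -> str:
    # Stand-in for the LangGraph agent invocation with thread context.
    return f"agent reply to: {text} ({len(thread_history)} prior messages)"

def dispatch(message: str, thread_history: list[str]) -> str:
    if message.startswith(COMMAND_PREFIX):
        # Direct command path: strip the prefix and hand off to the MCP client.
        return handle_command(message[len(COMMAND_PREFIX):].strip())
    # Natural-language path: pass the message plus thread history to the agent.
    return run_agent(message, thread_history)
```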

Setup

  1. Clone the repository:

    bash
    git clone <repository-url>
    cd mattermost-mcp-host
    
  2. Install:

    • Using uv (recommended):
      bash
      # Install uv if you don't have it yet
      # curl -LsSf https://astral.sh/uv/install.sh | sh

      # Install the package with uv (this creates .venv)
      uv sync

      # Activate the venv
      source .venv/bin/activate

      # To install dev dependencies
      uv sync --dev --all-extras
      
  3. Configure Environment (.env file): Copy .env.example to .env in the project root and fill in the values (or set the equivalent environment variables):

    env
    # Mattermost Details
    MATTERMOST_URL=http://your-mattermost-url
    MATTERMOST_TOKEN=your-bot-token # Needs permissions to post, read channel, etc.
    MATTERMOST_TEAM_NAME=your-team-name
    MATTERMOST_CHANNEL_NAME=your-channel-name # Channel for the bot to listen in
    # MATTERMOST_CHANNEL_ID= # Optional: Auto-detected if name is provided
    
    # LLM Configuration (Azure OpenAI is default)
    DEFAULT_PROVIDER=azure
    AZURE_OPENAI_ENDPOINT=your-azure-endpoint
    AZURE_OPENAI_API_KEY=your-azure-api-key
    AZURE_OPENAI_DEPLOYMENT=your-deployment-name # e.g., gpt-4o
    # AZURE_OPENAI_API_VERSION= # Optional, defaults provided
    
    # Optional: Other providers (install with `[all]` extra)
    # OPENAI_API_KEY=...
    # ANTHROPIC_API_KEY=...
    # GOOGLE_API_KEY=...
    
    # Command Prefix
    COMMAND_PREFIX=# 
    

    See .env.example for more options.

  4. Configure MCP Servers: Edit src/mattermost_mcp_host/mcp-servers.json to define the MCP servers you want to connect to (see src/mattermost_mcp_host/mcp-servers-example.json). Depending on the server configuration, you may need npx, uvx, or docker installed on your system and available in your PATH.

  5. Start the Integration:

    bash
    mattermost-mcp-host
    
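
As an illustration of step 4, an entry for a stdio server launched via npx might look like the following. The field names follow the common MCP client convention, but the exact schema here is an assumption; check src/mattermost_mcp_host/mcp-servers-example.json for the authoritative format:

```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "your-token"
      }
    }
  }
}
```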

Prerequisites

  • Python 3.13.1+
  • uv package manager
  • Mattermost server instance
  • Mattermost Bot Account with API token
  • Access to an LLM API (Azure OpenAI by default)

Optional

  • One or more MCP servers configured in mcp-servers.json
  • Tavily web search requires TAVILY_API_KEY in the .env file

Usage in Mattermost

Once the integration is running and connected:

  1. Direct Chat: Simply chat in the configured channel or with the bot. The AI agent will respond, using tools as needed. It maintains context within message threads.
  2. Direct Commands: Use the command prefix (default #) for specific actions:
    • #help - Display help information.
    • #servers - List configured and connected MCP servers.
    • #<server_name> tools - List available tools for <server_name>.
    • #<server_name> call <tool_name> <json_arguments> - Call <tool_name> on <server_name> with arguments provided as a JSON string.
      • Example: #my-server call echo '{"message": "Hello MCP!"}'
    • #<server_name> resources - List available resources for <server_name>.
    • #<server_name> prompts - List available prompts for <server_name>.
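
A direct command like the example above can be split into its parts with a few lines of Python; this is an illustrative sketch, not the project's actual parser:

```python
import json
import shlex

def parse_command(message: str, prefix: str = "#") -> dict:
    """Split '#<server> call <tool> <json_args>' into parts (illustrative only)."""
    # shlex keeps the quoted JSON argument together as one token.
    parts = shlex.split(message[len(prefix):])
    server = parts[0]
    action = parts[1] if len(parts) > 1 else "help"
    result = {"server": server, "action": action}
    if action == "call" and len(parts) >= 4:
        result["tool"] = parts[2]
        # Arguments arrive as a single JSON string, e.g. '{"message": "Hello MCP!"}'.
        result["args"] = json.loads(parts[3])
    return result
```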

Next Steps

  • ⚙️ Configurable LLM Backend: Supports multiple AI providers (Azure OpenAI default, OpenAI, Anthropic Claude, Google Gemini) via environment variables.

Mattermost Setup

  1. Create a Bot Account
  • Go to Integrations > Bot Accounts > Add Bot Account
  • Give it a name and description
  • Save the access token in the .env file
  2. Required Bot Permissions
  • post_all
  • create_post
  • read_channel
  • create_direct_channel
  • read_user
  3. Add Bot to Team/Channel
  • Invite the bot to your team
  • Add bot to desired channels

Troubleshooting

  1. Connection Issues
  • Verify the Mattermost server is running
  • Check bot token permissions
  • Ensure correct team/channel names
  2. AI Provider Issues
  • Validate API keys
  • Check API quotas and limits
  • Verify network access to API endpoints
  3. MCP Server Issues
  • Check server logs
  • Verify server configurations
  • Ensure required dependencies are installed and env variables are defined

Demos

Create an issue via chat using the GitHub MCP server

(on YouTube)

AI Agent in Action in Mattermost

Contributing

Please feel free to open a PR.

License

This project is licensed under the MIT License - see the LICENSE file for details.


Repository Owner

Repository Details

Language Python
Default Branch main
Size 125,449 KB
Contributors 1
License MIT License
MCP Verified Nov 11, 2025

Programming Languages

Python
99.73%
Dockerfile
0.27%

Tags

Topics

langgraph llm mattermost mcp mcp-clients


Related MCPs

Discover similar Model Context Protocol servers

  • MCP CLI

    MCP CLI

    A powerful CLI for seamless interaction with Model Context Protocol servers and advanced LLMs.

    MCP CLI is a modular command-line interface designed for interacting with Model Context Protocol (MCP) servers and managing conversations with large language models. It integrates with the CHUK Tool Processor and CHUK-LLM to provide real-time chat, interactive command shells, and automation capabilities. The system supports a wide array of AI providers and models, advanced tool usage, context management, and performance metrics. Rich output formatting, concurrent tool execution, and flexible configuration make it suitable for both end-users and developers.

    • 1,755
    • MCP
    • chrishayuk/mcp-cli
  • GitHub MCP Server

    GitHub MCP Server

    Connect AI tools directly to GitHub for repository, issue, and workflow management via natural language.

    GitHub MCP Server enables AI tools such as agents, assistants, and chatbots to interact natively with the GitHub platform. It allows these tools to access repositories, analyze code, manage issues and pull requests, and automate workflows using the Model Context Protocol (MCP). The server supports integration with multiple hosts, including VS Code and other popular IDEs, and can operate both remotely and locally. Built for developers seeking to enhance AI-powered development workflows through seamless GitHub context access.

    • 24,418
    • MCP
    • github/github-mcp-server
  • mcp-memgraph

    mcp-memgraph

    Expose Memgraph database features via the Model Context Protocol.

    mcp-memgraph provides an MCP (Model Context Protocol) server implementation, enabling Memgraph tools to be accessed over a lightweight STDIO protocol. It supports seamless integration with AI frameworks by standardizing context and communication for data-driven AI workflows. The toolkit is part of a larger suite for extending Memgraph with AI-powered capabilities, including tools for LangChain integration and automated database migration. Tested packages and usage examples are provided for quick adoption.

    • 52
    • MCP
    • memgraph/ai-toolkit
  • LINE Bot MCP Server

    LINE Bot MCP Server

    MCP server connecting LINE Messaging API with AI agents

    Provides a Model Context Protocol (MCP) server implementation for integrating AI agents with the LINE Messaging API. Enables sending text and flex messages, accessing user profiles, and managing features like rich menus via MCP-compatible endpoints. Designed for connecting AI-driven context management with LINE Official Accounts for experimental and production scenarios.

    • 493
    • MCP
    • line/line-bot-mcp-server
  • Klavis

    Klavis

    One MCP server for AI agents to handle thousands of tools.

    Klavis provides an MCP (Model Context Protocol) server with over 100 prebuilt integrations for AI agents, enabling seamless connectivity with various tools and services. It offers both cloud-hosted and self-hosted deployment options and includes out-of-the-box OAuth support for secure authentication. Klavis is designed to act as an intelligent connector, streamlining workflow automation and enhancing agent capability through standardized context management.

    • 5,447
    • MCP
    • Klavis-AI/klavis
  • Insforge MCP Server

    Insforge MCP Server

    A Model Context Protocol server for seamless integration with Insforge and compatible AI clients.

    Insforge MCP Server implements the Model Context Protocol (MCP), enabling smooth integration with various AI tools and clients. It allows users to configure and manage connections to the Insforge platform, providing automated and manual installation methods. The server supports multiple AI clients such as Claude Code, Cursor, Windsurf, Cline, Roo Code, and Trae via standardized context management. Documentation and configuration guidelines are available for further customization and usage.

    • 3
    • MCP
    • InsForge/insforge-mcp