MCP Notify Server

Send desktop notifications with sound for completed agent/LLM tasks using the MCP protocol.

39 Stars · 9 Forks · 39 Watchers · 1 Issue
MCP Notify Server provides system desktop notifications and sound alerts when AI agent tasks or LLM jobs finish. Built on the standard Model Context Protocol, it is cross-platform and designed for smooth integration with clients like Claude Desktop, Cursor, and VSCode. This tool leverages the Apprise API for notification delivery and includes configuration details for easy adoption across popular AI/LLM development environments.

Key Features

Sends system desktop notifications upon agent or LLM task completion
Plays sound effects with notifications
Works on Windows, macOS, and Linux platforms
Integrates with major LLM/AI agent clients (Claude Desktop, Cursor, VSCode)
Follows Model Context Protocol standards
Uses Apprise API for notification delivery
Configurable via CLI and project files
Supports additional debug and log file options
Provides easy installation through PyPI and uv/uvx
Offers multilingual documentation (English and Chinese)

Use Cases

Alerting developers when automated LLM agent workflows finish in their IDE
Providing real-time task completion notifications on the desktop when using AI tools
Enhancing productivity by minimizing the need to check task status manually
Integrating notifications into custom LLM agent pipelines
Supporting AI-driven code editors with automated contextual alerts
Improving collaboration by notifying team members upon programmatic task completion
Customizing notification delivery (sound, badge) for accessibility
Keeping users informed about long-running AI processes
Automating reminders or notifications in conversational AI applications
Using with various platforms such as VSCode, Cursor, or Claude Desktop for seamless workflow integration

README


📢 MCP Notify Server

English | 中文

An MCP server that sends desktop notifications with sound effects when agent tasks are completed.

🥩 Features

  • Send system desktop notifications after agent task completion
  • Play alert sounds to grab user attention, with a bundled sound file
  • Cross-platform support (Windows, macOS, Linux)
  • Based on standard MCP protocol, integrates with various LLM clients

⏬ Installation

Install using the uv package manager:

bash
git clone https://github.com/Cactusinhand/mcp_server_notify.git
cd mcp_server_notify

uv venv
source .venv/Scripts/activate   # Windows (Git Bash); on macOS/Linux use: source .venv/bin/activate

uv pip install mcp-server-notify
# or
pip install mcp-server-notify

After installation, call the module directly to check if installation was successful:

bash
python -m mcp_server_notify

The module accepts the --debug and --log-file options, for example:

shell
python -m mcp_server_notify --debug
python -m mcp_server_notify --debug --log-file=path/to/logfile.log

⚠️❕ Special requirements

**We use the Apprise API for desktop notification delivery, so a few platform-specific requirements need to be installed first.** A minimal Apprise sketch follows the platform notes below.

Windows

shell
# windows:// minimum requirements
pip install pywin32

macOS

shell
# Make sure terminal-notifier is installed into your system
brew install terminal-notifier
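
For reference, the snippet below is a minimal sketch of how a desktop notification can be sent through the Apprise library this server relies on. The scheme-to-OS mapping and the title/body text are illustrative assumptions, not code taken from this project.

python
import platform

import apprise  # pip install apprise

# Assumed scheme per OS: windows:// needs pywin32, macosx:// needs
# terminal-notifier, dbus:// uses the Linux D-Bus notification service.
SCHEMES = {
    "Windows": "windows://",
    "Darwin": "macosx://",
    "Linux": "dbus://",
}

notifier = apprise.Apprise()
notifier.add(SCHEMES.get(platform.system(), "dbus://"))

# notify() returns True when the notification was delivered successfully.
notifier.notify(
    title="Agent task finished",
    body="Your LLM task has completed.",
)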

📚 Usage

Using with Claude Desktop:

Find the configuration file claude_desktop_config.json

json
{
    "mcpServers": {
        "NotificationServer": {
            "command": "uv",
            "args": [
              "--directory",
              "path/to/your/mcp_server_notify project",
              "run",
              "mcp-server-notify",
            ]
        }
    }
}

If installed globally, you can also use the python command:

json
{
    "mcpServers": {
        "NotificationServer": {
            "command": "python",
            "args": [
              "-m",
              "mcp_server_notify",
            ]
        }
    }
}

⚡️ Using with Cursor:

Find the configuration file ~/.cursor/mcp.json or your_project/.cursor/mcp.json

json
{
    "mcpServers": {
        "NotificationServer": {
            "command": "uv",
            "args": [
              "--directory",
              "path/to/your/mcp_server_notify project",
              "run",
              "mcp-server-notify",
            ]
        }
    }
}

After configuration, simply add a prompt like "finally, send me a notification when the task is finished." at the end of your task input to trigger a notification.

In Cursor, you can add this prompt as a rule in Cursor Settings -> Rules so you don't have to type it manually each time.
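
For example, a rule along these lines (the wording is illustrative, not prescribed by the project) saves retyping the prompt:

text
When you finish the task I asked for, send me a desktop notification using the notification tool before ending your reply.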

⚡️ Using with VSCode + Copilot:

  1. Install the uv package manager (which provides uvx): pip install uv

  2. Add the service to VSCode settings:

    Windows %APPDATA%\Code\User\settings.json
    macOS $HOME/Library/Application\ Support/Code/User/settings.json
    Linux $HOME/.config/Code/User/settings.json

    json
    "mcp": {
        "servers": {
            "notifier": {
                "command": "uvx",
                "args": [
                    "mcp-server-notify"
                ],
                "env": {}
            }
        }
    }
    
  3. Make sure you are using the latest VSCode version — it automatically runs MCP services

  4. Open VSCode → enable Copilot → switch to agent mode.

  5. Type # → you will see the #send_notification option.

  6. Ask the agent: run #send_notification (it will handle the notification automatically).

  7. Now Copilot in agent mode can send desktop notifications; a standalone stdio test sketch follows below.
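
Outside of an IDE, the server can also be exercised directly over stdio with the official MCP Python SDK. The sketch below is an assumption-laden example: the tool name send_notification matches what the Copilot agent surfaces above, but the argument names (title, message) are guesses, so check the list_tools() output for the real schema.

python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the notify server as a stdio subprocess (assumes the package is
    # installed in the current environment, e.g. pip install mcp-server-notify).
    params = StdioServerParameters(command="python", args=["-m", "mcp_server_notify"])

    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Inspect the tools the server actually exposes.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Hypothetical arguments; the real input schema may differ.
            await session.call_tool(
                "send_notification",
                arguments={"title": "Task finished", "message": "Your agent task is done."},
            )


asyncio.run(main())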

🐳 Running with Docker

Not currently available due to environment compatibility issues. Having a Docker container trigger notifications on the host, regardless of whether the host OS is Windows, macOS, or Linux, is considerably more complex, and using native notifications directly from a container is usually not feasible.

Main issues:

  1. OS-specific notification systems: each operating system (Windows, macOS, Linux) has its own notification mechanism.

  2. Docker isolation: container isolation limits direct access to host operating system resources.

  3. Dependency management: different notification libraries and dependencies must be handled for each operating system.

🧾 License

MIT

💻 Contributions

Issues and pull requests are welcome!


Repository Owner

Cactusinhand

Repository Details

Language: Python
Default Branch: main
Size: 73 KB
Contributors: 5
License: MIT License
MCP Verified: Nov 11, 2025

Programming Languages

Python: 88.46%
Dockerfile: 11.54%

Topics

mcp, mcp-server, notification


Related MCPs

Discover similar Model Context Protocol servers

  • ntfy-mcp

    Task completion notifications for AI assistants via ntfy.

    ntfy-mcp integrates with the Model Context Protocol to provide real-time notifications via ntfy when AI assistants complete tasks. It serves as an MCP server, keeping users informed without needing to constantly monitor progress. The application supports flexible configuration, is built in Node.js, and works seamlessly with the ntfy mobile app for simple, customizable push alerts.

    • 43
    • MCP
    • teddyzxcv/ntfy-mcp
  • ntfy-me-mcp

    Send real-time notifications from AI assistants to your devices using ntfy and the Model Context Protocol.

    ntfy-me-mcp implements a streamlined Model Context Protocol (MCP) server, enabling AI assistants to send real-time notifications to user devices through the ntfy service. It supports both public and self-hosted ntfy instances with token authentication, allowing integration into various AI workflows. The server automatically detects URLs for interactive actions and applies smart markdown formatting to craft rich notifications suitable for important events and task updates.

    • 48
    • MCP
    • gitmotion/ntfy-me-mcp
  • mcp-installer

    Automated installer for MCP servers across multiple languages.

    mcp-installer provides a server that automates the installation of other Model Context Protocol (MCP) servers. It supports installation of MCP servers hosted on npm and PyPi by leveraging tools like npx and uv. The tool is designed to integrate with AI assistants like Claude, enabling users to request remote installations of MCP servers with custom arguments and environment configurations. Its primary goal is to simplify the deployment and management of MCP-compliant servers for various workflows.

    • 1,457
    • MCP
    • anaisbetts/mcp-installer
  • interactive-mcp

    Enable interactive, local communication between LLMs and users via MCP.

    interactive-mcp implements a Model Context Protocol (MCP) server in Node.js/TypeScript, allowing Large Language Models (LLMs) to interact directly with users on their local machine. It exposes tools for requesting user input, sending notifications, and managing persistent command-line chat sessions, facilitating real-time communication. Designed for integration with clients like Claude Desktop and VS Code, it operates locally to access OS-level notifications and command prompts. The project is suited for interactive workflows where LLMs require user involvement or confirmation.

    • 313
    • MCP
    • ttommyth/interactive-mcp
  • Daisys MCP server

    A beta server implementation for the Model Context Protocol supporting audio context with Daisys integration.

    Daisys MCP server provides a beta implementation of the Model Context Protocol (MCP), enabling seamless integration between the Daisys AI platform and various MCP clients. It allows users to connect MCP-compatible clients to Daisys by configurable authentication and environment settings, with out-of-the-box support for audio file storage and playback. The server is designed to be extensible, including support for both user-level deployments and developer contributions, with best practices for secure authentication and dependency management.

    • 10
    • MCP
    • daisys-ai/daisys-mcp
  • MCP Language Server

    Bridge codebase navigation tools to AI models using MCP-enabled language servers.

    MCP Language Server implements the Model Context Protocol, allowing MCP-enabled clients, such as LLMs, to interact with language servers for codebase navigation. It exposes standard language server features—like go to definition, references, rename, and diagnostics—over MCP for seamless integration with AI tooling. The server supports multiple languages by serving as a proxy to underlying language servers, including gopls, rust-analyzer, and pyright.

    • 1,256
    • MCP
    • isaacphi/mcp-language-server