# 📢 MCP Notify Server
An MCP server that sends desktop notifications with a sound effect when agent tasks are completed.
## 🥩 Features
- Send system desktop notifications when agent tasks complete
- Play an alert sound to grab the user's attention (a sound file is bundled)
- Cross-platform support (Windows, macOS, Linux)
- Based on the standard MCP protocol; integrates with various LLM clients
## ⏬ Installation
Install using the uv package manager:

```shell
git clone https://github.com/Cactusinhand/mcp_server_notify.git
cd mcp_server_notify
uv venv
source .venv/Scripts/activate   # Windows (Git Bash); on Linux/macOS: source .venv/bin/activate
uv pip install mcp-server-notify
# or
pip install mcp-server-notify
```
After installation, call the module directly to check if installation was successful:

```shell
python -m mcp_server_notify
```
The module accepts the --debug and --log-file options:

```shell
python -m mcp_server_notify --debug
python -m mcp_server_notify --debug --log-file=path/to/logfile.log
```
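For a quick programmatic smoke test, you can also drive the server through the official MCP Python SDK. This is only an illustrative sketch: the tool name `send_notification` matches what the VSCode integration exposes below, but the argument names (`title`, `message`) are assumptions, so check the schema returned by `list_tools` for the real ones.

```python
# smoke_test.py: minimal client sketch using the MCP Python SDK (pip install mcp)
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the notify server over stdio, the same way an MCP client would
server_params = StdioServerParameters(command="python", args=["-m", "mcp_server_notify"])

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])
            # "title"/"message" are assumed argument names; consult the listed schema
            await session.call_tool(
                "send_notification",
                {"title": "Task done", "message": "Your agent task has finished."},
            )

asyncio.run(main())
```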
### ⚠️❕ Special requirements
**We use the Apprise API for desktop notification delivery, so a few platform-specific requirements must be installed on the desktop:**
#### Windows

```shell
# windows:// minimum requirements
pip install pywin32
```
#### macOS

```shell
# Make sure terminal-notifier is installed on your system
brew install terminal-notifier
```
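Notification delivery goes through Apprise, so a stripped-down version of what happens on each platform looks roughly like the sketch below. This is illustrative only, not the server's actual code; the platform-to-URL mapping is an assumption based on Apprise's desktop notification plugins (`windows://`, `macosx://`, `dbus://`).

```python
# notify_sketch.py: minimal Apprise example (pip install apprise), not the server's implementation
import platform

import apprise

# Assumed mapping from platform to Apprise desktop notification URL
DESKTOP_URLS = {
    "Windows": "windows://",  # requires pywin32
    "Darwin": "macosx://",    # requires terminal-notifier
    "Linux": "dbus://",       # requires a D-Bus capable desktop
}

notifier = apprise.Apprise()
notifier.add(DESKTOP_URLS.get(platform.system(), "dbus://"))
notifier.notify(title="Agent task finished", body="Your LLM task has completed.")
```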
## 📚 Usage
### Using with Claude Desktop

Find the configuration file `claude_desktop_config.json` and add:

```json
{
  "mcpServers": {
    "NotificationServer": {
      "command": "uv",
      "args": [
        "--directory",
        "path/to/your/mcp_server_notify project",
        "run",
        "mcp-server-notify"
      ]
    }
  }
}
```
If installed globally, you can also use the python command:

```json
{
  "mcpServers": {
    "NotificationServer": {
      "command": "python",
      "args": [
        "-m",
        "mcp_server_notify"
      ]
    }
  }
}
```
### ⚡️ Using with Cursor

Find the configuration file `~/.cursor/mcp.json` or `your_project/.cursor/mcp.json` and add:

```json
{
  "mcpServers": {
    "NotificationServer": {
      "command": "uv",
      "args": [
        "--directory",
        "path/to/your/mcp_server_notify project",
        "run",
        "mcp-server-notify"
      ]
    }
  }
}
```
After configuration, simply add a prompt like "Finally, send me a notification when the task is finished." at the end of your task input to trigger a notification.
In Cursor, you can add this prompt as a rule in Cursor Settings -> Rules so you don't have to type it manually each time.
### ⚡️ Using with VSCode + Copilot

1. Install the service manager uv/uvx:

   ```shell
   pip install uv
   ```

2. Add the service to your VSCode settings:

   - Windows: `%APPDATA%\Code\User\settings.json`
   - macOS: `$HOME/Library/Application Support/Code/User/settings.json`
   - Linux: `$HOME/.config/Code/User/settings.json`

   ```json
   "mcp": {
     "servers": {
       "notifier": {
         "command": "uvx",
         "args": ["mcp-server-notify"],
         "env": {}
       }
     }
   }
   ```

3. Make sure you are using the latest VSCode version; it runs MCP services automatically.
4. Open VSCode, enable Copilot, and switch to agent mode.
5. Type `#` and you will see the `#send_notification` option.
6. Ask the agent to run `#send_notification`; it will handle the notification automatically.
7. Copilot in agent mode can now send desktop notifications.
## 🐳 Running with Docker
Currently not available due to environment compatibility issues. Having a Docker container trigger notifications on the host, regardless of whether the host runs Windows, macOS, or Linux, makes the solution much more complex, and using native notifications directly from a container is usually not feasible.
Main issues:

- **OS-specific notification systems**: each operating system (Windows, macOS, Linux) has its own notification mechanism.
- **Docker isolation**: container isolation limits direct access to host operating system resources.
- **Dependency management**: each operating system would need its own notification libraries and dependencies handled inside the container.
## 🧾 License
MIT
## 💻 Contributions
Issues and pull requests are welcome!