Daisys MCP server

A beta server implementation for the Model Context Protocol supporting audio context with Daisys integration.

10 Stars · 5 Forks · 10 Watchers · 2 Issues
Daisys MCP server provides a beta implementation of the Model Context Protocol (MCP), enabling seamless integration between the Daisys AI platform and various MCP clients. It lets users connect MCP-compatible clients to Daisys via configurable authentication and environment settings, with out-of-the-box support for audio file storage and playback. The server is designed to be extensible, supporting both user-level deployments and developer contributions, and follows best practices for secure authentication and dependency management.

Key Features

Implements Model Context Protocol server functionality
Seamless integration with Daisys AI user accounts
Audio context storage and management
Supports major MCP clients (Claude Desktop, Cursor, VSCode, mcp-cli)
Configuration via environment variables and MCP client config
Beta quality with community contribution guidelines
Cross-platform dependency setup (macOS, Linux)
Virtual environment and dependency isolation using uv
Built-in support for security and credential handling
Automated and manual testing capabilities

Use Cases

Enabling MCP audio context for AI assistants using Daisys accounts
Extending MCP client functionality with audio operations
Integrating Daisys authentication with third-party AI tools
Secure storage of AI-driven audio interactions
Local and cloud-based audio model context workflows
Rapid prototyping for developers working with MCP servers
Testing and debugging Model Context Protocol integrations
Customizing audio file storage paths for context data
Automating server deployment alongside MCP clients
Collaborative development with open-source MCP standards

README

Daisys MCP server

Daisys-mcp is in beta and doesn't have a stable release yet, but you can try it out as follows:

  1. Get an account on Daisys and create a username and password.

If you are on macOS, run the following command:

bash
brew install portaudio

If you are on Linux, run the following command:

bash
sudo apt install portaudio19-dev libjack-dev
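
As an optional sanity check after installing PortAudio, the Python audio bindings can enumerate your devices. The one-liner below assumes the pyaudio package; the binding daisys-mcp actually uses may differ, so treat it as a sketch rather than a required step.

bash
python -c "import pyaudio; p = pyaudio.PyAudio(); print(p.get_device_count(), 'audio devices'); p.terminate()"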
  2. Add the following configuration to the MCP config file of your MCP client (Claude Desktop, Cursor, mcp-cli, mcp-vscode, etc.):
json
{
  "mcpServers": {
    "daisys-mcp": {
      "command": "uvx",
      "args": ["daisys-mcp"],
      "env": {
        "DAISYS_EMAIL": "{Your Daisys Email}",
        "DAISYS_PASSWORD": "{Your Daisys Password}",
        "DAISYS_BASE_STORAGE_PATH": "{Path where you want to store your audio files}"
      }
    }
  }
}
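
Where this config file lives depends on the client: for Claude Desktop it is typically claude_desktop_config.json (under ~/Library/Application Support/Claude on macOS or %APPDATA%\Claude on Windows), while other clients keep their MCP server configuration in their own settings files.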

To build from source:

  1. Clone the repository: git clone https://github.com/daisys-ai/daisys-mcp.git

  2. cd into the repository: cd daisys-mcp

  3. Install uv (a Python package manager) with curl -LsSf https://astral.sh/uv/install.sh | sh, or see the uv repo for additional install methods.

  4. Create a virtual environment and install dependencies using uv:

bash
uv venv
# On Windows: source .venv/Scripts/activate
source .venv/bin/activate  # macOS and Linux
uv pip install -e .
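
As an optional check that the editable install worked, the package can be imported through uv before configuring a client (the module name daisys_mcp is taken from the run command in the config below):

bash
uv run python -c "import daisys_mcp"  # exits silently if the package is importable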
  5. Add the following to your config file in your MCP client (Claude Desktop, Cursor, mcp-cli, mcp-vscode, etc.):
json
{
    "mcpServers": {
        "daisys-mcp": {
            "command": "uv",
            "args": [
                "--directory",
                "{installation_path}/daisys-mcp",
                "run",
                "-m",
                "daisys_mcp.server"
            ],
            "env": {
                "DAISYS_EMAIL": "{Your Daisys Email}",
                "DAISYS_PASSWORD": "{Your Daisys Password}",
                "DAISYS_BASE_STORAGE_PATH": "{Path where you want to store your audio files}"
            }
        }
    }
}
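
Replace {installation_path} with the absolute path of the directory you cloned the repository into, so that the --directory argument points at your checkout.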

Common Issues

If you run into issues with PortAudio on Linux, you can try installing it manually:

bash
sudo apt-get update
sudo apt-get install -y portaudio19-dev
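
If the build still cannot find PortAudio, pkg-config can show whether the development files are visible. The module name portaudio-2.0 is what portaudio19-dev normally provides; check your distribution's package contents if the query fails.

bash
pkg-config --modversion portaudio-2.0  # prints a version number if the headers are installed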

Contributing

If you want to contribute or run from source:

  1. Clone the repository:
bash
git clone https://github.com/daisys-ai/daisys-mcp.git
cd daisys-mcp
  2. Create a virtual environment and install dependencies using uv:
bash
uv venv
source .venv/bin/activate
uv pip install -e .
uv pip install -e ".[dev]"
  3. Copy .env.example to .env and add your Daisys username and password:
bash
cp .env.example .env
# Edit .env and add your DAISYS username and password
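
As a rough sketch, the resulting .env carries the same credentials that the MCP client configs above pass as environment variables; the variable names below are taken from those configs, and .env.example in the repository remains authoritative.

bash
# Example .env contents (hypothetical values)
DAISYS_EMAIL=you@example.com
DAISYS_PASSWORD=your-password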
  4. Test the server by running the tests:
bash
uv run pytest

You can also run the full integration tests with:

bash
uv run pytest -m 'requires_credentials' # ⚠️ Running the full integration tests does cost tokens on the Daisys platform
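
Conversely, to be certain no Daisys tokens are spent, the same marker can be negated using standard pytest selection:

bash
uv run pytest -m 'not requires_credentials'  # skips the tests marked as requiring Daisys credentials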
  5. Debug and test locally with the MCP Inspector: uv run mcp dev daisys_mcp/server.py
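
The mcp dev command comes from the MCP Python SDK's CLI and launches the MCP Inspector, a browser-based UI in which the server's tools can be listed and invoked interactively.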

Repository Owner

daisys-ai (Organization)

Repository Details

Language: Python
Default Branch: main
Size: 141 KB
Contributors: 3
MCP Verified: Nov 12, 2025

Programming Languages

Python: 97.71%
Dockerfile: 2.29%

Related MCPs

Discover similar Model Context Protocol servers

  • Unichat MCP Server

    Universal MCP server providing context-aware AI chat and code tools across major model vendors.

    Unichat MCP Server enables sending standardized requests to leading AI model vendors, including OpenAI, MistralAI, Anthropic, xAI, Google AI, DeepSeek, Alibaba, and Inception, utilizing the Model Context Protocol. It features unified endpoints for chat interactions and provides specialized tools for code review, documentation generation, code explanation, and programmatic code reworking. The server is designed for seamless integration with platforms like Claude Desktop and installation via Smithery. Vendor API keys are required for secure access to supported providers.

    • 37
    • MCP
    • amidabuddha/unichat-mcp-server
  • OpenAI MCP Server

    Bridge between Claude and OpenAI models using the MCP protocol.

    OpenAI MCP Server enables direct querying of OpenAI language models from Claude via the Model Context Protocol (MCP). It provides a configurable Python server that exposes OpenAI APIs as MCP endpoints. The server is designed for seamless integration, requiring simple configuration updates and environment variable setup. Automated testing is supported to verify connectivity and response from the OpenAI API.

    • 77
    • MCP
    • pierrebrunelle/mcp-server-openai
  • mcp-server-home-assistant

    A Model Context Protocol Server integration for Home Assistant.

    Provides an MCP server interface for Home Assistant, enabling context sharing between Home Assistant and AI models through the Model Context Protocol. Allows users to connect Claude Desktop and similar tools to Home Assistant via a WebSocket API and secure API token. Facilitates seamless integration by leveraging a custom Home Assistant component that is migrating into Home Assistant Core. Enables access and manipulation of smart home context data in standardized ways.

    • 64
    • MCP
    • allenporter/mcp-server-home-assistant
  • Azure DevOps MCP Server

    Standardized AI access to Azure DevOps via Model Context Protocol.

    Implements the Model Context Protocol (MCP) to enable AI assistants to securely and efficiently interact with Azure DevOps resources. Provides a standardized bridge for managing projects, work items, repositories, pull requests, and pipelines through natural language interfaces. Supports modular authentication and a feature-based architecture for scalability and integration. Facilitates seamless integration with AI tools such as Claude Desktop and Cursor AI.

    • 306
    • MCP
    • Tiberriver256/mcp-server-azure-devops
  • Kanboard MCP Server

    MCP server for seamless AI integration with Kanboard project management.

    Kanboard MCP Server is a Go-based server implementing the Model Context Protocol (MCP) for integrating AI assistants with the Kanboard project management system. It enables users to manage projects, tasks, users, and workflows in Kanboard directly via natural language commands through compatible AI tools. With built-in support for secure authentication and high performance, it facilitates streamlined project operations between Kanboard and AI-powered clients like Cursor or Claude Desktop. The server is configurable and designed for compatibility with MCP standards.

    • 15
    • MCP
    • bivex/kanboard-mcp
  • Perplexity MCP Server

    MCP Server integration for accessing the Perplexity API with context-aware chat completion.

    Perplexity MCP Server provides a Model Context Protocol (MCP) compliant server that interfaces with the Perplexity API, enabling chat completion with citations. Designed for seamless integration with clients such as Claude Desktop, it allows users to send queries and receive context-rich responses from Perplexity. Environment configuration for API key management is supported, and limitations with long-running requests are noted. Future updates are planned to enhance support for client progress reporting.

    • 85
    • MCP
    • tanigami/mcp-server-perplexity