discogs-mcp-server

Expose Discogs catalog operations via the Model Context Protocol

46 Stars · 3 Forks · 46 Watchers · 0 Issues
Discogs MCP Server enables music catalog operations and search functionality for the Discogs API using the Model Context Protocol (MCP). It allows integration with popular AI model clients such as Claude, LibreChat, and LM Studio by conforming to MCP standards. Built with FastMCP, it provides easy configuration, inspection tools, and Docker support for seamless deployment. The server facilitates user authentication, collection editing, and context-rich tool access for music data management.

Key Features

MCP-compliant server for the Discogs API
Music catalog search and management
Integration with Claude, LibreChat, and LM Studio
User authentication via personal access token
Support for reading and editing user collections
Docker-based deployment option
Tool inspection with MCP Inspector
Quickstart via NPX or local development
Extensive environment variable configuration
Customizable per-page data fetch limit

Use Cases

Enabling LLM-based tools to access and manage Discogs music catalogs
Integrating Discogs catalog functions within AI desktop clients
Allowing users to search and edit their Discogs music collections with AI model assistants
Providing a standardized MCP server for music collection applications
Facilitating inspection and debugging of Discogs API interactions via MCP Inspector
Automating personal music inventory management tasks
Securing catalog operations with granular access tokens
Rapid server setup for music tool prototyping and testing
Extending AI assistants with contextual Discogs music data
Deploying scalable Discogs data interfaces using Docker

README


Discogs MCP Server

MCP Server for the Discogs API, enabling music catalog operations, search functionality, and more.

Quickstart

If you just want to get started immediately using this MCP server with the Claude desktop app, and don't care about development or running the server yourself, make sure you have Node.js installed, have your Discogs personal access token ready, and skip straight to the Claude configuration section. Use the NPX method from that section.

Acknowledgements

This MCP server is built using FastMCP, a TypeScript framework for building MCP servers. For more information about MCP and how to use MCP servers, please refer to the FastMCP documentation and the official MCP documentation.

Available Tools

Check out the list of available tools: TOOLS.md

Caveats

  • The Discogs API documentation is not perfect and some endpoints may not be fully documented or may have inconsistencies.
  • Due to the vast number of API endpoints and response types, it's not feasible to verify type safety for every possible response. Please report any type-related issues you encounter.
  • This MCP server allows for editing data in your Discogs collection. Please use with caution and verify your actions before executing them.
  • The Discogs API per_page default is 50, which can be more data than some clients handle effectively, so this project sets discogs.config.defaultPerPage to 5. You can request more data in your prompts, but be aware that some clients may struggle with larger responses.
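To make the per-page caveat concrete, here is a minimal sketch of how such a fallback might be resolved. The names (`discogsConfig`, `resolvePerPage`) are hypothetical, not this project's actual exports; only the defaults (5 project-side, 50 API-side) come from the caveat above, and 100 is the Discogs API's documented per_page maximum:

```typescript
// Illustrative sketch only: `discogsConfig` and `resolvePerPage` are
// hypothetical names, not this project's actual exports.
const discogsConfig = {
  defaultPerPage: 5, // this project's default (the Discogs API default is 50)
  maxPerPage: 100,   // the Discogs API's documented per_page maximum
};

// Pick the per_page for a request: use the caller's value when provided,
// clamped to a sane range; otherwise fall back to the small project default.
function resolvePerPage(requested?: number): number {
  if (requested === undefined) return discogsConfig.defaultPerPage;
  return Math.min(Math.max(1, Math.floor(requested)), discogsConfig.maxPerPage);
}
```

Keeping the fallback small limits how much data a client has to ingest per tool call, while still letting a prompt ask for more explicitly.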

Prerequisites

  • Node.js (tested with Node.js 20.x.x, but 18.x.x should work as well)
    • Check your Node.js version with: node --version
  • Docker (optional, for running a local docker image without having to deal with Node or dependencies)

Setup

  1. Clone the repository
  2. Create a .env file in the root directory based on .env.example
  3. Set the following required environment variables in your .env:
    • DISCOGS_PERSONAL_ACCESS_TOKEN: Your Discogs personal access token

To get your Discogs personal access token, go to your Discogs Settings > Developers page and find your token or generate a new one. DO NOT SHARE YOUR TOKEN. OAuth support will be added in a future release.

The other environment variables in .env.example are optional and have sensible defaults, so you don't need to set them unless you have specific requirements.
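As a concrete example, a minimal .env contains only the required variable (the placeholder is not a real token):

```
# Minimal .env: replace the placeholder with your Discogs personal access token
DISCOGS_PERSONAL_ACCESS_TOKEN=<YOUR_TOKEN>
```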

Running the Server Locally

Option 1: Local Development

  1. Install dependencies:

     ```bash
     pnpm install
     ```
  2. Available commands:

    • pnpm run dev: Start the development server with hot reloading
    • pnpm run dev:stream: Start the development server with hot reloading in HTTP streaming mode
    • pnpm run build: Build the production version
    • pnpm run start: Run the production build
    • pnpm run inspect: Run the MCP Inspector (see Inspection section)
    • pnpm run format: Check code formatting (prettier)
    • pnpm run lint: Run linter (eslint)
    • pnpm run test: Run vitest
    • pnpm run test:coverage: Run vitest v8 coverage
    • pnpm run version:check: Checks that the package.json version and src/version.ts match

Option 2: Docker

  1. Build the Docker image:

     ```bash
     docker build -t discogs-mcp-server:latest .
     ```

  2. Run the container:

     ```bash
     docker run --env-file .env discogs-mcp-server:latest
     ```

     For HTTP streaming transport mode:

     ```bash
     # The port should match what is in your .env file
     docker run --env-file .env -p 3001:3001 discogs-mcp-server:latest stream
     ```

Inspection

Run the MCP Inspector to test your local MCP server:

```bash
pnpm run inspect
```

This will start the MCP Inspector at http://127.0.0.1:6274. Visit this URL in your browser to interact with your local MCP server.

For more information about the MCP Inspector, visit the official documentation.

MCP Clients

More client examples will be added in the future. If you'd like configuration for a specific client, either request it by opening a new issue or create a pull request to edit this section of the README yourself.

Claude Desktop Configuration

Find your claude_desktop_config.json at Claude > Settings > Developer > Edit Config and depending on which option you'd like, add JUST ONE of the following:

NPX

Running it straight from the npm registry.

```json
{
  "mcpServers": {
    "discogs": {
      "command": "npx",
      "args": [
        "-y",
        "discogs-mcp-server"
      ],
      "env": {
        "DISCOGS_PERSONAL_ACCESS_TOKEN": "<YOUR_TOKEN>"
      }
    }
  }
}
```

Local Node

Dependencies should be installed before using this method (pnpm install).

```json
{
  "mcpServers": {
    "discogs": {
      "command": "npx",
      "args": [
        "tsx",
        "/PATH/TO/YOUR/PROJECT/FOLDER/src/index.ts"
      ],
      "env": {
        "DISCOGS_PERSONAL_ACCESS_TOKEN": "<YOUR_TOKEN>"
      }
    }
  }
}
```

Docker

The Docker image should be built before using this method.

```json
{
  "mcpServers": {
    "discogs": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "--env-file",
        "/PATH/TO/YOUR/PROJECT/FOLDER/.env",
        "discogs-mcp-server:latest"
      ]
    }
  }
}
```

Any changes to local code require a Claude restart to take effect. Also, Claude requires human-in-the-loop interaction before an MCP tool can run, so every time a new tool is accessed, Claude will ask for permission. You usually only have to do this once per tool per chat. If you are using the free version, long chats may produce more frequent errors when running tools, since Claude limits the amount of context within a single chat.

LibreChat

In the librechat.yaml configuration file, add this under the mcpServers section:

```yaml
discogs:
  type: stdio
  command: npx
  args: ["-y", "discogs-mcp-server"]
  env:
    DISCOGS_PERSONAL_ACCESS_TOKEN: YOUR_TOKEN_GOES_HERE
```

LM Studio

Open the Chat Settings. In the Program tab there is a dropdown that defaults to Install. Select Edit mcp.json and add this under the mcpServers section:

```json
"discogs": {
  "command": "npx",
  "args": [
    "-y",
    "discogs-mcp-server"
  ],
  "env": {
    "DISCOGS_PERSONAL_ACCESS_TOKEN": "YOUR_TOKEN_GOES_HERE"
  }
}
```

After you save, the Program tab should show an mcp/discogs toggle to enable the server. Each chat box also has an Integrations menu where you can enable MCP servers.

TODO

  • OAuth support
  • Missing tools:
    • Inventory uploading

License

This MCP server is licensed under the MIT License. This means you are free to use, modify, and distribute the software, subject to the terms and conditions of the MIT License. For more details, please see the LICENSE file in the project repository.


Repository Owner

cswkim (User)

Repository Details

Language TypeScript
Default Branch main
Size 1,267 KB
Contributors 2
License MIT License
MCP Verified Sep 2, 2025

Programming Languages

TypeScript
99.4%
JavaScript
0.53%
Dockerfile
0.07%



Related MCPs

Discover similar Model Context Protocol servers

  • quran-mcp-server: MCP server to access Quran.com API with AI tool compatibility.

    quran-mcp-server exposes the Quran.com corpus and associated data through a Model Context Protocol (MCP) server generated from an OpenAPI specification. It provides tool endpoints for chapters, verses, translations, tafsirs, audio, languages, and more. The server is designed for seamless integration with large language models (LLMs) and AI tools, supporting both Docker and Node.js environments. Advanced logging features and flexible deployment options are included for debugging and development.

    49 · MCP · djalal/quran-mcp-server

  • 1mcp-app/agent: A unified server that aggregates and manages multiple Model Context Protocol servers.

    1MCP Agent provides a single, unified interface that aggregates multiple Model Context Protocol (MCP) servers, enabling seamless integration and management of external tools for AI assistants. It acts as a proxy, managing server configuration, authentication, health monitoring, and dynamic server control with features like asynchronous loading, tag-based filtering, and advanced security options. Compatible with popular AI development environments, it simplifies setup by reducing redundant server instances and resource usage. Users can configure, monitor, and scale model tool integrations across various AI clients through easy CLI commands or Docker deployment.

    96 · MCP · 1mcp-app/agent

  • mcp-server-templates: Deploy Model Context Protocol servers instantly with zero configuration.

    MCP Server Templates enables rapid, zero-configuration deployment of production-ready Model Context Protocol (MCP) servers using Docker containers and a comprehensive CLI tool. It provides a library of ready-made templates for common integrations, including filesystems, GitHub, GitLab, and Zendesk, and features intelligent caching, smart tool discovery, and flexible configuration options via JSON, YAML, environment variables, or CLI. Perfect for AI developers, data scientists, and DevOps teams, it streamlines the process of setting up and managing MCP servers and has evolved into the MCP Platform for enhanced capabilities.

    5 · MCP · Data-Everything/mcp-server-templates

  • mcp-open-library: Model Context Protocol server for accessing Open Library book and author data.

    Provides an implementation of a Model Context Protocol (MCP) server to enable AI assistants and clients to search and retrieve book and author information from the Open Library API. Supports searching by title, author name, and various identifiers, as well as fetching author photos and book covers. Returns structured, machine-readable data suitable for AI model context integration. Offers installation via Smithery, manual setup, and Docker deployment.

    34 · MCP · 8enSmith/mcp-open-library

  • mcp-server-js: Enable secure, AI-driven process automation and code execution on YepCode via Model Context Protocol.

    YepCode MCP Server acts as a Model Context Protocol (MCP) server that facilitates seamless communication between AI platforms and YepCode's workflow automation infrastructure. It allows AI assistants and clients to execute code, manage environment variables, and interact with storage through standardized tools. The server can expose YepCode processes directly as MCP tools and supports both hosted and local installations via NPX or Docker. Enterprise-grade security and real-time interaction make it suitable for integrating advanced automation into AI-powered environments.

    31 · MCP · yepcode/mcp-server-js

  • mcpmcp-server: Seamlessly discover, set up, and integrate MCP servers with AI clients.

    mcpmcp-server enables users to discover, configure, and connect MCP servers with preferred clients, optimizing AI integration into daily workflows. It supports streamlined setup via JSON configuration, ensuring compatibility with various platforms such as Claude Desktop on macOS. The project simplifies the connection process between AI clients and remote Model Context Protocol servers. Users are directed to an associated homepage for further platform-specific guidance.

    17 · MCP · glenngillen/mcpmcp-server