
Discogs MCP Server
MCP Server for the Discogs API, enabling music catalog operations, search functionality, and more.
Quickstart
If you just want to get started immediately using this MCP server with the Claude desktop app, and don't care about development or running the server yourself, make sure you have Node.js installed and your Discogs personal access token ready, then skip straight to the Claude Desktop Configuration section and use the NPX method from that section.
Table of Contents
- Acknowledgements
- Available Tools
- Caveats
- Prerequisites
- Setup
- Running the Server Locally
- Inspection
- MCP Clients
- TODO
- License
Acknowledgements
This MCP server is built using FastMCP, a TypeScript framework for building MCP servers. For more information about MCP and how to use MCP servers, please refer to the FastMCP documentation and the official MCP documentation.
Available Tools
Check out the list of available tools: TOOLS.md
Caveats
- The Discogs API documentation is not perfect and some endpoints may not be fully documented or may have inconsistencies.
- Due to the vast number of API endpoints and response types, it's not feasible to verify type safety for every possible response. Please report any type-related issues you encounter.
- This MCP server allows for editing data in your Discogs collection. Please use with caution and verify your actions before executing them.
- The Discogs API per_page default is 50, which can be too much data for some clients to process effectively, so this project sets a discogs.config.defaultPerPage value of 5. You can request more data in your prompts, but be aware that some clients may struggle with larger responses (see the example request below).
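For context, here is a minimal sketch of a raw Discogs search request with explicit pagination parameters. This is not code from this project; the query, token, and User-Agent values are placeholders.

curl "https://api.discogs.com/database/search?q=Aphex+Twin&per_page=5&page=1" \
  -H "Authorization: Discogs token=YOUR_TOKEN" \
  -H "User-Agent: discogs-mcp-server-example/1.0"

With per_page=5 the response stays small enough for most MCP clients to handle, whereas the API default of 50 can overwhelm them.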
Prerequisites
- Node.js (tested with Node.js 20.x.x, but 18.x.x should work as well)
  - Check your Node.js version with: node --version
- Docker (optional, for running a local Docker image without having to deal with Node or dependencies)
Setup
- Clone the repository
- Create a .env file in the root directory based on .env.example
- Set the following required environment variable in your .env:
  - DISCOGS_PERSONAL_ACCESS_TOKEN: Your Discogs personal access token

To get your Discogs personal access token, go to your Discogs Settings > Developers page and find your token or generate a new one. DO NOT SHARE YOUR TOKEN. OAuth support will be added in a future release.

The other environment variables in .env.example are optional and have sensible defaults, so you don't need to set them unless you have specific requirements.
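For reference, a minimal .env might look like the sketch below. The token value is a placeholder, and the optional variables from .env.example are omitted:

# Required
DISCOGS_PERSONAL_ACCESS_TOKEN=your_discogs_personal_access_token

# Optional variables (see .env.example for names and defaults) can be added as needed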
Running the Server Locally
Option 1: Local Development
- Install dependencies:
  pnpm install
- Available commands:
  - pnpm run dev: Start the development server with hot reloading
  - pnpm run dev:stream: Start the development server with hot reloading in HTTP streaming mode
  - pnpm run build: Build the production version
  - pnpm run start: Run the production build
  - pnpm run inspect: Run the MCP Inspector (see Inspection section)
  - pnpm run format: Check code formatting (prettier)
  - pnpm run lint: Run linter (eslint)
  - pnpm run test: Run vitest
  - pnpm run test:coverage: Run vitest with v8 coverage
  - pnpm run version:check: Check that the package.json version and src/version.ts match
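For instance, a typical local development loop using only the commands above might look like this:

pnpm install        # install dependencies once
pnpm run dev        # start the server over stdio with hot reloading
pnpm run dev:stream # or start it in HTTP streaming mode instead
pnpm run inspect    # launch the MCP Inspector to exercise the tools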
Option 2: Docker
- Build the Docker image:
  docker build -t discogs-mcp-server:latest .
- Run the container:
  docker run --env-file .env discogs-mcp-server:latest

For HTTP Streaming transport mode:

# The port should match what is in your .env file
docker run --env-file .env -p 3001:3001 discogs-mcp-server:latest stream
Inspection
Run the MCP Inspector to test your local MCP server:
pnpm run inspect
This will start the MCP Inspector at http://127.0.0.1:6274. Visit this URL in your browser to interact with your local MCP server.
For more information about the MCP Inspector, visit the official documentation.
MCP Clients
More client examples will be added in the future. If you'd like configuration for a specific client, either request it by opening a new issue or create a pull request to edit this section of the README yourself.
Claude Desktop Configuration
Find your claude_desktop_config.json at Claude > Settings > Developer > Edit Config and, depending on which option you'd like, add JUST ONE of the following:
NPX
This runs the server straight from the npm registry.
{
"mcpServers": {
"discogs": {
"command": "npx",
"args": [
"-y",
"discogs-mcp-server"
],
"env": {
"DISCOGS_PERSONAL_ACCESS_TOKEN": "<YOUR_TOKEN>"
}
}
}
}
Local Node
Dependencies should have been installed before you use this method (pnpm install).
{
"mcpServers": {
"discogs": {
"command": "npx",
"args": [
"tsx",
"/PATH/TO/YOUR/PROJECT/FOLDER/src/index.ts"
],
"env": {
"DISCOGS_PERSONAL_ACCESS_TOKEN": "<YOUR_TOKEN>"
}
}
}
}
Docker
The Docker image should have been built before using this method.
{
"mcpServers": {
"discogs": {
"command": "docker",
"args": [
"run",
"--rm",
"-i",
"--env-file",
"/PATH/TO/YOUR/PROJECT/FOLDER/.env",
"discogs-mcp-server:latest"
]
}
}
}
Any changes to local code require restarting Claude to take effect. Also, Claude requires human-in-the-loop interaction to allow an MCP tool to run, so every time a new tool is accessed Claude will ask for permission. You usually only have to do this once per tool per chat. If you're using the free version, long chats may result in more frequent errors when running tools, as Claude limits the amount of context within a single chat.
LibreChat
In the librechat.yaml configuration file, add this under the mcpServers section:
discogs:
type: stdio
command: npx
args: ["-y", "discogs-mcp-server"]
env:
DISCOGS_PERSONAL_ACCESS_TOKEN: YOUR_TOKEN_GOES_HERE
LM Studio
Go to the Chat Settings. In the Program tab there will be a dropdown with a default of Install. Select Edit mcp.json and add this under the mcpServers section:
"discogs": {
"command": "npx",
"args": [
"-y",
"discogs-mcp-server"
],
"env": {
"DISCOGS_PERSONAL_ACCESS_TOKEN": "YOUR_TOKEN_GOES_HERE"
}
}
After you save, the Program tab should now show an mcp/discogs toggle to enable the server. Within every chat box there is also an Integrations menu where you can enable MCP servers.
TODO
- OAuth support
- Missing tools:
- Inventory uploading
License
This MCP server is licensed under the MIT License. This means you are free to use, modify, and distribute the software, subject to the terms and conditions of the MIT License. For more details, please see the LICENSE file in the project repository.
Related MCPs
Discover similar Model Context Protocol servers

quran-mcp-server
MCP server to access Quran.com API with AI tool compatibility.
quran-mcp-server exposes the Quran.com corpus and associated data through a Model Context Protocol (MCP) server generated from an OpenAPI specification. It provides tool endpoints for chapters, verses, translations, tafsirs, audio, languages, and more. The server is designed for seamless integration with large language models (LLMs) and AI tools, supporting both Docker and Node.js environments. Advanced logging features and flexible deployment options are included for debugging and development.
- ⭐ 49
- MCP
- djalal/quran-mcp-server

1mcp-app/agent
A unified server that aggregates and manages multiple Model Context Protocol servers.
1MCP Agent provides a single, unified interface that aggregates multiple Model Context Protocol (MCP) servers, enabling seamless integration and management of external tools for AI assistants. It acts as a proxy, managing server configuration, authentication, health monitoring, and dynamic server control with features like asynchronous loading, tag-based filtering, and advanced security options. Compatible with popular AI development environments, it simplifies setup by reducing redundant server instances and resource usage. Users can configure, monitor, and scale model tool integrations across various AI clients through easy CLI commands or Docker deployment.
- ⭐ 96
- MCP
- 1mcp-app/agent

mcp-server-templates
Deploy Model Context Protocol servers instantly with zero configuration.
MCP Server Templates enables rapid, zero-configuration deployment of production-ready Model Context Protocol (MCP) servers using Docker containers and a comprehensive CLI tool. It provides a library of ready-made templates for common integrations—including filesystems, GitHub, GitLab, and Zendesk—and features intelligent caching, smart tool discovery, and flexible configuration options via JSON, YAML, environment variables, or CLI. Perfect for AI developers, data scientists, and DevOps teams, it streamlines the process of setting up and managing MCP servers and has evolved into the MCP Platform for enhanced capabilities.
- ⭐ 5
- MCP
- Data-Everything/mcp-server-templates

mcp-open-library
Model Context Protocol server for accessing Open Library book and author data.
Provides an implementation of a Model Context Protocol (MCP) server to enable AI assistants and clients to search and retrieve book and author information from the Open Library API. Supports searching by title, author name, and various identifiers, as well as fetching author photos and book covers. Returns structured, machine-readable data suitable for AI model context integration. Offers installation via Smithery, manual setup, and Docker deployment.
- ⭐ 34
- MCP
- 8enSmith/mcp-open-library

mcp-server-js
Enable secure, AI-driven process automation and code execution on YepCode via Model Context Protocol.
YepCode MCP Server acts as a Model Context Protocol (MCP) server that facilitates seamless communication between AI platforms and YepCode’s workflow automation infrastructure. It allows AI assistants and clients to execute code, manage environment variables, and interact with storage through standardized tools. The server can expose YepCode processes directly as MCP tools and supports both hosted and local installations via NPX or Docker. Enterprise-grade security and real-time interaction make it suitable for integrating advanced automation into AI-powered environments.
- ⭐ 31
- MCP
- yepcode/mcp-server-js

mcpmcp-server
Seamlessly discover, set up, and integrate MCP servers with AI clients.
mcpmcp-server enables users to discover, configure, and connect MCP servers with preferred clients, optimizing AI integration into daily workflows. It supports streamlined setup via JSON configuration, ensuring compatibility with various platforms such as Claude Desktop on macOS. The project simplifies the connection process between AI clients and remote Model Context Protocol servers. Users are directed to an associated homepage for further platform-specific guidance.
- ⭐ 17
- MCP
- glenngillen/mcpmcp-server