vrchat-mcp
A Model Context Protocol (MCP) server for structured VRChat API interaction.
This project is a Model Context Protocol (MCP) server for interacting with the VRChat API. It allows you to retrieve various information from VRChat using a standardized protocol.
Overview
The VRChat MCP server provides a way to access VRChat's API endpoints in a structured manner. It supports a wide range of functionalities, including user authentication, retrieving user and friend information, accessing avatar and world data, and more.
Usage
To start the server, ensure you have the necessary environment variables set:
export VRCHAT_USERNAME=your_username
export VRCHAT_AUTH_TOKEN=your_auth_token
[!NOTE]
How to obtain an auth token

You can use the following command to log in and obtain an auth token:

$ npx vrchat-auth-token-checker
VRChat Username: your-username
Password: ********
# If 2FA is enabled
2FA Code: 123456
# Success output
Auth Token: authcookie-xxxxx

Please handle the obtained token with care, as it has a very long lifetime.
Then, run the following command:
npx vrchat-mcp
This will launch the MCP server, allowing you to interact with the VRChat API through the defined tools.
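For example, once the server is running, any MCP-compatible client can connect to it over stdio. The sketch below (not part of this package) uses the official @modelcontextprotocol/sdk TypeScript client to spawn vrchat-mcp and list the tools it exposes; the client name, version, and logging are illustrative placeholders.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Spawn the vrchat-mcp server over stdio, forwarding the credentials
  // it expects as environment variables (PATH is included so npx resolves).
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["vrchat-mcp"],
    env: {
      PATH: process.env.PATH ?? "",
      VRCHAT_USERNAME: process.env.VRCHAT_USERNAME ?? "",
      VRCHAT_AUTH_TOKEN: process.env.VRCHAT_AUTH_TOKEN ?? "",
    },
  });

  // Client name and version are placeholders for this example.
  const client = new Client({ name: "example-client", version: "0.1.0" });
  await client.connect(transport);

  // Ask the server which tools it exposes (see "Available Tools" below).
  const { tools } = await client.listTools();
  console.log(tools.map((tool) => tool.name));

  await client.close();
}

main().catch(console.error);
```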
Usage with Claude Desktop
To use this MCP server with Claude Desktop, you do not need to run npx vrchat-mcp manually. Instead, add the following configuration to your Claude Desktop config file:
- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json
{
"mcpServers": {
"vrchat-mcp": {
"command": "npx",
"args": ["vrchat-mcp"],
"env": {
"VRCHAT_USERNAME": "your-username",
"VRCHAT_AUTH_TOKEN": "your-auth-token"
}
}
}
}
Then, start Claude Desktop as usual. If you use nodenv or nvm, you may need to specify the full path to the npx command.
Available Tools
This Model Context Protocol server provides the following VRChat-related tools (an example call is sketched after the list):
User Related
- vrchat_get_friends_list: Get a list of friends
- vrchat_send_friend_request: Send a friend request
Avatar Related
- vrchat_search_avatars: Search for avatars
- vrchat_select_avatar: Select and switch to a specific avatar
World Related
- vrchat_search_worlds: Search for worlds
- vrchat_list_favorited_worlds: Get a list of favorited worlds
Instance Related
- vrchat_create_instance: Create a new instance
- vrchat_get_instance: Get information about a specific instance
Group Related
- vrchat_search_groups: Search for groups
- vrchat_join_group: Join a group
Favorites Related
- vrchat_list_favorites: Get a list of favorites
- vrchat_add_favorite: Add a new favorite
- vrchat_list_favorite_groups: Get a list of favorite groups
Invite Related
- vrchat_list_invite_messages: Get a list of invite messages
- vrchat_request_invite: Request an invite
- vrchat_get_invite_message: Get a specific invite message
Notification Related
- vrchat_get_notifications: Get a list of notifications
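As a rough illustration, continuing the client sketch above (inside main(), before client.close()), one of these tools can be invoked with a standard MCP tools/call request. The argument key query used here is an assumption; the real parameters for each tool are described by its input schema in the listTools() response.

```typescript
// Hypothetical example: search for worlds by name. The argument key
// "query" is an assumption; check the tool's inputSchema (returned by
// listTools()) for the parameters vrchat-mcp actually accepts.
const result = await client.callTool({
  name: "vrchat_search_worlds",
  arguments: { query: "tutorial" },
});

// Tool results come back as MCP content items (typically text) under result.content.
console.log(JSON.stringify(result, null, 2));
```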
Debugging
First, build the project:
npm install
npm run build
Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.
You can launch the MCP Inspector via npm with this command:
npx @modelcontextprotocol/inspector "./dist/main.js"
Make sure the environment variables (VRCHAT_USERNAME and VRCHAT_AUTH_TOKEN) are properly configured.
Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.
Publishing
To publish a new version of the package, follow these steps:
- Pull the latest code from the main branch

  git checkout main
  git pull origin main

- Build the package

  npm run build

- Publish to npm

  npm publish

- Push changes to the remote repository

  git push origin main --tags
Contributing
Contributions are welcome! Please fork the repository and submit a pull request for any improvements or bug fixes.
License
This project is licensed under the MIT License. See the LICENSE file for details.
Related MCPs
Discover similar Model Context Protocol servers
Unichat MCP Server
Universal MCP server providing context-aware AI chat and code tools across major model vendors.
Unichat MCP Server enables sending standardized requests to leading AI model vendors, including OpenAI, MistralAI, Anthropic, xAI, Google AI, DeepSeek, Alibaba, and Inception, utilizing the Model Context Protocol. It features unified endpoints for chat interactions and provides specialized tools for code review, documentation generation, code explanation, and programmatic code reworking. The server is designed for seamless integration with platforms like Claude Desktop and installation via Smithery. Vendor API keys are required for secure access to supported providers.
- ⭐ 37
- MCP
- amidabuddha/unichat-mcp-server
OpsLevel MCP Server
Read-only MCP server for integrating OpsLevel data with AI tools.
OpsLevel MCP Server implements the Model Context Protocol to provide AI tools with a secure way to access and interact with OpsLevel account data. It supports read-only operations for a wide range of OpsLevel resources such as actions, campaigns, checks, components, documentation, domains, and more. The tool is compatible with popular environments including Claude Desktop and VS Code, enabling easy integration via configuration and API tokens. Installation options include Homebrew, Docker, and standalone binaries.
- ⭐ 8
- MCP
- OpsLevel/opslevel-mcp
Postmancer
A standalone MCP server for API testing and management via AI assistants.
Postmancer is a Model Context Protocol (MCP) server designed to facilitate API testing and management through natural language interactions with AI assistants. It enables HTTP requests, organizes API endpoints into collections, and provides tools for managing environment variables, authentication, and request history. Postmancer is particularly aimed at integrating with AI platforms like Claude for seamless, automated API workflows.
- ⭐ 28
- MCP
- hijaz/postmancer
Aiven MCP Server
Model Context Protocol server enabling LLMs to access and manage Aiven cloud data services.
Aiven MCP Server implements the Model Context Protocol (MCP) to provide secure access to Aiven's PostgreSQL, Kafka, ClickHouse, Valkey, and OpenSearch services. It enables Large Language Models (LLMs) to seamlessly integrate and interact with these cloud data platforms, supporting full stack solution development. The server offers streamlined tools for project and service management via standardized APIs and supports integration with platforms like Claude Desktop and Cursor. Environment variable configuration and explicit permission controls are used to ensure secure and flexible operations.
- ⭐ 11
- MCP
- Aiven-Open/mcp-aiven
mcp-server-chatsum
Summarize and query chat messages using the MCP Server protocol.
mcp-server-chatsum is an MCP Server designed to summarize and query chat messages. It provides tools to interact with chat data, enabling users to extract and summarize message content based on specified prompts. The server can be integrated with Claude Desktop and supports communication over stdio, offering dedicated debugging tools via the MCP Inspector. Environment variable support and database integration ensure flexible deployment for chat data management.
- ⭐ 1,024
- MCP
- chatmcp/mcp-server-chatsum
mcp-confluent
MCP server for managing Confluent Cloud resources via natural language.
mcp-confluent is a Model Context Protocol (MCP) server implementation designed to enable natural language interaction with Confluent Cloud REST APIs. It integrates with AI tools such as Claude Desktop and Goose CLI, allowing users to manage Kafka topics, connectors, and Flink SQL statements conversationally. The project offers flexible configuration, CLI usage, and supports various transports for secure and customizable operations.
- ⭐ 115
- MCP
- confluentinc/mcp-confluent