Yuque-MCP-Server

Seamless integration of Yuque knowledge base with Model-Context-Protocol for AI model context management.

31 Stars · 13 Forks · 31 Watchers · 5 Issues
Yuque-MCP-Server provides an MCP-compatible server for interacting with the Yuque knowledge base platform. It enables AI models to retrieve, manage, and analyze Yuque documents and user information through a standardized Model-Context-Protocol interface. The server supports operations such as document creation, reading, updating, deletion, advanced search, and team statistics retrieval, making it ideal for AI-powered workflows. Inspired by Figma-Context-MCP, it facilitates contextual awareness and dynamic knowledge management for AI applications.

Key Features

Retrieves user and document information from Yuque
Performs document creation, reading, updating, and deletion
Supports content search within Yuque
Fetches knowledge base and team statistics
Provides both HTTP and CLI stdio modes for server operation
Offers Docker and Docker Compose deployment options
Environment variable and runtime token configuration
Standardized interface via Model-Context-Protocol (MCP)
Integrates with Smithery for easy installation
Supports multi-format document content (Markdown, HTML, Lake)

Use Cases

Contextual document retrieval for AI models
Automated knowledge base management
Enhanced AI-powered writing assistants with Yuque integration
Real-time statistics and analytics for collaborative teams
Programmatic content search across multiple knowledge repositories
Dynamic document creation and editing workflows
Integration of Yuque data with conversational AI systems
Enabling AI agents to manage and organize knowledge assets
Automated report generation from Yuque contents
Secure multi-user knowledge collaboration with API-level control

README

Yuque MCP Server

English Version

A Model-Context-Protocol (MCP) server for integrating with the Yuque API. This implementation is inspired by Figma-Context-MCP and uses the Yuque Open API.

Overview

This server provides MCP tools for interacting with the Yuque knowledge base platform, allowing AI models to:

  • Retrieve user and document information
  • Create, read, update, and delete documents
  • Search content in Yuque
  • Retrieve knowledge base information
  • Retrieve statistics and analytics

Installation

Installing via Smithery

To install the Yuque MCP Server for Claude Desktop automatically via Smithery:

```bash
npx -y @smithery/cli install @HenryHaoson/Yuque-MCP-Server --client claude
```

Prerequisites

  • Node.js 18+ (recommended)
  • A Yuque account with an API token

Setup

  1. Clone this repository:

    git clone https://github.com/Henryhaoson/Yueque-MCP-Server.git
    cd Yueque-MCP-Server

  2. Install dependencies:

    npm install

  3. Create a .env file based on .env.example:

    cp .env.example .env

  4. (Optional) Add your Yuque API token to the .env file:

    YUQUE_API_TOKEN=your_yuque_api_token_here

    Instead of setting the token in the .env file, you can also provide it as a query parameter when connecting to the server.

Usage

Running the Server

Development Mode

```bash
# HTTP server mode
npm run dev

# CLI stdio mode
npm run dev:cli
```

Production Mode

First, build the project:

```bash
npm run build
```

Then run in HTTP or CLI mode:

```bash
# HTTP server mode
npm run start

# CLI stdio mode
npm run start:cli
```

Deploying with Docker

This project includes Docker support, making it easy to containerize and deploy the server.

Using Docker Compose (Recommended)

  1. Build and start the container:

    ```bash
    docker-compose up -d
    ```

  2. View the logs:

    ```bash
    docker-compose logs -f
    ```

  3. Stop the service:

    ```bash
    docker-compose down
    ```

You can set configuration options via environment variables or in a .env file:

```bash
# Example .env file
PORT=3000
YUQUE_API_TOKEN=your_token_here
YUQUE_API_BASE_URL=https://www.yuque.com/api/v2
```

Using Docker Manually

  1. Build the Docker image:

    ```bash
    docker build -t yuque-mcp-server .
    ```

  2. Run the container:

    ```bash
    docker run -d -p 3000:3000 --name yuque-mcp-server yuque-mcp-server
    ```

  3. Run with environment variables:

    ```bash
    docker run -d -p 3000:3000 \
      -e YUQUE_API_TOKEN=your_token_here \
      -e YUQUE_API_BASE_URL=https://www.yuque.com/api/v2 \
      --name yuque-mcp-server yuque-mcp-server
    ```

MCP Tools

The Yuque MCP Server provides the following tools:

User and Document Management

  • get_current_user - Get information about the currently authenticated user, including user ID, username, avatar, and other basic Yuque account details
  • get_user_docs - List all knowledge base documents belonging to the current user, including private and collaborative documents
  • get_user_repos - List the knowledge bases of a specified user; a knowledge base is the collection that organizes documents in Yuque
  • get_repo_docs - List all documents in a specific knowledge base, including titles, update times, and other metadata
  • get_doc - Get the full content of a specific Yuque document, including the body, revision history, and permission information
  • create_doc - Create a new Yuque document in a specified knowledge base, supporting multiple content formats (Markdown, HTML, Lake); see the sketch after this list
  • update_doc - Update an existing Yuque document, changing its title, content, or permission settings
  • delete_doc - Delete a document from a Yuque knowledge base; this operation cannot be undone
  • search - Search documents or knowledge base content on Yuque, with support for scope and author filters
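
As an illustration only, a create_doc call might carry arguments shaped like the sketch below. The field names (repo_id, title, slug, format, body, public) follow Yuque Open API conventions and are assumptions rather than something this README specifies; the authoritative schema is the one the server reports for the tool.

```typescript
// Illustration only: a plausible argument object for the create_doc tool.
// Field names follow Yuque Open API conventions (assumed, not confirmed here);
// check the schema the MCP server reports for the exact fields.
const createDocArgs = {
  repo_id: "your-namespace/your-repo",                   // target knowledge base
  title: "Weekly Report",
  slug: "weekly-report",
  format: "markdown",                                    // "markdown" | "html" | "lake"
  body: "# Weekly Report\n\nGenerated by an AI agent.",
  public: 0,                                             // 0 = private, 1 = public, 2 = org-only
};
```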

Team Statistics and Analytics

  • get_group_statistics - Get aggregate statistics for a team, including member count, document count, reads, and engagement data
  • get_group_member_statistics - Get statistics for team members, including each member's edit count, reads, likes, and more
  • get_group_book_statistics - Get statistics for a team's knowledge bases, including document count, word count, and reads per knowledge base
  • get_group_doc_statistics - Get statistics for a team's documents, including word count, reads, and comment count per document

Integration with AI Models

This MCP server can be used with AI models that support the Model-Context-Protocol, allowing them to interact with Yuque through the defined tools. For example:

  1. Start the MCP server
  2. Connect to the server from a compatible client
  3. The AI model can now use the registered tools to interact with Yuque data (see the sketch below)
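
As a minimal sketch of steps 2 and 3, assuming the official MCP TypeScript SDK (@modelcontextprotocol/sdk) as the client and the default local HTTP server shown earlier in this README:

```typescript
// Minimal client sketch: connect over SSE, list the Yuque tools, call one.
// Assumes @modelcontextprotocol/sdk as the client library; the server URL
// matches the default port used elsewhere in this README.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

async function main() {
  const transport = new SSEClientTransport(new URL("http://localhost:3000/sse"));
  const client = new Client({ name: "yuque-example-client", version: "1.0.0" });

  await client.connect(transport);

  // Discover the tools registered by the Yuque MCP Server.
  const { tools } = await client.listTools();
  console.log(tools.map((tool) => tool.name));

  // get_current_user takes no arguments, so it makes a convenient smoke test.
  const me = await client.callTool({ name: "get_current_user", arguments: {} });
  console.log(JSON.stringify(me, null, 2));

  await client.close();
}

main().catch(console.error);
```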

Query Parameters for the SSE Endpoint

When connecting to the SSE endpoint, you can override the environment configuration with query parameters, which take precedence over environment variables:

  • accessToken: overrides the Yuque API token set in the .env file
  • baseUrl: overrides the Yuque API base URL set in the .env file

Example:

http://localhost:3000/sse?accessToken=your_token_here&baseUrl=https://custom.yuque.api/v2

This lets you configure the service dynamically without modifying the .env file; since query parameters take precedence over environment variables, it is especially useful for multi-user environments or for testing different API endpoints.

Each SSE connection can use its own configuration, so a single server instance can serve different users or environments at the same time.
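
For example, per-connection URLs could be built like this (a small sketch using the standard URL API; the helper name and example tokens are hypothetical):

```typescript
// Sketch: build per-connection SSE URLs so each client brings its own token.
// Uses only the accessToken and baseUrl query parameters described above;
// the helper name and example values are hypothetical.
function sseUrlFor(accessToken: string, baseUrl?: string): URL {
  const url = new URL("http://localhost:3000/sse");
  url.searchParams.set("accessToken", accessToken);
  if (baseUrl) url.searchParams.set("baseUrl", baseUrl);
  return url;
}

// Two clients served by the same server instance, each with its own credentials.
const aliceUrl = sseUrlFor("alice_token_here");
const bobUrl = sseUrlFor("bob_token_here", "https://custom.yuque.api/v2");
```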

Development

Project Structure

src/
  ├── config.ts          # Server configuration
  ├── index.ts           # Main entry point
  ├── cli.ts             # CLI entry point
  ├── server.ts          # MCP server implementation
  └── services/
      └── yuque.ts       # Yuque API service

Adding New Tools

To add a new tool, modify the registerTools method in src/server.ts.
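
As a rough sketch only: if registerTools uses the MCP TypeScript SDK's McpServer.tool() API with zod schemas (an assumption about the internals, which this README does not show), a new registration might look like the following. The tool name, the Yuque service method, and the parameter shape are all hypothetical.

```typescript
// Hypothetical sketch of registering one extra tool inside registerTools.
// Assumes McpServer.tool() from @modelcontextprotocol/sdk and zod schemas;
// the real registerTools in src/server.ts may be structured differently.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

// Hypothetical slice of the Yuque API service in src/services/yuque.ts.
interface YuqueService {
  getRepoToc(repoId: string): Promise<unknown>;
}

function registerTocTool(server: McpServer, yuque: YuqueService) {
  server.tool(
    "get_repo_toc",                                         // hypothetical tool name
    "Get the table of contents of a Yuque knowledge base",  // description shown to the model
    { repo_id: z.string().describe("Knowledge base ID or namespace") },
    async ({ repo_id }) => {
      const toc = await yuque.getRepoToc(repo_id);           // hypothetical service call
      return {
        content: [{ type: "text", text: JSON.stringify(toc, null, 2) }],
      };
    }
  );
}
```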

API Improvements

Recent updates added the following:

  1. Team statistics: added tools for retrieving team, member, knowledge base, and document statistics, making it easier to analyze and monitor how a team's knowledge bases are used.

  2. Document management enhancements:

    • Support for multiple document formats (Markdown, HTML, Lake)
    • Full document visibility settings (private, public, visible within the organization)
    • Search supports more parameters and filter conditions

  3. Data type improvements: interface definitions were updated to stay consistent with the Yuque OpenAPI specification.

License

MIT License

Copyright (c) 2025 Henryhaoson

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

Acknowledgements

Star History

Star History Chart

Repository Details

Language: TypeScript
Default Branch: master
Size: 70 KB
Contributors: 2
MCP Verified: Nov 12, 2025

Programming Languages

TypeScript: 98.35%
Dockerfile: 1.22%
JavaScript: 0.43%

Topics

mcp, yuque, yuque-mcp


Related MCPs

Discover similar Model Context Protocol servers

  • Raindrop.io MCP Server

    Enable LLMs to manage and search Raindrop.io bookmarks via the Model Context Protocol.

    Raindrop.io MCP Server is an integration that allows large language models to interact with Raindrop.io bookmarks using the Model Context Protocol. It provides tools to create and search bookmarks, including filtering by tags, and is designed for interoperability with environments like Claude for Desktop. Installation can be done via Smithery or manually, and configuration is managed through environment variables. The project is open source and optimized for secure, tokenized access to Raindrop.io.

    • 63
    • MCP
    • hiromitsusasaki/raindrop-io-mcp-server
  • Coda MCP Server

    A Model Context Protocol server enabling seamless interaction with Coda documents and pages.

    Coda MCP Server implements a Model Context Protocol (MCP) server that bridges AI assistants or MCP clients with the Coda API. It provides standardized tools for listing, creating, reading, updating, duplicating, and renaming Coda pages, as well as handling document context. The server communicates over standard input/output, supporting integration with platforms like Cursor and Claude Desktop, and can be deployed via Node.js or Docker.

    • 36
    • MCP
    • orellazri/coda-mcp
  • mcp-kibela

    MCP server for secure, AI-assisted access to Kibela notes.

    mcp-kibela is a Model Context Protocol (MCP) server implementation that enables AI assistants to search, retrieve, create, and update content from Kibela, a team knowledge-sharing platform. It provides standardized endpoints for note search, fetching individual or recent notes, and managing note content. Integrating with tools like Claude, Cursor, and VSCode, it allows seamless, secure access to organizational knowledge through MCP-enabled clients. Authentication via environment variables ensures secure connections to Kibela APIs.

    • 12
    • MCP
    • kj455/mcp-kibela
  • Confluence Communication Server MCP Server

    Seamlessly interact with Confluence using standardized MCP tools.

    Confluence Communication Server MCP Server is a TypeScript-based implementation that enables seamless interaction with Confluence through the Model Context Protocol. It provides tools for running CQL queries, retrieving, and updating Confluence page content, thus showcasing and adhering to core MCP concepts. The server is compatible with Claude Desktop and supports installation via Smithery, emphasizing ease of integration and automation for Confluence management tasks.

    • 13
    • MCP
    • KS-GEN-AI/confluence-mcp-server
  • MyMCP Server (All-in-One Model Context Protocol)

    Powerful and extensible Model Context Protocol server with developer and productivity integrations.

    MyMCP Server is a robust Model Context Protocol (MCP) server implementation that integrates with services like GitLab, Jira, Confluence, YouTube, Google Workspace, and more. It provides AI-powered search, contextual tool execution, and workflow automation for development and productivity tasks. The system supports extensive configuration and enables selective activation of grouped toolsets for various environments. Installation and deployment are streamlined, with both automated and manual setup options available.

    • 93
    • MCP
    • nguyenvanduocit/all-in-one-model-context-protocol
  • Kibela MCP Server

    MCP server for seamless LLM integration with Kibela knowledge management.

    Kibela MCP Server enables integration of Large Language Models (LLMs) with the Kibela note-sharing platform via the Model Context Protocol. It provides search, retrieval, and management of Kibela notes, users, groups, and folders, exposing these capabilities in a standardized MCP interface. The implementation utilizes Kibela's GraphQL API and supports configuration through environment variables and Docker. Designed for interoperability with tools like Cursor, it streamlines access and manipulation of organizational knowledge by AI systems.

    • 7
    • MCP
    • kiwamizamurai/mcp-kibela-server