文颜 MCP Server (Wenyan MCP Server)


AI-powered tool for publishing and formatting Markdown articles to WeChat official accounts via MCP.

907 Stars · 101 Forks · 907 Watchers · 27 Issues
文颜 MCP Server is a server component based on the Model Context Protocol (MCP) that automates publishing and formatting Markdown articles to WeChat Official Account drafts. It integrates with AI tools and uses the theming system shared with other Wenyan products to beautify content. The tool is designed for multi-platform use, supports local, CI/CD, and Docker-based AI toolchains, and offers seamless automation for article management and publication.

Key Features

Publishes Markdown articles to WeChat Official Account drafts
Supports integration via Model Context Protocol (MCP)
Automatic layout and formatting with customizable themes
Provides a choice of multiple open-source Typora themes
Handles local and remote image uploads automatically
Easy integration with MCP clients through configuration files
Offers deployment options via npm, direct compilation, or Docker
Supports multi-platform use including CI/CD environments
Enables AI-assisted management of article publishing
Extensible with environment variable-based authentication

Use Cases

Automating the publication of blog articles to WeChat Official Accounts
AI-assisted formatting and layout of Markdown content
Integrating content publishing into custom AI-driven workflows
Managing and previewing article themes before publication
Enabling CI/CD-powered content delivery to social platforms
Batch publishing of content using scripts or automation tools
Deploying a content server in a Dockerized production environment
Automating image asset uploads for online articles
Facilitating team-based article management and publication
Enhancing typographic quality of published content with advanced theming

README

文颜 MCP Server

Wenyan (文颜) is a multi-platform typesetting tool that lets you publish Markdown with one click to major writing platforms such as WeChat Official Accounts, Zhihu, and Toutiao.

Wenyan is available in several editions.

文颜 MCP Server is a server component based on the Model Context Protocol (MCP) that publishes Markdown-formatted articles to the WeChat Official Account draft box and formats them with the same theme system as Wenyan.

https://github.com/user-attachments/assets/2c355f76-f313-48a7-9c31-f0f69e5ec207


Features

  • List and select the supported article themes
  • Format Markdown content with a built-in theme
  • Publish articles to the WeChat Official Account draft box
  • Automatically upload local and web images
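An MCP client drives these features through JSON-RPC tool calls over stdio. The sketch below builds such a request; the tool name `publish_article` and its argument shape are hypothetical placeholders, so check the server's `tools/list` response for the actual names.

```javascript
// Sketch of the JSON-RPC 2.0 request an MCP client sends over stdio to
// invoke a publishing tool. Tool name and arguments are hypothetical;
// query tools/list on the running server for the real ones.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "publish_article", // hypothetical tool name
    arguments: {
      content: "---\ntitle: Hello\ncover: https://example.com/cover.jpg\n---\n\nBody text.",
      theme: "default", // hypothetical argument
    },
  },
};

// The MCP stdio transport frames each message as one line of JSON.
process.stdout.write(JSON.stringify(request) + "\n");
```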

Theme showcase

👉 Built-in theme preview

Wenyan uses several open-source Typora themes. Many thanks to their authors.

Usage

Option 1: Install locally (recommended)

bash
npm install -g @wenyan-md/mcp

Integrate with an MCP client

Add the following to your MCP configuration file:

json
{
  "mcpServers": {
    "wenyan-mcp": {
      "name": "公众号助手",
      "command": "wenyan-mcp",
      "env": {
        "WECHAT_APP_ID": "your_app_id",
        "WECHAT_APP_SECRET": "your_app_secret"
      }
    }
  }
}

Notes:

  • WECHAT_APP_ID: the App ID of your WeChat Official Account platform
  • WECHAT_APP_SECRET: the App Secret of the WeChat platform

Option 2: Build and run from source

Build

Make sure Node.js is installed:

bash
git clone https://github.com/caol64/wenyan-mcp.git
cd wenyan-mcp

npm install
npx tsc -b

Integrate with an MCP client

Add the following to your MCP configuration file:

json
{
  "mcpServers": {
    "wenyan-mcp": {
      "name": "公众号助手",
      "command": "node",
      "args": [
        "Your/path/to/wenyan-mcp/dist/index.js"
      ],
      "env": {
        "WECHAT_APP_ID": "your_app_id",
        "WECHAT_APP_SECRET": "your_app_secret"
      }
    }
  }
}

Notes:

  • WECHAT_APP_ID: the App ID of your WeChat Official Account platform
  • WECHAT_APP_SECRET: the App Secret of the WeChat platform

Option 3: Run with Docker (recommended)

Suitable for deployment to a server environment or integration with a local AI toolchain.

You can pull the prebuilt Docker image directly:

bash
docker pull caol64/wenyan-mcp

Or build the image yourself:

bash
docker build -t wenyan-mcp .
# Users in mainland China can specify an npm registry mirror.
docker build --build-arg NPM_REGISTRY=https://mirrors.cloud.tencent.com/npm/ -t wenyan-mcp .

Integrate with an MCP client

Add the following to your MCP configuration file:

json
{
  "mcpServers": {
    "wenyan-mcp": {
      "name": "公众号助手",
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "-v", "/your/host/image/path:/mnt/host-downloads",
        "-e", "WECHAT_APP_ID=your_app_id",
        "-e", "WECHAT_APP_SECRET=your_app_secret",
        "-e", "HOST_IMAGE_PATH=/your/host/image/path",
        "wenyan-mcp"
      ]
    }
  }
}

Notes:

  • -v mounts a host directory so the container can access local images. Keep it consistent with the HOST_IMAGE_PATH environment variable. Place all local images referenced by your Markdown articles in this directory; Docker maps them into the container automatically. The container cannot read images outside this directory.
  • -e injects environment variables into the Docker container:
    • WECHAT_APP_ID: the App ID of your WeChat Official Account platform
    • WECHAT_APP_SECRET: the App Secret of the WeChat platform
    • HOST_IMAGE_PATH: the host image directory

WeChat Official Account IP allowlist

Be sure to add your server's IP address to the Official Account platform's IP allowlist so that upload API calls succeed. For detailed configuration instructions see: https://yuzhi.tech/docs/wenyan/upload

Configuration (frontmatter)

To upload an article correctly, add a frontmatter block at the top of each Markdown article providing the `title` and `cover` fields:

md
---
title: 在本地跑一个大语言模型(2) - 给模型提供外部知识库
cover: /Users/lei/Downloads/result_image.jpg
---

  • title is the article title; required.

  • cover is the article cover image; local paths and web URLs are both supported:

    • If the body contains at least one image, cover may be omitted and one of those images is used as the cover;
    • If the body contains no images, cover is required.
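The rules above can be sketched as a small validator. This is a simplified illustration under our own assumptions (plain `key: value` frontmatter lines, standard Markdown image syntax); the server's actual parsing may differ.

```javascript
// Minimal sketch of the frontmatter rules: title is required, and cover
// may be omitted only when the body contains at least one image.
function checkArticle(markdown) {
  const m = markdown.match(/^---\n([\s\S]*?)\n---\n?([\s\S]*)$/);
  if (!m) throw new Error("Missing frontmatter block");

  // Parse simple "key: value" lines (a simplification of YAML).
  const meta = {};
  for (const line of m[1].split("\n")) {
    const i = line.indexOf(":");
    if (i > 0) meta[line.slice(0, i).trim()] = line.slice(i + 1).trim();
  }

  if (!meta.title) throw new Error("title is required");
  const hasImage = /!\[[^\]]*\]\([^)]+\)/.test(m[2]);
  if (!meta.cover && !hasImage) {
    throw new Error("cover is required when the body has no images");
  }
  return meta;
}

const meta = checkArticle("---\ntitle: Demo\ncover: /tmp/cover.jpg\n---\n\nBody.");
console.log(meta.title); // -> Demo
```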

About automatic image uploads

  • Supported image paths:

    • Local paths (e.g. /Users/lei/Downloads/result_image.jpg)
    • Web URLs (e.g. https://example.com/image.jpg)
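As an illustration of the distinction, the sketch below collects image references from a Markdown body and classifies them into the two kinds listed above. The function name and the exact regex are our assumptions, not the server's code.

```javascript
// Sketch: extract Markdown image references and classify each source as
// a web URL (http/https) or a local path.
function classifyImages(markdown) {
  const images = { local: [], remote: [] };
  for (const [, src] of markdown.matchAll(/!\[[^\]]*\]\(([^)\s]+)\)/g)) {
    (/^https?:\/\//.test(src) ? images.remote : images.local).push(src);
  }
  return images;
}

const body = "![](/Users/lei/Downloads/result_image.jpg)\n![](https://example.com/image.jpg)";
console.log(classifyImages(body).remote.length); // -> 1
```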

Sample article format

md
---
title: 在本地跑一个大语言模型(2) - 给模型提供外部知识库
cover: /Users/lei/Downloads/result_image.jpg
---

In the [previous article](https://babyno.top/posts/2024/02/running-a-large-language-model-locally/), we showed how to run a large language model locally. This article explains how to let the model retrieve custom data from an external knowledge base, improving answer accuracy and making it appear "smarter".

## Preparing the model

Visit the `Ollama` models page and search for `qwen`; we use the "[通义千问](https://ollama.com/library/qwen:7b)" (Qwen) model, which supports Chinese semantics, for this experiment.

![](https://mmbiz.qpic.cn/mmbiz_jpg/Jsq9IicjScDVUjkPc6O22ZMvmaZUzof5bLDjMyLg2HeAXd0icTvlqtL7oiarSlOicTtiaiacIxpVOV1EeMKl96PhRPPw/640?wx_fmt=jpeg)

How to debug

Use the Inspector for quick debugging:

npx @modelcontextprotocol/inspector

On a successful start you will see a message like:

🔗 Open inspector with token pre-filled:
   http://localhost:6274/?MCP_PROXY_AUTH_TOKEN=761c05058aa4f84ad02280e62d7a7e52ec0430d00c4c7a61492cca59f9eac299
   (Auto-open is disabled when authentication is enabled)

Open the link above to access the debugging page.


  1. Enter the correct startup command
  2. Add the environment variables
  3. Click Connect
  4. Select Tools -> List Tools
  5. Choose the tool to debug
  6. Fill in the parameters and click Run Tool
  7. Inspect the complete parameters

Sponsor

If you find this project useful, you can buy my cat some canned food. Feed the cat ❤️

License

Apache License Version 2.0

Star History

Star History Chart

Repository Owner

caol64 (User)

Repository Details

Language JavaScript
Default Branch main
Size 126 KB
Contributors 2
License Apache License 2.0
MCP Verified Nov 12, 2025

Programming Languages

JavaScript
86.02%
Dockerfile
8.4%
Swift
5.57%

Topics

mcp-server wenyan


Related MCPs

Discover similar Model Context Protocol servers

  • @growi/mcp-server: Bridge GROWI wiki content to AI models with context-aware access and management.

    @growi/mcp-server acts as a Model Context Protocol (MCP) server that enables AI models to access, search, and manage GROWI wiki content within an organization. It facilitates seamless connection between multiple GROWI instances and language models, enhancing information retrieval and knowledge management capabilities. The platform provides comprehensive tools for page, tag, comment, and revision management as well as share link and user activity tracking. Its flexible configuration allows simultaneous operation with several GROWI apps for scalable deployment.

    10 · MCP · growilabs/growi-mcp-server

  • Yuque-MCP-Server: Seamless integration of Yuque knowledge base with Model-Context-Protocol for AI model context management.

    Yuque-MCP-Server provides an MCP-compatible server for interacting with the Yuque knowledge base platform. It enables AI models to retrieve, manage, and analyze Yuque documents and user information through a standardized Model-Context-Protocol interface. The server supports operations such as document creation, reading, updating, deletion, advanced search, and team statistics retrieval, making it ideal for AI-powered workflows. Inspired by Figma-Context-MCP, it facilitates contextual awareness and dynamic knowledge management for AI applications.

    31 · MCP · HenryHaoson/Yuque-MCP-Server

  • MCP-wolfram-alpha: An MCP server for querying the Wolfram Alpha API.

    MCP-wolfram-alpha provides an implementation of the Model Context Protocol, enabling integration with the Wolfram Alpha API. It exposes prompts and tools to facilitate AI systems in answering natural language queries by leveraging Wolfram Alpha's computational knowledge engine. The server requires an API key and offers configuration examples for seamless setup and development.

    64 · MCP · SecretiveShell/MCP-wolfram-alpha

  • MyMCP Server (All-in-One Model Context Protocol): Powerful and extensible Model Context Protocol server with developer and productivity integrations.

    MyMCP Server is a robust Model Context Protocol (MCP) server implementation that integrates with services like GitLab, Jira, Confluence, YouTube, Google Workspace, and more. It provides AI-powered search, contextual tool execution, and workflow automation for development and productivity tasks. The system supports extensive configuration and enables selective activation of grouped toolsets for various environments. Installation and deployment are streamlined, with both automated and manual setup options available.

    93 · MCP · nguyenvanduocit/all-in-one-model-context-protocol

  • Typst MCP Server: Facilitates AI-driven Typst interactions with LaTeX conversion, validation, and image generation tools.

    Typst MCP Server implements the Model Context Protocol, enabling AI models to interface seamlessly with Typst, a markup-based typesetting system. It provides tools for tasks such as converting LaTeX to Typst, validating Typst syntax, listing and retrieving Typst documentation chapters, and rendering Typst code as images. The server is compatible with MCP agent clients, such as Claude Desktop and VS Code's agent mode. All functionalities are exposed as tools for ease of LLM integration.

    79 · MCP · johannesbrandenburger/typst-mcp

  • Notion MCP Server: Enable LLMs to interact with Notion using the Model Context Protocol.

    Notion MCP Server allows large language models to interface with Notion workspaces through a Model Context Protocol server, supporting both data retrieval and editing capabilities. It includes experimental Markdown conversion to optimize token usage for more efficient communication with LLMs. The server can be configured with environment variables and controlled for specific tool access. Integration with applications like Claude Desktop is supported for seamless automation.

    834 · MCP · suekou/mcp-notion-server