Qiniu MCP Server

Bridge AI models to Qiniu cloud storage and multimedia services via the Model Context Protocol.

30 Stars · 13 Forks · 30 Watchers · 2 Issues
Qiniu MCP Server implements the Model Context Protocol (MCP) to let AI model clients seamlessly access Qiniu cloud storage, multimedia, and CDN services within the context of user interactions. It provides bucket and file listing, file uploading, file reading, image processing (resizing, rounded corners), and CDN cache operations. The server is designed to integrate with AI tooling such as VS Code's Cline plugin, supporting contextual data access and manipulation via a standardized protocol.

Key Features

List and manage Qiniu storage buckets
List and upload files to Qiniu buckets
Read content from files stored in cloud buckets
Generate download links for stored files
Process images (resize, rounded corners) via cloud multimedia services
CDN cache refresh and prefetch for file URLs
Environment configuration via .env and environment variables
Integration with AI model clients (e.g., VS Code Cline plugin)
Supports batch configuration for multiple buckets
Secure authentication using access and secret keys

Use Cases

Allowing generative AI tools to retrieve and manage cloud-stored data in context
Showcasing and previewing images and files directly in AI-enhanced IDE environments
Automating cloud storage uploads from AI model outputs
Replacing local storage access with secure cloud-backed workflows in AI pipelines
Dynamically processing and transforming images as part of AI-driven media projects
Triggering CDN refreshes and prefetches to reduce latency in AI-powered web applications
Supporting collaborative AI projects with shared, centralized cloud file access
Integrating intelligent multimedia file management into model-driven assistants
Strengthening security and compliance by leveraging cloud-based authentication
Extending IDE experiences with contextual cloud file browsing and modification

README

Qiniu MCP Server

Overview

A Model Context Protocol (MCP) server built on Qiniu Cloud products. It lets users access Qiniu Cloud Storage, intelligent multimedia services, and more through this MCP server from within the context of AI large-model clients.

For details on accessing Qiniu Cloud Storage, see the guide Accessing Qiniu Cloud Storage with Large Models via MCP (基于 MCP 使用大模型访问七牛云存储).

Capabilities:

  • Storage
    • List buckets
    • List the files in a bucket
    • Upload local files, or upload content supplied directly
    • Read file contents
    • Get file download links
  • Intelligent multimedia
    • Image scaling
    • Image rounded corners
  • CDN
    • Refresh files by URL
    • Prefetch files by URL
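
The image operations above are typically expressed as Qiniu fop-style URL query parameters. As a rough illustration, assuming the publicly documented imageMogr2 and roundPic syntax (treat the exact parameter spelling as an assumption), a processed-image URL could be built like this:

```python
# Illustrative sketch: building Qiniu image-processing (fop) URLs.
# imageMogr2/thumbnail and roundPic/radius follow Qiniu's public
# image-processing docs; the exact syntax here is an assumption.

def thumbnail_url(file_url: str, width: int) -> str:
    """Scale the image to the given width, keeping the aspect ratio."""
    return f"{file_url}?imageMogr2/thumbnail/{width}x"

def rounded_corner_url(file_url: str, radius_px: int) -> str:
    """Apply rounded corners with the given radius in pixels."""
    return f"{file_url}?roundPic/radius/{radius_px}"

print(thumbnail_url("https://example.com/cat.png", 200))
# https://example.com/cat.png?imageMogr2/thumbnail/200x
```

The MCP server performs these transformations server-side; the sketch only shows the URL shape such requests take.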

Requirements

  • Python 3.12 or later
  • The uv package manager

If you do not have uv installed yet, install it with:

```bash
# macOS: installing via brew is recommended
brew install uv


# Linux & macOS
# 1. Install
curl -LsSf https://astral.sh/uv/install.sh | sh
# 2. After installation, make sure the package install path (the directory
#    containing the uv and uvx executables) is added to your PATH.
# Suppose the install path is /Users/xxx/.local/bin (see the installer output)
### Temporary (current session only): run this in the current terminal
export PATH="/Users/xxx/.local/bin:$PATH"
### Permanent (recommended): run these in the current terminal
echo 'export PATH="/Users/xxx/.local/bin:$PATH"' >> ~/.bash_profile
source ~/.bash_profile


# Windows
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
```

For other installation methods, see the uv installation docs.

Using with Cline:

Steps:

  1. Install the Cline extension in VS Code (after installation, a Cline icon appears in the sidebar)
  2. Configure your large model
  3. Configure the qiniu MCP server
    1. Click the Cline icon to open the extension and select the MCP Server module
    2. Select Installed, then click Advanced MCP Settings and configure the MCP server using the settings below
```json
{
  "mcpServers": {
    "qiniu": {
      "command": "uvx",
      "args": [
        "qiniu-mcp-server"
      ],
      "env": {
        "QINIU_ACCESS_KEY": "YOUR_ACCESS_KEY",
        "QINIU_SECRET_KEY": "YOUR_SECRET_KEY",
        "QINIU_REGION_NAME": "YOUR_REGION_NAME",
        "QINIU_ENDPOINT_URL": "YOUR_ENDPOINT_URL",
        "QINIU_BUCKETS": "YOUR_BUCKET_A,YOUR_BUCKET_B"
      },
      "disabled": false
    }
  }
}
```
    
    3. Toggle the qiniu MCP server's connection switch to connect
  4. Open a chat window in Cline; you can now interact with the AI to use qiniu-mcp-server. A few example prompts:
    • List qiniu resource information
    • List all Buckets in qiniu
    • List the files in the xxx Bucket in qiniu
    • Read the contents of file yyy in qiniu's xxx Bucket
    • Apply a 200-pixel-wide rounded corner to image yyy in qiniu's xxx Bucket
    • Refresh this qiniu CDN URL: https://developer.qiniu.com/test.txt

Note: the configuration above can be used as-is when creating an MCP server in Cursor. When used in Claude you may hit the error Error: spawn uvx ENOENT; the fix is to set the command field to the absolute path of uvx, e.g. /usr/local/bin/uvx
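
One way to find that absolute path programmatically is with the standard library (a stdlib sketch, not part of qiniu-mcp-server):

```python
import shutil

# Locate the absolute path of an executable on PATH, as needed for the
# "command" field when Claude cannot spawn "uvx" by name alone.
path = shutil.which("uvx")
if path is None:
    print("uvx not found on PATH - check your uv installation")
else:
    print(f'Use "{path}" as the command value')
```

Running `which uvx` in a terminal gives the same answer on Linux/macOS.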

Development

  1. Clone the repository:

```bash
# Clone the project and enter its directory
git clone git@github.com:qiniu/qiniu-mcp-server.git
cd qiniu-mcp-server
```

  2. Create and activate a virtual environment:

```bash
uv venv
source .venv/bin/activate  # Linux/macOS
# or
.venv\Scripts\activate  # Windows
```

  3. Install the dependencies:

```bash
uv pip install -e .
```

  4. Configure

Copy the environment variable template:

```bash
cp .env.example .env
```

Edit the .env file and set the following parameters:

```bash
# S3/Kodo credentials
QINIU_ACCESS_KEY=your_access_key
QINIU_SECRET_KEY=your_secret_key

# Region information
QINIU_REGION_NAME=your_region
QINIU_ENDPOINT_URL=endpoint_url # e.g. https://s3.your_region.qiniucs.com

# Buckets: separate multiple buckets with commas; configuring at most 20 buckets is recommended
QINIU_BUCKETS=bucket1,bucket2,bucket3
```
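
A minimal sketch of how a server might read and validate these variables at startup (illustrative only; the function and key names are assumptions, not qiniu-mcp-server's actual code):

```python
import os

def load_qiniu_config(environ=os.environ):
    """Read the QINIU_* variables and split the bucket list (illustrative)."""
    required = ("QINIU_ACCESS_KEY", "QINIU_SECRET_KEY",
                "QINIU_REGION_NAME", "QINIU_ENDPOINT_URL")
    missing = [k for k in required if not environ.get(k)]
    if missing:
        raise RuntimeError(f"missing env vars: {', '.join(missing)}")
    # Comma-separated bucket list; whitespace around names is tolerated here.
    buckets = [b.strip() for b in environ.get("QINIU_BUCKETS", "").split(",")
               if b.strip()]
    if len(buckets) > 20:
        raise RuntimeError("at most 20 buckets are recommended")
    return {"region": environ["QINIU_REGION_NAME"],
            "endpoint": environ["QINIU_ENDPOINT_URL"],
            "buckets": buckets}
```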

To extend functionality, first add a new business package directory under core (e.g. storage for the storage service) and implement the extension inside it. In that package's __init__.py, define a load function that registers the business tools or resources; finally, call this load function from core's __init__.py to complete the registration.

```shell
core
├── __init__.py  # loads each business tool or resource
└── storage      # storage business package
    ├── __init__.py  # loads storage tools and resources
    ├── resource.py  # storage resource extensions
    ├── storage.py   # storage utility classes
    └── tools.py     # storage tool extensions
```
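
The load-function pattern described above can be sketched with a plain registry (hypothetical names throughout; the real registration API is defined by the project's core package):

```python
# Hypothetical sketch of the load-function registration pattern:
# each business package exposes load(register), and core/__init__.py
# calls every package's load function to collect its tools.

TOOLS = {}

def register_tool(name, fn):
    """Add a named tool to the global registry."""
    TOOLS[name] = fn

# core/storage/__init__.py would define something like:
def load(register):
    register("list_buckets", lambda: ["bucket1", "bucket2"])
    register("read_file", lambda bucket, key: f"contents of {bucket}/{key}")

# core/__init__.py then wires each package up:
load(register_tool)
print(sorted(TOOLS))  # ['list_buckets', 'read_file']
```

Keeping registration in each package's load function means core/__init__.py only needs one extra call per new business package.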

Testing

Testing with the Model Context Protocol Inspector

Using the Model Context Protocol Inspector for testing is strongly recommended.

```shell
# node version: v22.4.0
npx @modelcontextprotocol/inspector uv --directory . run qiniu-mcp-server
```

Starting the MCP Server locally: examples

  1. Start in standard input/output (stdio) mode (the default):

```bash
uv --directory . run qiniu-mcp-server
```

  2. Start in SSE mode (for web applications):

```bash
uv --directory . run qiniu-mcp-server --transport sse --port 8000
```
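
The --transport and --port flags above can be modeled with a small argparse sketch (illustrative only; the real CLI parsing and defaults live inside qiniu-mcp-server):

```python
import argparse

# Illustrative model of the server's CLI flags; the defaults shown
# here are assumptions, not the project's documented behavior.
parser = argparse.ArgumentParser(prog="qiniu-mcp-server")
parser.add_argument("--transport", choices=["stdio", "sse"], default="stdio")
parser.add_argument("--port", type=int, default=8000)

args = parser.parse_args(["--transport", "sse", "--port", "8000"])
print(args.transport, args.port)  # sse 8000
```

stdio suits local clients such as Cline that spawn the server as a subprocess, while SSE exposes an HTTP endpoint for web applications.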


Repository Owner

qiniu (Organization)

Repository Details

Language Python
Default Branch main
Size 156 KB
Contributors 3
License MIT License
MCP Verified Nov 11, 2025

Programming Languages

Python
98.67%
Dockerfile
1.33%

