imagen3-mcp

Google Imagen 3.0 image generation service with MCP protocol integration.

Stars: 43
Forks: 6
Watchers: 43
Issues: 4

Imagen3-MCP is an image generation tool that leverages Google's Imagen 3.0 model and exposes its functionality via the Model Context Protocol (MCP). It requires a valid Google Gemini API key and can be deployed as a standalone executable. The tool supports flexible configuration through environment variables and integrates with platforms such as Cherry Studio and Cursor for seamless image generation tasks. Example prompts and outputs are provided to showcase its photorealistic image creation capabilities.

Key Features

Integration with Google's Imagen 3.0 model
MCP (Model Context Protocol) support
Photorealistic image generation from text prompts
Standalone executable installation
Flexible configuration via environment variables
Support for Cherry Studio and Cursor integration
Image resource server customization
Proxy URL support for network environments
API key authentication with Gemini
Multilingual documentation (Chinese, English)

Use Cases

Generating high-quality, photorealistic images from textual descriptions
Integrating advanced image generation workflows within Cherry Studio
Deploying AI-powered image generation services in custom pipelines
Using as an image generation backend for software requiring MCP compatibility
Enabling remote or containerized model access for image resources
Demonstrating image model capabilities for technology showcases
Rapid prototyping of image synthesis applications
Serving image generation for creative content production
Building intelligent assistants for visual content creation
Bypassing network restrictions using proxy configuration for image model API access

README

Imagen3-MCP

English Version

An image generation tool based on Google's Imagen 3.0, providing services through MCP (Model Context Protocol).

Examples

A running Jack Russell Terrier, telephoto lens, sunlight filtering through the dog's fur, photorealistic quality

[Image: Running Jack Russell Terrier]

A high-tech apple

[Image: High-tech apple]

Requirements

A valid Google Gemini API key.

Installation Steps—Cherry Studio

  1. Download the latest executable from GitHub Releases
  2. Place the downloaded executable anywhere on your system, e.g., C:\bin\imagen3-mcp.exe
  3. Configure in Cherry Studio (a combined configuration sketch follows this list):
    • Fill in the Command field with the executable path, e.g., C:\bin\imagen3-mcp.exe
    • Enter your Gemini API key in the GEMINI_API_KEY environment variable
    • [Optional] Enter a proxy URL in the BASE_URL environment variable, e.g., https://lingxi-proxy.hamflx.dev/api/provider/google (this address works around the GFW, but it cannot bypass Google's per-IP restrictions, so a VPN is still required).
    • [Optional] SERVER_LISTEN_ADDR environment variable: the IP address the server listens on (defaults to 127.0.0.1).
    • [Optional] SERVER_PORT environment variable: the port the server listens on and the port used in image URLs (defaults to 9981).
    • [Optional] IMAGE_RESOURCE_SERVER_ADDR environment variable: the server address used in image URLs (defaults to 127.0.0.1). Useful when the server runs in a container or on a remote machine.
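
A minimal sketch of a single mcpServers entry combining the variables above, assuming your MCP client accepts the same command-plus-env JSON layout as the Cursor configuration shown below (Cherry Studio itself takes these values through its MCP settings form). Apart from the executable path and the proxy URL mentioned above, the values are simply the documented defaults; replace <GEMINI_API_KEY> with your own key.

```json
{
  "mcpServers": {
    "imagen3": {
      "command": "C:\\bin\\imagen3-mcp.exe",
      "env": {
        "GEMINI_API_KEY": "<GEMINI_API_KEY>",
        "BASE_URL": "https://lingxi-proxy.hamflx.dev/api/provider/google",
        "SERVER_LISTEN_ADDR": "127.0.0.1",
        "SERVER_PORT": "9981",
        "IMAGE_RESOURCE_SERVER_ADDR": "127.0.0.1"
      }
    }
  }
}
```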

Configuration

Installation Steps—Cursor

```json
{
  "mcpServers": {
    "imagen3": {
      "command": "C:\\bin\\imagen3-mcp.exe",
      "env": {
        "GEMINI_API_KEY": "<GEMINI_API_KEY>"
        // Optional environment variables:
        // "BASE_URL": "<PROXY_URL>",
        // "SERVER_LISTEN_ADDR": "0.0.0.0", // Example: Listen on all interfaces
        // "SERVER_PORT": "9981",
        // "IMAGE_RESOURCE_SERVER_ADDR": "your.domain.com" // Example: Use a domain name for image URLs
      }
    }
  }
}
```

License

MIT


Imagen3-MCP (English)

An image generation tool based on Google's Imagen 3.0, providing services through MCP (Model Context Protocol).

Examples

A running Jack Russell Terrier, telephoto lens, sunlight filtering through the dog's fur, photorealistic quality

[Image: Running Jack Russell Terrier]

A high-tech apple

[Image: High-tech apple]

Requirements

A valid Google Gemini API key.

Installation Steps—Cherry Studio

  1. Download the latest executable from GitHub Releases
  2. Place the downloaded executable anywhere in your system, e.g., C:\bin\imagen3-mcp.exe
  3. Configure in Cherry Studio:
    • Fill in the Command field with the executable path, e.g., C:\bin\imagen3-mcp.exe
    • Enter your Gemini API key in the GEMINI_API_KEY environment variable
    • [Optional] Enter a proxy URL in the BASE_URL environment variable, e.g., https://your-proxy.com.
    • [Optional] Set the SERVER_LISTEN_ADDR environment variable: The IP address the server listens on (defaults to 127.0.0.1).
    • [Optional] Set the SERVER_PORT environment variable: The port the server listens on and uses for image URLs (defaults to 9981).
    • [Optional] Set the IMAGE_RESOURCE_SERVER_ADDR environment variable: The server address used in the image URLs (defaults to 127.0.0.1). Useful if the server runs in a container or on a remote machine (a sketch of this setup follows the Cursor configuration below).

Configuration

Installation Steps—Cursor

```json
{
  "mcpServers": {
    "imagen3": {
      "command": "C:\\bin\\imagen3-mcp.exe",
      "env": {
        "GEMINI_API_KEY": "<GEMINI_API_KEY>"
        // Optional environment variables:
        // "BASE_URL": "<PROXY_URL>",
        // "SERVER_LISTEN_ADDR": "0.0.0.0", // Example: Listen on all interfaces
        // "SERVER_PORT": "9981",
        // "IMAGE_RESOURCE_SERVER_ADDR": "your.domain.com" // Example: Use a domain name for image URLs
      }
    }
  }
}
```
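
For the container or remote-machine scenario mentioned in the Cherry Studio steps, a hedged variant of the entry above might enable the commented-out options so the server listens on all interfaces while image URLs use a publicly reachable host. The domain below is a placeholder taken from the comments, not a real endpoint.

```json
{
  "mcpServers": {
    "imagen3": {
      "command": "C:\\bin\\imagen3-mcp.exe",
      "env": {
        "GEMINI_API_KEY": "<GEMINI_API_KEY>",
        "SERVER_LISTEN_ADDR": "0.0.0.0",
        "SERVER_PORT": "9981",
        "IMAGE_RESOURCE_SERVER_ADDR": "your.domain.com"
      }
    }
  }
}
```

With this configuration, clients reach the image resource server on port 9981, and the image URLs it returns point at your.domain.com rather than 127.0.0.1.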

License

MIT


Repository Owner

hamflx

User

Repository Details

Language: Rust
Default Branch: master
Size: 932 KB
Contributors: 1
MCP Verified: Sep 1, 2025

Programming Languages

Rust: 100%


Related MCPs

Discover similar Model Context Protocol servers

  • openai-gpt-image-mcp

    MCP-compatible server for image generation and editing via OpenAI APIs.

    Provides a Model Context Protocol (MCP) tool server that interfaces with OpenAI's GPT-4o/gpt-image-1 APIs to generate and edit images from text prompts. Supports advanced image editing operations including inpainting, outpainting, and compositing with customizable options. Integrates with MCP-compatible clients such as Claude Desktop, VSCode, Cursor, and Windsurf. Offers both base64 and file output for generated images, with automatic file handling for large images.

    • 70 stars · MCP · SureScaleAI/openai-gpt-image-mcp
  • McGravity

    Unified load balancer and proxy for multiple MCP servers

    McGravity acts as a scalable unified proxy and load balancer for multiple MCP (Model Context Protocol) servers. It allows clients to connect through a single endpoint to access and manage multiple MCP servers efficiently. The tool offers load balancing, configuration via YAML, CLI and Docker support, and plans to evolve with features such as health checks and a web interface. Designed for modern GenAI infrastructure, it simplifies connection, balancing, and scalability of MCP server deployments.

    • 68 stars · MCP · tigranbs/mcgravity
  • mcp-server-templates

    Deploy Model Context Protocol servers instantly with zero configuration.

    MCP Server Templates enables rapid, zero-configuration deployment of production-ready Model Context Protocol (MCP) servers using Docker containers and a comprehensive CLI tool. It provides a library of ready-made templates for common integrations—including filesystems, GitHub, GitLab, and Zendesk—and features intelligent caching, smart tool discovery, and flexible configuration options via JSON, YAML, environment variables, or CLI. Perfect for AI developers, data scientists, and DevOps teams, it streamlines the process of setting up and managing MCP servers and has evolved into the MCP Platform for enhanced capabilities.

    • 5 stars · MCP · Data-Everything/mcp-server-templates
  • mcp-access-point

    Bridge HTTP services with Model Context Protocol (MCP) clients seamlessly.

    MCP Access Point acts as a lightweight gateway that enables direct communication between MCP-compatible clients and traditional HTTP services without requiring server-side modifications. Built on the high-performance Pingora proxy library, it supports protocol conversion between HTTP, SSE, and MCP, along with multi-tenancy and customizable routing. It empowers various MCP clients, such as Cursor Desktop, MCP Inspector, and VS Code, to interact with existing APIs efficiently. Configuration is flexible via YAML, and deployment is possible both locally and through Docker.

    • 118 stars · MCP · sxhxliang/mcp-access-point
  • k8s-mcp-server

    Securely enable Claude to run Kubernetes CLI tools via Anthropic's Model Context Protocol.

    K8s MCP Server provides a Docker-based implementation of Anthropic's Model Context Protocol (MCP), allowing Claude to securely execute Kubernetes CLI tools such as kubectl, helm, istioctl, and argocd within a containerized environment. It integrates with Claude Desktop so users can interact with their Kubernetes clusters using natural language. The server emphasizes security by operating as a non-root user and offering strict command validation, while also supporting major cloud providers like AWS, Google Cloud, and Azure. Easy configuration and support for various Unix tools further enhance its capabilities.

    • 166 stars · MCP · alexei-led/k8s-mcp-server
  • manim-mcp-server

    MCP server for generating Manim animations on demand.

    Manim MCP Server allows users to execute Manim Python scripts via a standardized protocol, generating animation videos that are returned as output. It integrates with systems like Claude to dynamically render animation content from user scripts and supports configurable deployment using environment variables. The server handles management of output files and cleanup of temporary resources, designed with portability and ease of integration in mind.

    • 454 stars · MCP · abhiemj/manim-mcp-server