Outsource MCP

Unified MCP server for multi-provider AI text and image generation

26 Stars · 5 Forks · 26 Watchers · 4 Issues
Outsource MCP is a Model Context Protocol server that bridges AI applications with multiple model providers via a single unified interface. It enables AI tools and clients to access over 20 major providers for both text and image generation, streamlining model selection and API integration. Built on FastMCP and Agno agent frameworks, it supports flexible configuration and is compatible with MCP-enabled AI tools. Authentication is provider-specific, and all interactions use a simple standardized API format.

Key Features

Supports over 20 AI providers including OpenAI, Anthropic, Google, and more
Unified interface for text and image generation
Based on the Model Context Protocol standard
Simple API with just 'provider', 'model', and 'prompt' parameters
Provider-specific authentication through environment variables
Compatible with any MCP-enabled AI client
Image generation with DALL-E 2 and DALL-E 3
Text generation using a variety of cutting-edge AI models
Integration with FastMCP and Agno agent frameworks
Flexible configuration and multi-provider management

Use Cases

Integrate multiple AI model providers into a single AI tool or platform
Automate text content generation across a variety of language models
Outsource image generation tasks to AI providers such as DALL-E
Build custom AI workflows spanning different model vendors
Enable MCP-compatible clients to access expanded model choices
Provide flexible authentication for various provider keys as needed
Power creative tools with both text and image AI generation capabilities
Streamline onboarding of new AI providers in enterprise contexts
Standardize AI model interface for client applications
Facilitate developer experimentation with different AI APIs

README

Outsource MCP

An MCP (Model Context Protocol) server that enables AI applications to outsource tasks to various model providers through a unified interface.

Compatible with any AI tool that supports the Model Context Protocol, including Claude Desktop, Cline, and other MCP-enabled applications. Built with FastMCP for the MCP server implementation and Agno for AI agent capabilities.

Features

  • 🤖 Multi-Provider Support: Access 20+ AI providers through a single interface
  • 📝 Text Generation: Generate text using models from OpenAI, Anthropic, Google, and more
  • 🎨 Image Generation: Create images using DALL-E 3 and DALL-E 2
  • 🔧 Simple API: Consistent interface with just three parameters: provider, model, and prompt
  • 🔑 Flexible Authentication: Only configure API keys for the providers you use

Configuration

Add the following configuration to your MCP client. Consult your MCP client's documentation for specific configuration details.

```json
{
  "mcpServers": {
    "outsource-mcp": {
      "command": "uvx",
      "args": ["--from", "git+https://github.com/gwbischof/outsource-mcp.git", "outsource-mcp"],
      "env": {
        "OPENAI_API_KEY": "your-openai-key",
        "ANTHROPIC_API_KEY": "your-anthropic-key",
        "GOOGLE_API_KEY": "your-google-key",
        "GROQ_API_KEY": "your-groq-key",
        "DEEPSEEK_API_KEY": "your-deepseek-key",
        "XAI_API_KEY": "your-xai-key",
        "PERPLEXITY_API_KEY": "your-perplexity-key",
        "COHERE_API_KEY": "your-cohere-key",
        "FIREWORKS_API_KEY": "your-fireworks-key",
        "HUGGINGFACE_API_KEY": "your-huggingface-key",
        "MISTRAL_API_KEY": "your-mistral-key",
        "NVIDIA_API_KEY": "your-nvidia-key",
        "OLLAMA_HOST": "http://localhost:11434",
        "OPENROUTER_API_KEY": "your-openrouter-key",
        "TOGETHER_API_KEY": "your-together-key",
        "CEREBRAS_API_KEY": "your-cerebras-key",
        "DEEPINFRA_API_KEY": "your-deepinfra-key",
        "SAMBANOVA_API_KEY": "your-sambanova-key"
      }
    }
  }
}
```

Note: The environment variables are optional. Only include the API keys for the providers you want to use.

Quick Start

Once installed and configured, you can use the tools in your MCP client:

  1. Generate text: Use the outsource_text tool with provider "openai", model "gpt-4o-mini", and prompt "Write a haiku about coding"
  2. Generate images: Use the outsource_image tool with provider "openai", model "dall-e-3", and prompt "A futuristic city skyline at sunset"
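In wire terms, each of these tool invocations is an MCP `tools/call` JSON-RPC request that the client sends on your behalf. A minimal sketch of that payload, assuming standard JSON-RPC 2.0 framing per the MCP specification (the `make_tool_call` helper is illustrative, not part of this repository):

```python
import json

def make_tool_call(tool: str, arguments: dict, request_id: int = 1) -> str:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# The haiku example from the Quick Start, as the client would frame it:
request = make_tool_call("outsource_text", {
    "provider": "openai",
    "model": "gpt-4o-mini",
    "prompt": "Write a haiku about coding",
})
print(request)
```

In practice your MCP client builds and transports this message for you; the sketch only shows why the three-parameter API (provider, model, prompt) maps cleanly onto a single tool call.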

Tools

outsource_text

Creates an Agno agent with a specified provider and model to generate text responses.

Arguments:

  • provider: The provider name (e.g., "openai", "anthropic", "google", "groq", etc.)
  • model: The model name (e.g., "gpt-4o", "claude-3-5-sonnet-20241022", "gemini-2.0-flash-exp")
  • prompt: The text prompt to send to the model

outsource_image

Generates images using AI models.

Arguments:

  • provider: The provider name (currently only "openai" is supported)
  • model: The model name ("dall-e-3" or "dall-e-2")
  • prompt: The image generation prompt

Returns the URL of the generated image.

Note: Image generation is currently only supported by OpenAI models (DALL-E 2 and DALL-E 3). Other providers only support text generation.
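That restriction is easy to mirror client-side before spending a request. A hypothetical validation guard reflecting the note above (the function and set names are illustrative, not from the repository):

```python
# Per the README: only OpenAI serves image generation today,
# and only these two models are accepted.
IMAGE_PROVIDERS = {"openai"}
IMAGE_MODELS = {"dall-e-2", "dall-e-3"}

def validate_image_request(provider: str, model: str) -> None:
    """Raise ValueError for image requests the server cannot serve."""
    if provider.lower() not in IMAGE_PROVIDERS:
        raise ValueError(f"provider {provider!r} does not support image generation")
    if model.lower() not in IMAGE_MODELS:
        raise ValueError(f"unsupported image model {model!r}")

validate_image_request("openai", "dall-e-3")  # passes silently
```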

Supported Providers

The following providers are supported. Use the provider name (in parentheses) as the provider argument:

Core Providers

  • OpenAI (openai) - GPT-4, GPT-3.5, DALL-E, etc.
  • Anthropic (anthropic) - Claude 3.5, Claude 3, etc.
  • Google (google) - Gemini Pro, Gemini Flash, etc.
  • Groq (groq) - Llama 3, Mixtral, etc.
  • DeepSeek (deepseek) - DeepSeek Chat & Coder
  • xAI (xai) - Grok models
  • Perplexity (perplexity) - Sonar models

Additional Providers

  • Cohere (cohere) - Command models
  • Mistral AI (mistral) - Mistral Large, Medium, Small
  • NVIDIA (nvidia) - Various models
  • HuggingFace (huggingface) - Open source models
  • Ollama (ollama) - Local models
  • Fireworks AI (fireworks) - Fast inference
  • OpenRouter (openrouter) - Multi-provider access
  • Together AI (together) - Open source models
  • Cerebras (cerebras) - Fast inference
  • DeepInfra (deepinfra) - Optimized models
  • SambaNova (sambanova) - Enterprise models

Enterprise Providers

  • AWS Bedrock (aws or bedrock) - AWS-hosted models
  • Azure AI (azure) - Azure-hosted models
  • IBM WatsonX (ibm or watsonx) - IBM models
  • LiteLLM (litellm) - Universal interface
  • Vercel v0 (vercel or v0) - Vercel AI
  • Meta Llama (meta) - Direct Meta access

Environment Variables

Each provider requires its corresponding API key:

| Provider | Environment Variable | Example |
|----------|----------------------|---------|
| OpenAI | OPENAI_API_KEY | sk-... |
| Anthropic | ANTHROPIC_API_KEY | sk-ant-... |
| Google | GOOGLE_API_KEY | AIza... |
| Groq | GROQ_API_KEY | gsk_... |
| DeepSeek | DEEPSEEK_API_KEY | sk-... |
| xAI | XAI_API_KEY | xai-... |
| Perplexity | PERPLEXITY_API_KEY | pplx-... |
| Cohere | COHERE_API_KEY | ... |
| Fireworks | FIREWORKS_API_KEY | ... |
| HuggingFace | HUGGINGFACE_API_KEY | hf_... |
| Mistral | MISTRAL_API_KEY | ... |
| NVIDIA | NVIDIA_API_KEY | nvapi-... |
| Ollama | OLLAMA_HOST | http://localhost:11434 |
| OpenRouter | OPENROUTER_API_KEY | ... |
| Together | TOGETHER_API_KEY | ... |
| Cerebras | CEREBRAS_API_KEY | ... |
| DeepInfra | DEEPINFRA_API_KEY | ... |
| SambaNova | SAMBANOVA_API_KEY | ... |
| AWS Bedrock | AWS credentials | Via AWS CLI/SDK |
| Azure AI | Azure credentials | Via Azure CLI/SDK |
| IBM WatsonX | IBM_WATSONX_API_KEY | ... |
| Meta Llama | LLAMA_API_KEY | ... |

Note: Only configure the API keys for providers you plan to use.
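One way to act on this table is to probe the environment before choosing a provider. A small sketch, with the provider-to-variable mapping abbreviated to a subset of the rows above and a hypothetical helper name:

```python
import os

# Provider → environment variable, per the table above (subset shown).
ENV_VARS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "google": "GOOGLE_API_KEY",
    "groq": "GROQ_API_KEY",
    "ollama": "OLLAMA_HOST",  # a host URL rather than an API key
}

def configured_providers(environ=os.environ):
    """Return, sorted, the providers whose key or host is set."""
    return sorted(p for p, var in ENV_VARS.items() if environ.get(var))

print(configured_providers({"OPENAI_API_KEY": "sk-test"}))
```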

Examples

Text Generation

# Using OpenAI
provider: openai
model: gpt-4o-mini
prompt: Write a haiku about coding

# Using Anthropic
provider: anthropic
model: claude-3-5-sonnet-20241022
prompt: Explain quantum computing in simple terms

# Using Google
provider: google
model: gemini-2.0-flash-exp
prompt: Create a recipe for chocolate chip cookies

Image Generation

# Using DALL-E 3
provider: openai
model: dall-e-3
prompt: A serene Japanese garden with cherry blossoms

# Using DALL-E 2
provider: openai
model: dall-e-2
prompt: A futuristic cityscape at sunset

Development

Prerequisites

  • Python 3.11 or higher
  • uv package manager

Setup

```bash
git clone https://github.com/gwbischof/outsource-mcp.git
cd outsource-mcp
uv sync
```

Testing with MCP Inspector

The MCP Inspector allows you to test the server interactively:

```bash
mcp dev server.py
```

Running Tests

The test suite includes integration tests that verify both text and image generation:

```bash
# Run all tests
uv run pytest
```

Note: Integration tests require API keys to be set in your environment.
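A common way to honor that requirement is to skip key-gated tests rather than fail them when a key is absent. The repository uses pytest, where `pytest.mark.skipif` plays this role; the same gating idea is sketched below with stdlib `unittest.skipUnless` so the example is self-contained (`has_key` and the test class are hypothetical):

```python
import os
import unittest

def has_key(var: str) -> bool:
    """True when the named environment variable is set and non-empty."""
    return bool(os.environ.get(var))

class TextGenerationTests(unittest.TestCase):
    @unittest.skipUnless(has_key("OPENAI_API_KEY"), "OPENAI_API_KEY not set")
    def test_openai_smoke(self):
        # ...invoke outsource_text against the live API here...
        pass
```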

Troubleshooting

Common Issues

  1. "Error: Unknown provider"

    • Check that you're using a supported provider name from the list above
    • Provider names are case-insensitive
  2. "Error: OpenAI API error"

    • Verify your API key is correctly set in the environment variables
    • Check that your API key has access to the requested model
    • Ensure you have sufficient credits/quota
  3. "Error: No image was generated"

    • This can happen if the image generation request fails
    • Try a simpler prompt or different model (dall-e-2 vs dall-e-3)
  4. Environment variables not working

    • Make sure to restart your MCP client after updating the configuration
    • Verify the configuration file location for your specific MCP client
    • Check that the environment variables are properly formatted in the configuration

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.


Repository Owner

gwbischof (User)

Repository Details

Language: Python
Default Branch: main
Size: 349 KB
Contributors: 1
License: MIT
MCP Verified: Nov 11, 2025

Programming Languages

Python: 96.52%
Dockerfile: 3.48%



Related MCPs

Discover similar Model Context Protocol servers

  • Unichat MCP Server

    Universal MCP server providing context-aware AI chat and code tools across major model vendors.

    Unichat MCP Server enables sending standardized requests to leading AI model vendors, including OpenAI, MistralAI, Anthropic, xAI, Google AI, DeepSeek, Alibaba, and Inception, utilizing the Model Context Protocol. It features unified endpoints for chat interactions and provides specialized tools for code review, documentation generation, code explanation, and programmatic code reworking. The server is designed for seamless integration with platforms like Claude Desktop and installation via Smithery. Vendor API keys are required for secure access to supported providers.

    • 37
    • MCP
    • amidabuddha/unichat-mcp-server
  • MCP Link

    Convert Any OpenAPI V3 API to an MCP Server for seamless AI Agent integration.

    MCP Link enables automatic conversion of any OpenAPI v3-compliant RESTful API into a Model Context Protocol (MCP) server, allowing instant compatibility with AI-driven agent frameworks. It eliminates the need for manual interface creation and code modification by translating OpenAPI schemas into MCP endpoints. MCP Link supports robust feature mapping and authentication, making it easy to expose existing APIs to AI ecosystems using a standardized protocol. The tool is designed for both developers and organizations seeking to streamline API integration with AI agents.

    • 572
    • MCP
    • automation-ai-labs/mcp-link
  • FastMCP

    The fast, Pythonic way to build MCP servers and clients.

    FastMCP is a production-ready framework for building Model Context Protocol (MCP) applications in Python. It streamlines the creation of MCP servers and clients, providing advanced features such as enterprise authentication, composable tools, OpenAPI/FastAPI generation, server proxying, deployment tools, and comprehensive client libraries. Designed for ease of use, it offers both standard protocol support and robust utilities for production deployments.

    • 20,201
    • MCP
    • jlowin/fastmcp
  • Google Workspace MCP Server

    Full natural language control of Google Workspace through the Model Context Protocol.

    Google Workspace MCP Server enables comprehensive natural language interaction with Google services such as Calendar, Drive, Gmail, Docs, Sheets, Slides, Forms, Tasks, and Chat via any MCP-compatible client or AI assistant. It supports both single-user and secure multi-user OAuth 2.1 authentication, providing a production-ready backend for custom apps. Built on FastMCP, it delivers high performance and advanced context handling, offering deep integration with the entire Google Workspace suite.

    • 890
    • MCP
    • taylorwilsdon/google_workspace_mcp
  • MCP Rubber Duck

    A bridge server for querying multiple OpenAI-compatible LLMs through the Model Context Protocol.

    MCP Rubber Duck acts as an MCP (Model Context Protocol) server that enables users to query and manage multiple OpenAI-compatible large language models from a unified API. It supports parallel querying of various providers, context management across sessions, failover between providers, and response caching. This tool is designed for debugging and experimentation by allowing users to receive diverse AI-driven perspectives from different model endpoints.

    • 56
    • MCP
    • nesquikm/mcp-rubber-duck
  • MyMCP Server (All-in-One Model Context Protocol)

    Powerful and extensible Model Context Protocol server with developer and productivity integrations.

    MyMCP Server is a robust Model Context Protocol (MCP) server implementation that integrates with services like GitLab, Jira, Confluence, YouTube, Google Workspace, and more. It provides AI-powered search, contextual tool execution, and workflow automation for development and productivity tasks. The system supports extensive configuration and enables selective activation of grouped toolsets for various environments. Installation and deployment are streamlined, with both automated and manual setup options available.

    • 93
    • MCP
    • nguyenvanduocit/all-in-one-model-context-protocol