Defang

Develop Once, Deploy Anywhere.

144 Stars · 20 Forks · 144 Watchers · 175 Issues
Defang provides a command-line interface (CLI) and Model Context Protocol (MCP) Server that enable seamless deployment of applications from local development environments to the cloud. It supports integration with popular IDEs such as VS Code, Cursor, Windsurf, and Claude, allowing users to manage and deploy their workflows efficiently. Defang delivers secure, scalable deployments with built-in support for Docker Compose and Pulumi, and offers samples for Golang, Python, and Node.js projects. Its AI-powered features enable developers to generate and launch cloud services effortlessly.

Key Features

Command-line interface for cloud deployment
Model Context Protocol (MCP) Server for IDE integration
Seamless deployment from local to cloud environments
Supports popular editors (VS Code, Cursor, Windsurf, Claude)
Built-in Docker Compose support
Pulumi provider samples
Samples in Golang, Python, and Node.js
AI-powered service generation
Easy installation through multiple package managers
Integrated experience within development environments

Use Cases

Deploying applications from local machines to the cloud
Managing cloud resources via CLI
Integrating cloud deployment workflows into IDEs
Rapid prototyping and deployment using Docker Compose
Infrastructure as code deployments with Pulumi
Automating service generation using AI
Collaborative cloud development within teams
End-to-end lifecycle management for cloud applications
Streamlining developer onboarding for cloud projects
Testing and iterating software projects in multiple languages before deployment

README

Defang

Develop Once, Deploy Anywhere.

Take your app from Docker Compose to a secure and scalable deployment on your favorite cloud in minutes.

Defang CLI

The Defang Command-Line Interface (CLI) is designed for developers who prefer to manage their workflows directly from the terminal. It offers full access to Defang’s capabilities, allowing you to build, test, and deploy applications efficiently to the cloud.

Defang MCP Server

The Defang Model Context Protocol (MCP) Server is tailored for developers who work primarily within integrated development environments (IDEs). It enables seamless cloud deployment from supported editors such as Cursor, Windsurf, VS Code, VS Code Insiders, and Claude, delivering a fully integrated experience without leaving your development environment.

This repo includes:

  • Public releases of the Defang CLI; see the repository's Releases page for the latest version
  • Built-in support for MCP Server — the Defang MCP Server makes cloud deployment as easy as a single prompt. Learn more
  • Samples in Golang, Python, and Node.js that show how to accomplish various tasks and deploy them to the DOP with a Docker Compose file using the Defang CLI.
  • Samples that show how to deploy an app using the Defang Pulumi Provider.

Getting started

  • Read our Getting Started page
  • Follow the installation instructions from the Installing page
  • Take a look at our Samples folder for example projects in various programming languages.
  • Try the AI integration by running defang generate
  • Start your new service with defang compose up
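The defang compose up step deploys whatever services your project's Compose file defines. As a sketch, a minimal compose.yaml for a single public web service might look like this (the service name and image below are hypothetical examples, not part of the Defang samples):

```yaml
# Hypothetical minimal Compose file for a single web service
services:
  web:
    image: nginx:alpine   # any container image your app is built from
    ports:
      - "80:80"           # port to expose for the service
```

With a file like this in the project directory, defang compose up builds and deploys the listed services to your configured cloud provider.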

Installing

Install the Defang CLI from one of the following sources:

  • Using the Homebrew package manager (DefangLabs/defang tap):

    brew install DefangLabs/defang/defang
    
  • Using a shell script:

    eval "$(curl -fsSL s.defang.io/install)"
    
  • Using Go:

    go install github.com/DefangLabs/defang/src/cmd/cli@latest
    
  • Using the Nix package manager:

    • with Nix-Env:
      nix-env -if https://github.com/DefangLabs/defang/archive/main.tar.gz
      
    • or with Flakes:
      nix profile install github:DefangLabs/defang#defang-bin --refresh
      
  • Using winget:

    winget install defang
    
  • Using a PowerShell script:

    iwr https://s.defang.io/defang_win_amd64.zip -OutFile defang.zip
    Expand-Archive defang.zip . -Force
    
  • Using the official image from Docker Hub:

    docker run -it defangio/defang-cli help
    
  • Using NPX:

    npx defang@latest help
    
  • or download the latest binary of the Defang CLI.

Support

  • File any issues in this repository's issue tracker

Command completion

The Defang CLI supports command completion for Bash, Zsh, Fish, and PowerShell. To get the shell script for command completion, run the following command:

defang completion [bash|zsh|fish|powershell]

If you're using Bash, you can add the following to your ~/.bashrc file:

source <(defang completion bash)

If you're using Zsh, you can add the following to your ~/.zshrc file:

source <(defang completion zsh)

or save the output to a file named _defang in a directory on your Zsh function path (fpath), where completion scripts are loaded from.

If you're using Fish, you can add the following to your ~/.config/fish/config.fish file:

defang completion fish | source

If you're using PowerShell, you can add the following to your $HOME\Documents\PowerShell\Microsoft.PowerShell_profile.ps1 file:

Invoke-Expression -Command (defang completion powershell | Out-String)

Environment Variables

The Defang CLI recognizes the following environment variables:

  • COMPOSE_DISABLE_ENV_FILE - Whether to disable loading environment variables from a .env file in the current directory; defaults to false
  • COMPOSE_FILE - The Compose file(s) to use; defaults to compose.yaml, compose.yml, docker-compose.yaml, or docker-compose.yml in the current directory
  • COMPOSE_PATH_SEPARATOR - The path separator to use for COMPOSE_FILE; defaults to : on Unix/macOS and ; on Windows
  • COMPOSE_PROJECT_NAME - The name of the project to use; overrides the name in the Compose file
  • DEFANG_ACCESS_TOKEN - The access token to use for authentication; if not specified, uses token from defang login
  • DEFANG_BUILD_CONTEXT_LIMIT - The maximum size of the build context when building container images; defaults to 100MiB
  • DEFANG_CD_BUCKET - The S3 bucket to use for the BYOC CD pipeline; defaults to defang-cd-bucket-…
  • DEFANG_CD_IMAGE - The image to use for the Continuous Deployment (CD) pipeline; defaults to public.ecr.aws/defang-io/cd:public-beta
  • DEFANG_DEBUG - If set to 1 or true, enables debug logging
  • DEFANG_DISABLE_ANALYTICS - If set to true, disables sending analytics to Defang; defaults to false
  • DEFANG_EDITOR - The editor to launch after new project generation; defaults to code (VS Code)
  • DEFANG_FABRIC - The address of the Defang Fabric to use; defaults to fabric-prod1.defang.dev
  • DEFANG_JSON - If set to true, outputs JSON instead of human-readable output; defaults to false
  • DEFANG_HIDE_HINTS - If set to true, hides hints in the CLI output; defaults to false
  • DEFANG_HIDE_UPDATE - If set to true, hides the update notification; defaults to false
  • DEFANG_ISSUER - The OAuth2 issuer to use for authentication; defaults to https://auth.defang.io
  • DEFANG_MODEL_ID - The model ID of the LLM to use for the generate/debug AI integration (Pro users only)
  • DEFANG_NO_CACHE - If set to true, disables pull-through caching of container images; defaults to false
  • DEFANG_ORG - The name of the organization to use; defaults to the user's GitHub name
  • DEFANG_PREFIX - The prefix to use for all BYOC resources; defaults to Defang
  • DEFANG_PROVIDER - The name of the cloud provider to use, auto (default), aws, digitalocean, gcp, or defang
  • DEFANG_PULUMI_BACKEND - The Pulumi backend URL or "pulumi-cloud"; defaults to a self-hosted backend
  • DEFANG_PULUMI_DIR - Run Pulumi from this folder, instead of spawning a cloud task; requires --debug (BYOC only)
  • DEFANG_PULUMI_VERSION - Override the version of the Pulumi image to use (aws provider only)
  • NO_COLOR - If set to any value, disables color output; by default, color output is enabled depending on the terminal
  • PULUMI_ACCESS_TOKEN - The Pulumi access token to use for authentication to Pulumi Cloud; see DEFANG_PULUMI_BACKEND
  • PULUMI_CONFIG_PASSPHRASE - Passphrase used to generate a unique key for your stack, and configuration and encrypted state values
  • TZ - The timezone to use for log timestamps: an IANA TZ name like UTC or Europe/Amsterdam; defaults to Local
  • XDG_STATE_HOME - The directory to use for storing state; defaults to ~/.local/state

Environment variables will be loaded from a .defangrc file in the current directory, if it exists. This file follows the same format as a .env file: one KEY=VALUE pair per line; lines starting with # are treated as comments and ignored.
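For illustration, a .defangrc might look like the following (the values are hypothetical; the variable names are those listed above):

```ini
# .defangrc - loaded by the Defang CLI from the current directory
# Same format as a .env file: KEY=VALUE, one per line
DEFANG_PROVIDER=aws
DEFANG_DEBUG=1
COMPOSE_FILE=compose.yaml
```

Because the file sits alongside your project, it is a convenient place for per-project settings such as the cloud provider or Compose file selection.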

Development

At Defang we use the Nix package manager for our dev environment, in conjunction with DirEnv.

To get started quickly, install Nix and DirEnv, then create a .envrc file to automatically load the Defang developer environment:

    echo use flake >> .envrc
    direnv allow

Repository Owner

DefangLabs (Organization)

Repository Details

Language Go
Default Branch main
Size 53,538 KB
Contributors 21
License MIT License
MCP Verified Nov 12, 2025

Programming Languages

Go
97.22%
TypeScript
1.45%
Makefile
0.6%
Nix
0.33%
Shell
0.32%
Dockerfile
0.08%
JavaScript
0.01%

Topics

aws, cloud, compose, deploy, deployment, digitalocean, docker, docker-compose, gcp, infrastructure-as-code

Related MCPs

Discover similar Model Context Protocol servers

  • QuantConnect MCP Server
    Official bridge for secure AI access to QuantConnect's algorithmic trading cloud platform.

    QuantConnect MCP Server enables artificial intelligence systems such as Claude and OpenAI to interface with QuantConnect's cloud platform through an official, secure, and dockerized implementation of the Model Context Protocol (MCP). It facilitates automated project management, strategy writing, backtesting, and live deployment by exposing a comprehensive suite of API tools for users with valid access credentials. As the maintained official version, it ensures security, easy deployment, and cross-platform compatibility for advanced algorithmic trading automation.

    50 · MCP · QuantConnect/mcp-server

  • TOD - TheOneDev CLI Tool
    Streamlined DevOps and AI integration for OneDev 13+ via an advanced CLI and Model Context Protocol server.

    TOD is a command-line tool designed for OneDev 13+ that enhances development workflows with CI/CD job execution against local changes or specific branches and tags. It features a comprehensive Model Context Protocol (MCP) server for integrating and communicating with AI assistants, allowing intelligent and natural interactions with OneDev. The tool also supports pull request management and build specification migrations, making it a versatile utility for developers seeking automation and AI-driven enhancements.

    3 · MCP · theonedev/tod

  • Edge Delta MCP Server
    Seamlessly integrate Edge Delta APIs into the Model Context Protocol ecosystem.

    Edge Delta MCP Server is a Model Context Protocol server enabling advanced integration with Edge Delta APIs. It allows developers and tools to extract, analyze, and automate observability data from Edge Delta through standardized MCP interfaces. The server supports AI-powered applications and automations, and can be deployed via Docker for straightforward operation. The Go API is available for experimental programmatic access.

    5 · MCP · edgedelta/edgedelta-mcp-server

  • Inspektor Gadget MCP Server
    AI-powered Kubernetes troubleshooting via Model Context Protocol.

    Inspektor Gadget MCP Server provides an AI-powered debugging and inspection interface for Kubernetes clusters. Leveraging the Model Context Protocol, it enables intelligent output summarization, one-click deployment of Inspektor Gadget, and automated discovery of debugging tools from Artifact Hub. The server integrates seamlessly with VS Code for interactive AI commands, simplifying Kubernetes troubleshooting and monitoring workflows.

    16 · MCP · inspektor-gadget/ig-mcp-server

  • mcp-get
    A command-line tool for discovering, installing, and managing Model Context Protocol servers.

    mcp-get is a CLI tool designed to help users discover, install, and manage Model Context Protocol (MCP) servers. It enables seamless integration of Large Language Models (LLMs) with various external data sources and tools by utilizing a standardized protocol. The tool provides access to a curated registry of MCP servers and supports installation and management across multiple programming languages and environments. Although now archived, mcp-get simplifies environment variable management, package versioning, and server updates to enhance the LLM ecosystem.

    497 · MCP · michaellatman/mcp-get

  • In Memoria
    Persistent memory and instant context for AI coding assistants, integrated via MCP.

    In Memoria is an MCP server that enables AI coding assistants such as Claude or Copilot to retain, recall, and provide context about codebases across sessions. It learns patterns, architecture, and conventions from user code, offering persistent intelligence that eliminates repetitive explanations and generic suggestions. Through the Model Context Protocol, it allows AI tools to perform semantic search, smart file routing, and track project-specific decisions efficiently.

    94 · MCP · pi22by7/In-Memoria