mcp-server-apache-airflow

A Model Context Protocol server for integrating Apache Airflow with MCP clients.

109 Stars • 28 Forks • 109 Watchers • 5 Issues

mcp-server-apache-airflow provides a Model Context Protocol (MCP) server implementation that allows standardized interaction with Apache Airflow environments. By wrapping Airflow's REST API, it enables MCP clients to manage and orchestrate workflows, DAGs, and runs in a consistent and interoperable manner. This implementation leverages the official Apache Airflow client library to ensure robust compatibility and maintainability. It streamlines the management of Airflow resources by exposing comprehensive endpoint coverage for key workflow operations.

Key Features

Implements Model Context Protocol server for Apache Airflow
Wraps and exposes Airflow REST API endpoints
Supports comprehensive DAG management operations
Enables creation, listing, updating, and deletion of DAG runs
Uses the official Apache Airflow client library
Standardizes workflow interactions for MCP clients
Allows batch operations and DAG file reparsing
Promotes interoperability across platforms via MCP
Provides detailed endpoint coverage for workflow automation

Use Cases

Standardizing the integration of Apache Airflow with external workflow tools
Allowing MCP clients to automate workflow orchestration in Airflow environments
Facilitating context-aware management of Airflow DAGs and runs
Enabling unified monitoring and updating of complex Airflow-based pipelines
Streamlining workflow automation for organizations using Airflow
Providing a protocol-compliant interface for workflow and job scheduling
Connecting legacy or proprietary workflow orchestrators with Airflow infrastructure
Batch managing and reparsing multiple DAGs programmatically
Supporting secure, auditable access to Airflow process management
Facilitating hybrid cloud or multi-tenant workflow automation scenarios

README

mcp-server-apache-airflow

A Model Context Protocol (MCP) server implementation for Apache Airflow, enabling seamless integration with MCP clients. This project provides a standardized way to interact with Apache Airflow through the Model Context Protocol.

About

This project implements a Model Context Protocol server that wraps Apache Airflow's REST API, allowing MCP clients to interact with Airflow in a standardized way. It uses the official Apache Airflow client library to ensure compatibility and maintainability.

Feature Implementation Status

Each feature below is listed with the Airflow REST API path it wraps:

DAG Management

  • List DAGs: /api/v1/dags
  • Get DAG Details: /api/v1/dags/{dag_id}
  • Pause DAG: /api/v1/dags/{dag_id}
  • Unpause DAG: /api/v1/dags/{dag_id}
  • Update DAG: /api/v1/dags/{dag_id}
  • Delete DAG: /api/v1/dags/{dag_id}
  • Get DAG Source: /api/v1/dagSources/{file_token}
  • Patch Multiple DAGs: /api/v1/dags
  • Reparse DAG File: /api/v1/dagSources/{file_token}/reparse

DAG Runs

  • List DAG Runs: /api/v1/dags/{dag_id}/dagRuns
  • Create DAG Run: /api/v1/dags/{dag_id}/dagRuns
  • Get DAG Run Details: /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}
  • Update DAG Run: /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}
  • Delete DAG Run: /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}
  • Get DAG Runs Batch: /api/v1/dags/~/dagRuns/list
  • Clear DAG Run: /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/clear
  • Set DAG Run Note: /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/setNote
  • Get Upstream Dataset Events: /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/upstreamDatasetEvents

Tasks

  • List DAG Tasks: /api/v1/dags/{dag_id}/tasks
  • Get Task Details: /api/v1/dags/{dag_id}/tasks/{task_id}
  • Get Task Instance: /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}
  • List Task Instances: /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances
  • Update Task Instance: /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}
  • Get Task Instance Log: /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/logs/{task_try_number}
  • Clear Task Instances: /api/v1/dags/{dag_id}/clearTaskInstances
  • Set Task Instances State: /api/v1/dags/{dag_id}/updateTaskInstancesState
  • List Task Instance Tries: /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/tries

Variables

  • List Variables: /api/v1/variables
  • Create Variable: /api/v1/variables
  • Get Variable: /api/v1/variables/{variable_key}
  • Update Variable: /api/v1/variables/{variable_key}
  • Delete Variable: /api/v1/variables/{variable_key}

Connections

  • List Connections: /api/v1/connections
  • Create Connection: /api/v1/connections
  • Get Connection: /api/v1/connections/{connection_id}
  • Update Connection: /api/v1/connections/{connection_id}
  • Delete Connection: /api/v1/connections/{connection_id}
  • Test Connection: /api/v1/connections/test

Pools

  • List Pools: /api/v1/pools
  • Create Pool: /api/v1/pools
  • Get Pool: /api/v1/pools/{pool_name}
  • Update Pool: /api/v1/pools/{pool_name}
  • Delete Pool: /api/v1/pools/{pool_name}

XComs

  • List XComs: /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries
  • Get XCom Entry: /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries/{xcom_key}

Datasets

  • List Datasets: /api/v1/datasets
  • Get Dataset: /api/v1/datasets/{uri}
  • Get Dataset Events: /api/v1/datasetEvents
  • Create Dataset Event: /api/v1/datasetEvents
  • Get DAG Dataset Queued Event: /api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents/{uri}
  • Get DAG Dataset Queued Events: /api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents
  • Delete DAG Dataset Queued Event: /api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents/{uri}
  • Delete DAG Dataset Queued Events: /api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents
  • Get Dataset Queued Events: /api/v1/datasets/{uri}/dagRuns/queued/datasetEvents
  • Delete Dataset Queued Events: /api/v1/datasets/{uri}/dagRuns/queued/datasetEvents

Monitoring

  • Get Health: /api/v1/health

DAG Stats

  • Get DAG Stats: /api/v1/dags/statistics

Config

  • Get Config: /api/v1/config

Plugins

  • Get Plugins: /api/v1/plugins

Providers

  • List Providers: /api/v1/providers

Event Logs

  • List Event Logs: /api/v1/eventLogs
  • Get Event Log: /api/v1/eventLogs/{event_log_id}

System

  • Get Import Errors: /api/v1/importErrors
  • Get Import Error Details: /api/v1/importErrors/{import_error_id}
  • Get Health Status: /api/v1/health
  • Get Version: /api/v1/version

Setup

Dependencies

This project depends on the official Apache Airflow client library (apache-airflow-client). It will be automatically installed when you install this package.
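
If you want the package outside of the uvx-based setups shown below, a plain pip install from PyPI works as well (the distribution name matches the command name used throughout this README):

bash
# apache-airflow-client is pulled in automatically as a dependency
pip install mcp-server-apache-airflow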

Environment Variables

Set the following environment variables:

AIRFLOW_HOST=<your-airflow-host>        # Optional, defaults to http://localhost:8080
AIRFLOW_API_VERSION=v1                  # Optional, defaults to v1
READ_ONLY=true                          # Optional, enables read-only mode (true/false, defaults to false)

Authentication

Choose one of the following authentication methods:

Basic Authentication (default):

AIRFLOW_USERNAME=<your-airflow-username>
AIRFLOW_PASSWORD=<your-airflow-password>

JWT Token Authentication:

AIRFLOW_JWT_TOKEN=<your-jwt-token>

To obtain a JWT token, you can use Airflow's authentication endpoint:

bash
ENDPOINT_URL="http://localhost:8080"  # Replace with your Airflow endpoint
curl -X 'POST' \
  "${ENDPOINT_URL}/auth/token" \
  -H 'Content-Type: application/json' \
  -d '{ "username": "<your-username>", "password": "<your-password>" }'

Note: If both a JWT token and basic authentication credentials are provided, the JWT token takes precedence.
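
To sanity-check your credentials before wiring up an MCP client, you can call the Airflow REST API directly. The requests below are illustrative; adjust the host, and note that which schemes your instance accepts depends on its configured auth backend:

bash
# Basic auth: list a single DAG to confirm the credentials work
curl -u "<your-username>:<your-password>" \
  "http://localhost:8080/api/v1/dags?limit=1"

# JWT: the same request with a bearer token
curl -H "Authorization: Bearer <your-jwt-token>" \
  "http://localhost:8080/api/v1/dags?limit=1"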

Usage with Claude Desktop

Add to your claude_desktop_config.json:

Basic Authentication:

json
{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uvx",
      "args": ["mcp-server-apache-airflow"],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_USERNAME": "your-username",
        "AIRFLOW_PASSWORD": "your-password"
      }
    }
  }
}

JWT Token Authentication:

json
{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uvx",
      "args": ["mcp-server-apache-airflow"],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_JWT_TOKEN": "your-jwt-token"
      }
    }
  }
}

For read-only mode (recommended for safety):

Basic Authentication:

json
{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uvx",
      "args": ["mcp-server-apache-airflow"],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_USERNAME": "your-username",
        "AIRFLOW_PASSWORD": "your-password",
        "READ_ONLY": "true"
      }
    }
  }
}

JWT Token Authentication (this example enables read-only mode via the --read-only flag rather than the READ_ONLY environment variable; the two are equivalent):

json
{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uvx",
      "args": ["mcp-server-apache-airflow", "--read-only"],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_JWT_TOKEN": "your-jwt-token"
      }
    }
  }
}

Alternative configuration using uv:

Basic Authentication:

json
{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/mcp-server-apache-airflow",
        "run",
        "mcp-server-apache-airflow"
      ],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_USERNAME": "your-username",
        "AIRFLOW_PASSWORD": "your-password"
      }
    }
  }
}

JWT Token Authentication:

json
{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/mcp-server-apache-airflow",
        "run",
        "mcp-server-apache-airflow"
      ],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_JWT_TOKEN": "your-jwt-token"
      }
    }
  }
}

Replace /path/to/mcp-server-apache-airflow with the actual path where you've cloned the repository.

Selecting the API groups

You can select which API groups to expose by passing the --apis flag (repeat it once per group):

bash
uv run mcp-server-apache-airflow --apis dag --apis dagrun

The default is to use all APIs.

Allowed values are:

  • config
  • connections
  • dag
  • dagrun
  • dagstats
  • dataset
  • eventlog
  • importerror
  • monitoring
  • plugin
  • pool
  • provider
  • taskinstance
  • variable
  • xcom
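
For example, a monitoring-focused deployment might expose only a few read-heavy groups (group names taken from the list above):

bash
uv run mcp-server-apache-airflow --apis monitoring --apis dagstats --apis eventlog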

Read-Only Mode

You can run the server in read-only mode by using the --read-only flag or by setting the READ_ONLY=true environment variable. This will only expose tools that perform read operations (GET requests) and exclude any tools that create, update, or delete resources.

Using the command-line flag:

bash
uv run mcp-server-apache-airflow --read-only

Using the environment variable:

bash
READ_ONLY=true uv run mcp-server-apache-airflow

In read-only mode, the server will only expose tools like:

  • Listing DAGs, DAG runs, tasks, variables, connections, etc.
  • Getting details of specific resources
  • Reading configurations and monitoring information
  • Testing connections (non-destructive)

Write operations, such as creating, updating, or deleting DAGs, variables, and connections, or triggering DAG runs, are not available in read-only mode.

You can combine read-only mode with API group selection:

bash
uv run mcp-server-apache-airflow --read-only --apis dag --apis variable

Manual Execution

You can also run the server manually:

bash
make run

make run accepts the following options:

  • --port: Port to listen on for SSE (default: 8000)
  • --transport: Transport type (stdio/sse/http, default: stdio)

Alternatively, you can run the SSE server directly; it accepts the same options:

bash
make run-sse

You can also start the service directly with uv:

bash
uv run src --transport http --port 8080

Installing via Smithery

To install Apache Airflow MCP Server for Claude Desktop automatically via Smithery:

bash
npx -y @smithery/cli install @yangkyeongmo/mcp-server-apache-airflow --client claude

Development

Setting up Development Environment

  1. Clone the repository:

bash
git clone https://github.com/yangkyeongmo/mcp-server-apache-airflow.git
cd mcp-server-apache-airflow

  2. Install development dependencies:

bash
uv sync --dev

  3. Create a .env file for environment variables (optional for development):

bash
touch .env

Note: No environment variables are required for running tests. The AIRFLOW_HOST defaults to http://localhost:8080 for development and testing purposes.
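
If you do point the server at a non-default Airflow instance during development, a minimal .env could look like this (placeholder values; every variable shown is optional):

bash
# .env (all values optional; AIRFLOW_HOST defaults to http://localhost:8080)
AIRFLOW_HOST=http://localhost:8080
AIRFLOW_USERNAME=airflow
AIRFLOW_PASSWORD=airflow
READ_ONLY=true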

Running Tests

The project uses pytest for testing. Run the full test suite with:

bash
# Run all tests
make test
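
If you prefer to bypass the Makefile, the suite can be run through uv directly (assuming make test is a thin wrapper around pytest):

bash
# Run pytest directly via uv
uv run pytest

# Filter tests by keyword using standard pytest options
uv run pytest -k "dag" -v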

Code Quality

bash
# Run linting
make lint

# Run code formatting
make format
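
Since CI lints with ruff, the Makefile targets presumably wrap ruff; the direct equivalents would look roughly like this (an assumption; check the Makefile for the exact flags):

bash
# Assumed direct equivalents of make lint / make format
uv run ruff check .
uv run ruff format .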

Continuous Integration

The project includes a GitHub Actions workflow (.github/workflows/test.yml) that automatically:

  • Runs tests on Python 3.10, 3.11, and 3.12
  • Executes linting checks using ruff
  • Runs on every push and pull request to the main branch

The CI pipeline ensures code quality and compatibility across supported Python versions before any changes are merged.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

The package is deployed automatically to PyPI when project.version is updated in pyproject.toml. Follow semver for versioning.

Please include a version bump in your PR so that your changes are published in a release.

License

MIT License

Repository Details

  • Owner: yangkyeongmo
  • Language: Python
  • Default Branch: main
  • Size: 397 KB
  • Contributors: 9
  • License: MIT License
  • MCP Verified: Nov 12, 2025

Programming Languages

  • Python: 99.35%
  • Dockerfile: 0.38%
  • Makefile: 0.27%

Related MCPs

Discover similar Model Context Protocol servers

  • Taskade MCP (taskade/mcp, 90 stars)

    Tools and server for Model Context Protocol workflows and agent integration.

    Taskade MCP provides an official server and tools to implement and interact with the Model Context Protocol (MCP), enabling seamless connectivity between Taskade’s API and MCP-compatible clients such as Claude or Cursor. It includes utilities for generating MCP tools from any OpenAPI schema and supports the deployment of autonomous agents, workflow automation, and real-time collaboration. The platform promotes extensibility by supporting integration via API, OpenAPI, and MCP, making it easier to build and connect agentic systems.

  • mcp-difyworkflow-server (gotoolkits/mcp-difyworkflow-server, 58 stars)

    MCP server for managing and executing multiple Dify workflows on demand.

    mcp-difyworkflow-server is an MCP-compliant server tool that facilitates the querying and invocation of custom Dify workflows. It supports dynamic execution of multiple workflows by interfacing with the Dify platform, enabling users to manage workflow credentials and operations efficiently. Configuration allows mapping of workflows to API keys, and commands to list or execute available workflows are provided.

  • Aviationstack MCP Server (Pradumnasaraf/aviationstack-mcp, 11 stars)

    MCP server offering comprehensive endpoints for aviation and flight data.

    Aviationstack MCP Server provides an MCP-compliant API that exposes tools to access real-time and scheduled flight data, aircraft details, random aircraft types, countries, and city information from the AviationStack API. It offers ready-to-use endpoints for airline-specific flight queries, airport schedules, and in-depth vehicle, country, and city data. The solution applies the Model Context Protocol by defining MCP tools as Python functions with standardized interfaces, designed for seamless integration into MCP-compatible environments. The server is built using Python, incorporates the FastMCP library, and is intended for easy deployment and use in application development.

  • MCP Server for ZenML (zenml-io/mcp-zenml, 32 stars)

    Expose ZenML data and pipeline operations via the Model Context Protocol.

    Implements a Model Context Protocol (MCP) server for interfacing with the ZenML API, enabling standardized access to ZenML resources for AI applications. Provides tools for reading data about users, stacks, pipelines, runs, and artifacts, as well as triggering new pipeline runs if templates are available. Includes robust testing, automated quality checks, and supports secure connection from compatible MCP clients. Designed for easy integration with ZenML instances, supporting both local and remote ZenML deployments.

  • Stape MCP Server (stape-io/stape-mcp-server, 4 stars)

    An MCP server implementation for integrating Stape with AI model context protocols.

    Stape MCP Server provides an implementation of the Model Context Protocol server tailored for the Stape platform. It enables secure and standardized access to model context capabilities, allowing integration with tools such as Claude Desktop and Cursor AI. Users can easily configure and authenticate MCP connections using provided configuration samples, while managing context and credentials securely. The server is open source and maintained by the Stape Team under the Apache 2.0 license.

  • anki-mcp (ujisati/anki-mcp, 6 stars)

    MCP server for seamless integration with Anki via AnkiConnect.

    An MCP server that bridges Anki flashcards with the Model Context Protocol, exposing AnkiConnect functionalities as standardized MCP tools. It organizes Anki actions into intuitive services covering decks, notes, cards, and models for easy access and automation. Designed for integration with AI assistants and other MCP-compatible clients, it enables operations like creating, modifying, and organizing flashcards through a unified protocol.