# mcp-server-apache-airflow
A Model Context Protocol (MCP) server implementation for Apache Airflow, enabling seamless integration with MCP clients. This project provides a standardized way to interact with Apache Airflow through the Model Context Protocol.
<a href="https://glama.ai/mcp/servers/e99b6vx9lw">
<img width="380" height="200" src="https://glama.ai/mcp/servers/e99b6vx9lw/badge" alt="Server for Apache Airflow MCP server" />
</a>
## About
This project implements a [Model Context Protocol](https://modelcontextprotocol.io/introduction) server that wraps Apache Airflow's REST API, allowing MCP clients to interact with Airflow in a standardized way. It uses the official Apache Airflow client library to ensure compatibility and maintainability.
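For context, here is a minimal, self-contained sketch of what a call through the official client library looks like. This is illustrative only, not this project's actual code; the host and credentials are placeholders:

```python
# Illustrative sketch using the official apache-airflow-client library.
# Host and credentials below are placeholders, not project defaults.
import airflow_client.client
from airflow_client.client.api import dag_api

configuration = airflow_client.client.Configuration(
    host="http://localhost:8080/api/v1",
    username="admin",
    password="admin",
)

with airflow_client.client.ApiClient(configuration) as api_client:
    # Equivalent to GET /api/v1/dags
    dags = dag_api.DAGApi(api_client).get_dags()
    for dag in dags.dags:
        print(dag.dag_id, "paused:", dag.is_paused)
```

The MCP server wraps calls like this in tools that MCP clients can invoke, so the feature table below maps closely onto the REST API paths.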
## Feature Implementation Status
| Feature | API Path | Status |
| -------------------------------- | --------------------------------------------------------------------------------------------- | ------ |
| **DAG Management** | | |
| List DAGs | `/api/v1/dags` | ✅ |
| Get DAG Details | `/api/v1/dags/{dag_id}` | ✅ |
| Pause DAG | `/api/v1/dags/{dag_id}` | ✅ |
| Unpause DAG | `/api/v1/dags/{dag_id}` | ✅ |
| Update DAG | `/api/v1/dags/{dag_id}` | ✅ |
| Delete DAG | `/api/v1/dags/{dag_id}` | ✅ |
| Get DAG Source | `/api/v1/dagSources/{file_token}` | ✅ |
| Patch Multiple DAGs | `/api/v1/dags` | ✅ |
| Reparse DAG File | `/api/v1/dagSources/{file_token}/reparse` | ✅ |
| **DAG Runs** | | |
| List DAG Runs | `/api/v1/dags/{dag_id}/dagRuns` | ✅ |
| Create DAG Run | `/api/v1/dags/{dag_id}/dagRuns` | ✅ |
| Get DAG Run Details | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}` | ✅ |
| Update DAG Run | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}` | ✅ |
| Delete DAG Run | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}` | ✅ |
| Get DAG Runs Batch | `/api/v1/dags/~/dagRuns/list` | ✅ |
| Clear DAG Run | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/clear` | ✅ |
| Set DAG Run Note | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/setNote` | ✅ |
| Get Upstream Dataset Events | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/upstreamDatasetEvents` | ✅ |
| **Tasks** | | |
| List DAG Tasks | `/api/v1/dags/{dag_id}/tasks` | ✅ |
| Get Task Details | `/api/v1/dags/{dag_id}/tasks/{task_id}` | ✅ |
| Get Task Instance | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}` | ✅ |
| List Task Instances | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances` | ✅ |
| Update Task Instance | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}` | ✅ |
| Clear Task Instances | `/api/v1/dags/{dag_id}/clearTaskInstances` | ✅ |
| Set Task Instances State | `/api/v1/dags/{dag_id}/updateTaskInstancesState` | ✅ |
| **Variables** | | |
| List Variables | `/api/v1/variables` | ✅ |
| Create Variable | `/api/v1/variables` | ✅ |
| Get Variable | `/api/v1/variables/{variable_key}` | ✅ |
| Update Variable | `/api/v1/variables/{variable_key}` | ✅ |
| Delete Variable | `/api/v1/variables/{variable_key}` | ✅ |
| **Connections** | | |
| List Connections | `/api/v1/connections` | ✅ |
| Create Connection | `/api/v1/connections` | ✅ |
| Get Connection | `/api/v1/connections/{connection_id}` | ✅ |
| Update Connection | `/api/v1/connections/{connection_id}` | ✅ |
| Delete Connection | `/api/v1/connections/{connection_id}` | ✅ |
| Test Connection | `/api/v1/connections/test` | ✅ |
| **Pools** | | |
| List Pools | `/api/v1/pools` | ✅ |
| Create Pool | `/api/v1/pools` | ✅ |
| Get Pool | `/api/v1/pools/{pool_name}` | ✅ |
| Update Pool | `/api/v1/pools/{pool_name}` | ✅ |
| Delete Pool | `/api/v1/pools/{pool_name}` | ✅ |
| **XComs** | | |
| List XComs | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries` | ✅ |
| Get XCom Entry | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries/{xcom_key}` | ✅ |
| **Datasets** | | |
| List Datasets | `/api/v1/datasets` | ✅ |
| Get Dataset | `/api/v1/datasets/{uri}` | ✅ |
| Get Dataset Events | `/api/v1/datasetEvents` | ✅ |
| Create Dataset Event | `/api/v1/datasetEvents` | ✅ |
| Get DAG Dataset Queued Event | `/api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents/{uri}` | ✅ |
| Get DAG Dataset Queued Events | `/api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents` | ✅ |
| Delete DAG Dataset Queued Event | `/api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents/{uri}` | ✅ |
| Delete DAG Dataset Queued Events | `/api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents` | ✅ |
| Get Dataset Queued Events | `/api/v1/datasets/{uri}/dagRuns/queued/datasetEvents` | ✅ |
| Delete Dataset Queued Events | `/api/v1/datasets/{uri}/dagRuns/queued/datasetEvents` | ✅ |
| **Monitoring** | | |
| Get Health | `/api/v1/health` | ✅ |
| **DAG Stats** | | |
| Get DAG Stats | `/api/v1/dags/statistics` | ✅ |
| **Config** | | |
| Get Config | `/api/v1/config` | ✅ |
| **Plugins** | | |
| Get Plugins | `/api/v1/plugins` | ✅ |
| **Providers** | | |
| List Providers | `/api/v1/providers` | ✅ |
| **Event Logs** | | |
| List Event Logs | `/api/v1/eventLogs` | ✅ |
| Get Event Log | `/api/v1/eventLogs/{event_log_id}` | ✅ |
| **System** | | |
| Get Import Errors | `/api/v1/importErrors` | ✅ |
| Get Import Error Details | `/api/v1/importErrors/{import_error_id}` | ✅ |
| Get Health Status | `/api/v1/health` | ✅ |
| Get Version | `/api/v1/version` | ✅ |
## Setup
### Dependencies
This project depends on the official Apache Airflow client library (`apache-airflow-client`). It will be automatically installed when you install this package.
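For example, installing from PyPI pulls in the client automatically:

```bash
pip install mcp-server-apache-airflow
```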
### Environment Variables
Set the following environment variables:
```
AIRFLOW_HOST=<your-airflow-host> # Optional, defaults to http://localhost:8080
AIRFLOW_USERNAME=<your-airflow-username>
AIRFLOW_PASSWORD=<your-airflow-password>
AIRFLOW_API_VERSION=v1 # Optional, defaults to v1
```
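For example, to point the server at a local Airflow instance (placeholder credentials):

```bash
export AIRFLOW_HOST=http://localhost:8080
export AIRFLOW_USERNAME=admin
export AIRFLOW_PASSWORD=admin
```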
### Usage with Claude Desktop
Add to your `claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uvx",
      "args": ["mcp-server-apache-airflow"],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_USERNAME": "your-username",
        "AIRFLOW_PASSWORD": "your-password"
      }
    }
  }
}
```
For read-only mode (recommended for safety):
```json
{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uvx",
      "args": ["mcp-server-apache-airflow", "--read-only"],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_USERNAME": "your-username",
        "AIRFLOW_PASSWORD": "your-password"
      }
    }
  }
}
```
Alternative configuration using `uv`:
```json
{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/mcp-server-apache-airflow",
        "run",
        "mcp-server-apache-airflow"
      ],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_USERNAME": "your-username",
        "AIRFLOW_PASSWORD": "your-password"
      }
    }
  }
}
```
Replace `/path/to/mcp-server-apache-airflow` with the actual path where you've cloned the repository.
### Selecting the API groups
You can select which API groups to expose by passing the `--apis` flag.
```bash
uv run mcp-server-apache-airflow --apis "dag,dagrun"
```
The default is to use all APIs.
Allowed values are listed below, with an example after the list:
- config
- connections
- dag
- dagrun
- dagstats
- dataset
- eventlog
- importerror
- monitoring
- plugin
- pool
- provider
- taskinstance
- variable
- xcom
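For example, to expose only the variable and connection tool groups (an arbitrary selection for illustration):

```bash
uv run mcp-server-apache-airflow --apis "variable,connections"
```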
### Read-Only Mode
You can run the server in read-only mode by using the `--read-only` flag. This will only expose tools that perform read operations (GET requests) and exclude any tools that create, update, or delete resources.
```bash
uv run mcp-server-apache-airflow --read-only
```
In read-only mode, the server will only expose tools like:
- Listing DAGs, DAG runs, tasks, variables, connections, etc.
- Getting details of specific resources
- Reading configurations and monitoring information
- Testing connections (non-destructive)
Write operations (creating, updating, or deleting DAGs, variables, and connections, triggering DAG runs, and so on) are not available in read-only mode.
You can combine read-only mode with API group selection:
```bash
uv run mcp-server-apache-airflow --read-only --apis "dag,variable"
```
### Manual Execution
You can also run the server manually:
```bash
make run
```
`make run` accepts the following options:
- `--port`: Port to listen on for SSE (default: 8000)
- `--transport`: Transport type (stdio/sse, default: stdio)
Alternatively, you can run the SSE server directly; it accepts the same parameters:
```bash
make run-sse
```
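These flags can also be passed to the server entry point directly, e.g. to serve over SSE on the default port. This assumes the options above are forwarded to the server CLI, as the Makefile targets do:

```bash
uv run mcp-server-apache-airflow --transport sse --port 8000
```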
### Installing via Smithery
To install Apache Airflow MCP Server for Claude Desktop automatically via [Smithery](https://smithery.ai/server/@yangkyeongmo/mcp-server-apache-airflow):
```bash
npx -y @smithery/cli install @yangkyeongmo/mcp-server-apache-airflow --client claude
```
## Development
### Setting up Development Environment
1. Clone the repository:
```bash
git clone https://github.com/yangkyeongmo/mcp-server-apache-airflow.git
cd mcp-server-apache-airflow
```
2. Install development dependencies:
```bash
uv sync --dev
```
3. Create a `.env` file for environment variables (optional for development):
```bash
touch .env
```
> **Note**: No environment variables are required for running tests. The `AIRFLOW_HOST` defaults to `http://localhost:8080` for development and testing purposes.
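If you do point your development environment at a live Airflow instance, the `.env` file might look like this (placeholder values mirroring the environment variables above):

```
AIRFLOW_HOST=http://localhost:8080
AIRFLOW_USERNAME=admin
AIRFLOW_PASSWORD=admin
```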
### Running Tests
The project uses pytest for testing. Run the full suite with:
```bash
# Run all tests
make test
```
### Code Quality
```bash
# Run linting
make lint
# Run code formatting
make format
```
### Continuous Integration
The project includes a GitHub Actions workflow (`.github/workflows/test.yml`) that automatically:
- Runs tests on Python 3.10, 3.11, and 3.12
- Executes linting checks using ruff
- Runs on every push and pull request to the `main` branch
The CI pipeline ensures code quality and compatibility across supported Python versions before any changes are merged.
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
The package is deployed to PyPI automatically when `project.version` is updated in `pyproject.toml`.
Versioning follows semver.
Please include a version bump in your PR so that changes to the core logic are actually released.
## License
[MIT License](LICENSE)