| Field | Value |
|-----------------|-------|
| Name | dmp-af |
| Version | 0.14.6 |
| Summary | Distributed dbt runs on Apache Airflow |
| Author emails | Nikita Yurasov <nikitayurasov@ijkos.com>, Evgeny Ermakov <jk.ermakov@ijkos.com>, Leonid Kozhinov <lkozhinov@ijkos.com> |
| Requires Python | >=3.10,<3.13 |
| Keywords | airflow, dbt, python |
| Upload time | 2025-10-20 18:49:10 |
| Homepage | https://github.com/dmp-labs/dmp-af |
| Documentation | https://dmp-labs.github.io/dmp-af/ |
| Source code | https://github.com/dmp-labs/dmp-af |
[PyPI](https://pypi.org/project/dmp-af/) ·
[CI](https://github.com/dmp-labs/dmp-af/actions) ·
[License: Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0.txt) ·
[uv](https://github.com/astral-sh/uv) ·
[black](https://github.com/psf/black)
# dmp-af: distributed dbt runs on Airflow
## Overview
**dmp-af** runs your dbt models in parallel on Airflow. Each model becomes an independent task while preserving
dependencies across domains.
**Built for scale.** Designed for large dbt projects (1000+ models)
and [data mesh architecture](https://www.datamesh-architecture.com/#what-is-data-mesh). Works with any project size.

### Why dmp-af?
1. **Domain-driven architecture** - Separate models by domain into different DAGs, run in parallel, perfect for data
mesh
2. **dbt-first design** - All configuration in dbt model configs, analytics teams stay in dbt, no Airflow knowledge
required
3. **Flexible scheduling** - Multiple schedules per model (`@hourly`, `@daily`, `@weekly`,
`@monthly`, [and more](examples/manual_scheduling.md))
4. **Enterprise features** - Multiple dbt targets, configurable test strategies, built-in maintenance, Kubernetes
support
## Installation
To install `dmp-af`, run `pip install dmp-af`.
To contribute, we recommend using `uv` to manage the package dependencies:
run `uv sync --all-packages --all-groups --all-extras` to install all of them.
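A quick way to confirm the installation worked is to import the package's entry points (the same names used in the example below):

```python
# Smoke test: if these imports succeed, dmp-af is installed and importable.
from dmp_af.conf import Config, DbtDefaultTargetsConfig, DbtProjectConfig
from dmp_af.dags import compile_dmp_af_dags

print(Config, DbtProjectConfig, DbtDefaultTargetsConfig, compile_dmp_af_dags)
```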
## _dmp-af_ by Example
All tutorials and examples are located in the [examples](examples/README.md) folder.
To get basic Airflow DAGs for your dbt project, you need to put the following code into your `dags` folder:
```python
# LABELS: dag, airflow (required for the Airflow dag-processor)
from dmp_af.dags import compile_dmp_af_dags
from dmp_af.conf import Config, DbtDefaultTargetsConfig, DbtProjectConfig

# Specify all the settings for your dbt project here.
config = Config(
    dbt_project=DbtProjectConfig(
        dbt_project_name='my_dbt_project',
        dbt_project_path='/path/to/my_dbt_project',
        dbt_models_path='/path/to/my_dbt_project/models',
        dbt_profiles_path='/path/to/my_dbt_project',
        dbt_target_path='/path/to/my_dbt_project/target',
        dbt_log_path='/path/to/my_dbt_project/logs',
        dbt_schema='my_dbt_schema',
    ),
    dbt_default_targets=DbtDefaultTargetsConfig(default_target='dev'),
    dry_run=False,  # set to True to enable dry-run mode
)

dags = compile_dmp_af_dags(
    manifest_path='/path/to/my_dbt_project/target/manifest.json',
    config=config,
)
for dag_name, dag in dags.items():
    globals()[dag_name] = dag
```
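The hard-coded paths above all hang off the project root, so in practice you may prefer to derive them from a single setting. A minimal sketch, assuming only the classes shown above; the `DBT_PROJECT_DIR` environment variable is this sketch's convention, not part of dmp-af:

```python
import os

from dmp_af.conf import Config, DbtDefaultTargetsConfig, DbtProjectConfig

# DBT_PROJECT_DIR is a variable name chosen for this sketch, not a dmp-af setting.
project_dir = os.environ['DBT_PROJECT_DIR']

config = Config(
    dbt_project=DbtProjectConfig(
        dbt_project_name='my_dbt_project',
        dbt_project_path=project_dir,
        dbt_models_path=os.path.join(project_dir, 'models'),
        dbt_profiles_path=project_dir,
        dbt_target_path=os.path.join(project_dir, 'target'),
        dbt_log_path=os.path.join(project_dir, 'logs'),
        dbt_schema='my_dbt_schema',
    ),
    dbt_default_targets=DbtDefaultTargetsConfig(default_target='dev'),
    dry_run=False,
)
```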
In _dbt_project.yml_ you need to set up default targets for all nodes in your project
(see [example](examples/dags/dbt_project.yml)):
```yaml
sql_cluster: "dev"
daily_sql_cluster: "dev"
py_cluster: "dev"
bf_cluster: "dev"
```
This will create Airflow DAGs for your dbt project.
Check out the documentation for more details [here](docs/docs.md).
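To check that the generated DAGs import cleanly, you can load your `dags` folder with Airflow's standard `DagBag` (plain Airflow API, nothing dmp-af-specific):

```python
# Load the dags folder the way the scheduler would and surface import errors.
from airflow.models import DagBag

bag = DagBag(dag_folder='/path/to/dags', include_examples=False)
assert not bag.import_errors, bag.import_errors
print(sorted(bag.dag_ids))
```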
## Key Features
**Auto-generated DAGs**
- Automatically creates Airflow DAGs from your dbt project
- Organizes by domain and schedule
- Handles dependencies across domains
**Idempotent runs**
- Each model is a separate Airflow task
- Date intervals passed to every run (see the sketch after this list)
- Reliable backfills and reruns
**Team-friendly**
- Analytics teams stay in dbt
- No Airflow DAG writing required
- Infrastructure handled automatically
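To illustrate the date-interval point above: Airflow exposes `data_interval_start` and `data_interval_end` in each task's context, and dbt accepts run-scoped values via `--vars`. The sketch below shows the general mechanism only; the variable names are hypothetical and this is not dmp-af's actual implementation:

```python
import json


def build_dbt_command(context: dict) -> list[str]:
    """Build an interval-bounded dbt command from an Airflow task context.

    'start_dttm' and 'end_dttm' are hypothetical dbt var names for this sketch.
    """
    dbt_vars = {
        'start_dttm': context['data_interval_start'].isoformat(),
        'end_dttm': context['data_interval_end'].isoformat(),
    }
    return ['dbt', 'run', '--select', 'my_model', '--vars', json.dumps(dbt_vars)]
```

Because the same interval is passed on every rerun, a task re-executed for a past date produces the same result, which is what makes backfills reliable.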
## Requirements
`dmp-af` is tested with:
| Airflow version | Python versions | `dbt-core` versions |
|-----------------|-----------------|---------------------|
| 2.6.3 | >=3.10,<3.12 | >=1.7,<=1.10 |
| 2.7.3 | >=3.10,<3.12 | >=1.7,<=1.10 |
| 2.8.4 | >=3.10,<3.12 | >=1.7,<=1.10 |
| 2.9.3 | >=3.10,<3.13 | >=1.7,<=1.10 |
| 2.10.5 | >=3.10,<3.13 | >=1.7,<=1.10 |
| 2.11.0 | >=3.10,<3.13 | >=1.7,<=1.10 |
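To check a deployment against this matrix, the installed versions can be read with the standard library (a sketch; compare the output to the table by hand):

```python
# Print the Python, Airflow, and dbt-core versions of the current environment.
import sys
from importlib.metadata import version

print('python  :', sys.version.split()[0])
print('airflow :', version('apache-airflow'))
print('dbt-core:', version('dbt-core'))
```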
## Project Information
- [Docs](docs/docs.md)
- [PyPI](https://pypi.org/project/dmp-af/)
- [Contributing](CONTRIBUTING.md)
- [Original dbt-af Project](https://github.com/Toloka/dbt-af)
## About this fork
This project is a fork of [Toloka AI BV's original repository](https://github.com/Toloka/dbt-af).
It includes substantial modifications by IJKOS & PARTNERS LTD.
This fork is not affiliated with or endorsed by Toloka AI BV.
The original project is licensed under the [Apache License 2.0](./LICENSE).
### Migrating from dbt-af
If you're currently using dbt-af and want to migrate to dmp-af, see our [Migration Guide](MIGRATION.md) for step-by-step
instructions.