=======================
Airflow Bootstrap Utils
=======================
Collection of Python tools to generate Airflow DAGs from control and configuration files.
Exported Console Scripts
------------------------

* airflow-generate-dag: Generate Airflow DAGs from control and configuration files.
* airflow-validate-control-file: Validate control files.
Installation
------------

.. code-block:: shell

    pip install airflow-bootstrap-utils
Usage
-----

.. code-block:: shell

    generate-airflow-dag-script --control_file /tmp/airflow-demo/simple_workflow.yaml --outdir /tmp/airflow-demo
    --config_file was not specified and therefore was set to '/tmp/airflow-bootstrap-utils/venv/lib/python3.10/site-packages/airflow_bootstrap_utils/conf/config.yaml'
    --template_dir was not specified and therefore was set to '/tmp/airflow-bootstrap-utils/venv/lib/python3.10/site-packages/airflow_bootstrap_utils/templates'
    --logfile was not specified and therefore was set to '/tmp/airflow-demo/generate_dag.log'
    Wrote Airflow DAG Python script list to file '/tmp/sundaram/airflow-bootstrap-utils/user_processing_L2/2025-02-15-101314/airflow_dag_scripts.txt'
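The fallback behavior shown above (unspecified options resolving to files bundled under the installed package) can be sketched as follows. This is an illustrative reconstruction, not the package's actual code; `resolve_defaults` and its signature are hypothetical:

```python
import argparse
from pathlib import Path

def resolve_defaults(args: argparse.Namespace, package_root: Path) -> argparse.Namespace:
    """Fill unspecified options with package-bundled defaults, logging each substitution."""
    defaults = {
        "config_file": package_root / "conf" / "config.yaml",
        "template_dir": package_root / "templates",
    }
    for name, fallback in defaults.items():
        # Only substitute options the user left unset on the command line.
        if getattr(args, name, None) is None:
            setattr(args, name, str(fallback))
            print(f"--{name} was not specified and therefore was set to '{fallback}'")
    return args
```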
Contents of airflow_dag_scripts.txt:
.. code-block:: text

    cat /tmp/sundaram/airflow-bootstrap-utils/user_processing_L2/2025-02-15-101314/airflow_dag_scripts.txt
    ## method-created: /tmp/airflow-bootstrap-utils/venv/lib/python3.10/site-packages/airflow_bootstrap_utils/manager.py
    ## date-created: 2025-02-15-101314
    ## created-by: sundaram
    ## control-file: /tmp/airflow-demo/simple_workflow.yaml
    ## logfile: /tmp/airflow-demo/generate_dag.log
    /tmp/sundaram/airflow-bootstrap-utils/user_processing_L1/2025-02-15-101314/L1/user_processing_L1.airflow.dag.py
    /tmp/sundaram/airflow-bootstrap-utils/user_processing_L2/2025-02-15-101314/L2/user_processing_L2.airflow.dag.py
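Given the manifest format above (header comments prefixed with `##`, followed by one generated script path per line), a downstream deployment step could collect the script paths like this; `read_dag_script_list` is an illustrative helper, not part of the package:

```python
from pathlib import Path

def read_dag_script_list(manifest: str) -> list[str]:
    """Return the DAG script paths from an airflow_dag_scripts.txt manifest,
    skipping blank lines and '##' header comments."""
    paths = []
    for line in Path(manifest).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("##"):
            continue
        paths.append(line)
    return paths
```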
Contents of L1 airflow script:
.. code-block:: python

    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator
    from datetime import datetime

    #------------------------------
    # Airflow DAG definition
    #------------------------------
    dag = DAG(
        'user_processing_L1',
        description='This is an example DAG in the airflow-bootstrap-utils. Please provide a better description in your copy of the control file.',
        schedule_interval='@weekly',
        start_date=datetime(2024, 3, 22),
        catchup=False,
    )

    #------------------------------
    # Airflow task definitions
    #------------------------------

    """
    The do_something executable is going to do something.
    It is going to use the lab number argument to do
    something.
    The lab number is a required parameter.
    """
    do_something = BashOperator(
        task_id='do_something',
        bash_command='bash /opt/do_something.sh --lab_number L1',
        dag=dag
    )

    """
    The do_something_else executable is going to do something else.
    It is going to use the lab number argument to do
    something else.
    The lab number is a required parameter.
    """
    do_something_else = BashOperator(
        task_id='do_something_else',
        bash_command='bash /opt/do_something_else.sh --lab_number L1',
        dag=dag
    )

    #------------------------------
    # Define the task dependencies
    #------------------------------
    do_something >> do_something_else
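The generator presumably fills a per-workflow template once per lab number (the `--template_dir` option points at the bundled templates), though the exact template engine is not shown. Below is a minimal sketch of that pattern using the standard library's `string.Template`; the template text and function name are illustrative, not the shipped ones:

```python
from string import Template

# Simplified stand-in for a file under --template_dir (illustrative, not the shipped template).
DAG_TEMPLATE = Template("""\
dag = DAG(
    '${dag_id}',
    schedule_interval='@weekly',
    catchup=False,
)
""")

def render_dag_script(workflow_name: str, lab_number: str) -> str:
    """Render one DAG script for a given lab number, e.g. 'user_processing' + 'L1'."""
    return DAG_TEMPLATE.substitute(dag_id=f"{workflow_name}_{lab_number}")
```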
Contents of L2 airflow script:
.. code-block:: python

    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator
    from datetime import datetime

    #------------------------------
    # Airflow DAG definition
    #------------------------------
    dag = DAG(
        'user_processing_L2',
        description='This is an example DAG in the airflow-bootstrap-utils. Please provide a better description in your copy of the control file.',
        schedule_interval='@weekly',
        start_date=datetime(2024, 3, 22),
        catchup=False,
    )

    #------------------------------
    # Airflow task definitions
    #------------------------------

    """
    The do_something executable is going to do something.
    It is going to use the lab number argument to do
    something.
    The lab number is a required parameter.
    """
    do_something = BashOperator(
        task_id='do_something',
        bash_command='bash /opt/do_something.sh --lab_number L2',
        dag=dag
    )

    """
    The do_something_else executable is going to do something else.
    It is going to use the lab number argument to do
    something else.
    The lab number is a required parameter.
    """
    do_something_else = BashOperator(
        task_id='do_something_else',
        bash_command='bash /opt/do_something_else.sh --lab_number L2',
        dag=dag
    )

    #------------------------------
    # Define the task dependencies
    #------------------------------
    do_something >> do_something_else
=======
History
=======
0.1.0 (2024-03-20)
------------------
* First release on PyPI.
Raw data
{
"_id": null,
"home_page": "https://github.com/jai-python3/airflow_bootstrap_utils",
"name": "airflow-bootstrap-utils",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.10",
"maintainer_email": null,
"keywords": "airflow_bootstrap_utils",
"author": "Jaideep Sundaram",
"author_email": "jai.python3@gmail.com",
"download_url": "https://files.pythonhosted.org/packages/7a/88/5c03778c5de215e6f1e8209dd9fcf07e2c5bfea1781155c83bebe570a16f/airflow_bootstrap_utils-0.2.1.tar.gz",
"platform": null,
"bugtrack_url": null,
"license": null,
"summary": "Collection of Python tools to generate Airflow DAGs from control and configuration files.",
"version": "0.2.1",
"project_urls": {
"Homepage": "https://github.com/jai-python3/airflow_bootstrap_utils"
},
"split_keywords": [
"airflow_bootstrap_utils"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "aad4035cabd2a2bf5d145bdec91f6c3985e9a08f0930ff9dc5d52da646e5c841",
"md5": "3a18556a975add24addcf4ea4107efe8",
"sha256": "f5507e584ff5b97288eacacb3a3fd270ae7b6a40308fdf377b53aecb6244414d"
},
"downloads": -1,
"filename": "airflow_bootstrap_utils-0.2.1-py2.py3-none-any.whl",
"has_sig": false,
"md5_digest": "3a18556a975add24addcf4ea4107efe8",
"packagetype": "bdist_wheel",
"python_version": "py2.py3",
"requires_python": ">=3.10",
"size": 15302,
"upload_time": "2025-02-15T15:20:19",
"upload_time_iso_8601": "2025-02-15T15:20:19.434265Z",
"url": "https://files.pythonhosted.org/packages/aa/d4/035cabd2a2bf5d145bdec91f6c3985e9a08f0930ff9dc5d52da646e5c841/airflow_bootstrap_utils-0.2.1-py2.py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "7a885c03778c5de215e6f1e8209dd9fcf07e2c5bfea1781155c83bebe570a16f",
"md5": "0eed5b8174647ee50bd9c8f7e0422254",
"sha256": "ee522e658c1882629397aadf974224970ff8617debf6efa3627fcff17a3e8254"
},
"downloads": -1,
"filename": "airflow_bootstrap_utils-0.2.1.tar.gz",
"has_sig": false,
"md5_digest": "0eed5b8174647ee50bd9c8f7e0422254",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.10",
"size": 18841,
"upload_time": "2025-02-15T15:20:21",
"upload_time_iso_8601": "2025-02-15T15:20:21.238338Z",
"url": "https://files.pythonhosted.org/packages/7a/88/5c03778c5de215e6f1e8209dd9fcf07e2c5bfea1781155c83bebe570a16f/airflow_bootstrap_utils-0.2.1.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-02-15 15:20:21",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "jai-python3",
"github_project": "airflow_bootstrap_utils",
"github_not_found": true,
"lcname": "airflow-bootstrap-utils"
}