airflow-helper

Name: airflow-helper
Version: 0.2.0
Home page: https://github.com/xnuinside/airflow-helper
Upload time: 2023-12-09 20:59:44
Author: Iuliia Volkova
Requires Python: >=3.8,<3.12
License: MIT


About the Airflow Helper
^^^^^^^^^^^^^^^^^^^^^^^^


.. image:: https://img.shields.io/pypi/v/airflow-helper
   :target: https://img.shields.io/pypi/v/airflow-helper
   :alt: PyPI version

.. image:: https://img.shields.io/pypi/l/airflow-helper
   :target: https://img.shields.io/pypi/l/airflow-helper
   :alt: License

.. image:: https://img.shields.io/pypi/pyversions/airflow-helper
   :target: https://img.shields.io/pypi/pyversions/airflow-helper
   :alt: Supported Python versions

.. image:: https://github.com/xnuinside/airflow-helper/actions/workflows/ci-tests-runner.yml/badge.svg
   :target: https://github.com/xnuinside/airflow-helper/actions/workflows/ci-tests-runner.yml/badge.svg
   :alt: CI workflow


It's pretty fresh. The docs may not be fully clear yet, keep calm! I will update them soon :) 

.. image:: img/airflow_helper_middle_logo.png
    :width: 150
    :alt: Airflow Helper logo
    

**Airflow Helper** is a tool that currently allows setting up Airflow Variables, Connections, and Pools from a YAML configuration file. It supports YAML inheritance and can obtain all existing settings from a running Airflow server!

In the future, it can be extended with other helpful features. I'm open to any suggestions and feature requests. Just open an issue and describe what you want.

Motivation
^^^^^^^^^^

This project allows you to set up Connections, Variables, and Pools for Airflow from a yaml config, and to export them into one config file.

Yeah, I know, I know... secrets backend ...

But I want to have all variables on my local machine too, without needing to connect to any secrets backend. And in tests too!

So I want to have a tool with which I can define all needed connections & variables once in a config file & forget about them when initializing a new environment on a local machine or running tests in CI.

Some of the functionality may look like it 'duplicates' the normal Airflow cli, but no.. 
I tried to use, for example, the ``airflow connections export`` command, but it exports dozens of default connections that I'm not interested in - and I don't want them, I want only those connections that were created by me.

Supported Airflow Versions
^^^^^^^^^^^^^^^^^^^^^^^^^^

You can check the GitHub pipeline, which tests the library against each Airflow version.
I can only guarantee that the library works 100% with the Apache Airflow versions included in the CI/CD pipeline, but there is a good chance it works with all Apache Airflow 2.x versions.

How to use
----------

Installation
~~~~~~~~~~~~


#. With Python in a virtualenv, from PyPI: https://pypi.org/project/airflow-helper/

.. code-block:: console


       pip install airflow-helper

.. code-block:: console


     airflow-helper --version


#. With the Docker image from Docker Hub: https://hub.docker.com/repository/docker/xnuinside/airflow-helper/

.. code-block:: console


     # pull image
     docker pull xnuinside/airflow-helper:latest

     # sample of how to run a command

     docker run -it xnuinside/airflow-helper:latest --help


#. For an example of how to use it with docker-compose, see example/docker-compose-example.yaml

Default settings
~~~~~~~~~~~~~~~~

All arguments that are required in the cli or in Python code have a 'default' setting; you can check all of them in the file 'airflow_helper/settings.py'.
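
For orientation, here is a minimal sketch of what a pydantic-settings based defaults file like 'airflow_helper/settings.py' could look like. The default values mirror the CLI help in the next section; the class and field names are illustrative assumptions, not the library's actual code.

.. code-block:: python

   # a hedged sketch, not the real airflow_helper/settings.py
   from pydantic_settings import BaseSettings, SettingsConfigDict


   class AirflowHelperSettings(BaseSettings):
       # environment variables use the AIRFLOW_HELPER_ prefix,
       # e.g. AIRFLOW_HELPER_HOST (see 'Library Configuration' below)
       model_config = SettingsConfigDict(env_prefix="AIRFLOW_HELPER_")

       host: str = "http://localhost"
       port: int = 8080
       user: str = "airflow"
       password: str = "airflow"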

Airflow Helper settings & flags
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

You can configure how the config is applied - overwrite existing variables/connections/pools with values from the config, just skip them, or raise an error if they already exist.

In the cli (or as arguments to the main Python class, if you use the helper directly from Python) there are several useful flags you can use (a Python sketch follows the help output below):

.. code-block:: console


         airflow-helper load [OPTIONS] [FILE_PATH]

     # options:
       --url    TEXT  Apache Airflow full url to connect to. You can provide it, or host & port separately. [default: None]
       --host   TEXT  Apache Airflow server host from which to obtain existing settings [default: http://localhost]
       --port   TEXT  Apache Airflow server port from which to obtain existing settings [default: 8080]
       --user       -u    TEXT  Apache Airflow user with read rights [default: airflow]
       --password   -p    TEXT  Apache Airflow user password [default: airflow]
       --overwrite  -o          Overwrite Connections & Pools if they already exist
       --skip-existed  -se      Skip `already exists` errors
       --help          -h       Show this message and exit.

.. code-block:: console


         airflow-helper create [OPTIONS] COMMAND [ARGS]

     # commands:
       from-server                Create a config with values from an existing Airflow Server
       new                        Create a new empty config
     # options:
       --help          -h       Show this message and exit.
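
If you use the helper from Python instead of the cli, the changelog notes that the same switches are accepted as arguments with the same names on the main class. A hedged sketch, assuming the keyword spelling mirrors the flag names:

.. code-block:: python

   from airflow_helper import ConfigUploader

   # hedged sketch: the changelog says '--overwrite' has a Python argument
   # with the same name; '--skip-existed' should map the same way, but the
   # exact keyword spelling is an assumption
   ConfigUploader(
       file_path="airflow_settings.yaml",
       url="http://localhost:8080",
       user="airflow",
       password="airflow",
       overwrite=True,  # overwrite Connections & Pools that already exist
   ).upload_config_to_server()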

What if I already have an Airflow server with dozens of variables??
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

**Obtain current Variables, Connections, Pools from existed server**

Note: you should provide the host URL with the protocol, like 'https://path-to-your-airflow-server.com'. If the protocol is not in the URL, 'http://' will be added as the default protocol.

Generating a config from an existing Airflow server is simple - just provide credentials with read access to the existing Airflow server. We use the Airflow REST API under the hood, so we need: 

.. code-block::

   - server host & port or just url in format 'http://path-to-airflow:8080'
   - user login
   - user password


And use Airflow Helper:


#. From cli

.. code-block:: console


     # to get help
     airflow-helper create -h

     # to use command
     airflow-helper create path/where/to/save/airflow_settings.yaml --host https://your-airflow-host --port 8080 -u airflow-user -p airflow-password


#. From python code

.. code-block:: python


   from airflow_helper import RemoteConfigObtainter


   # by default it will save the config in the file airflow_settings.yaml
   RemoteConfigObtainter(
     user='airflow_user', password='airflow_user_pass', url='https://path-to-airflow:8080').dump_config()
   # but you can provide your own path like:

   RemoteConfigObtainter(
     user='airflow_user', password='airflow_user_pass', url='https://path-to-airflow:8080').dump_config(
       file_path='any/path/to/future/airflow_config.yaml'
     )

It will create airflow_settings.yaml with all Variables, Pools & Connections inside!

**Define a config from scratch**


#. You can init an empty config with the cli

.. code-block:: console


     airflow-helper create new path/airflow_settings.yaml

It will create an empty sample file with pre-defined config values.


#. Define the airflow_settings.yaml file. You can check examples as files in the example/ folder of this git repo
   (check 'Config keys' to see which keys are allowed - or check the example/ folder)

About connections:
Note that 'conn_type' is not the Name of the connection type - it is the type id. You can look the ids up here - https://github.com/search?q=repo%3Aapache%2Fairflow%20conn_type&type=code 

.. code-block:: yaml


       airflow:
         connections:
         - conn_type: fs
           connection_id: fs_default
           host: localhost
           login: fs_default
           port: null
         pools:
         - description: Default pool
           include_deferred: false
           name: default_pool
           slots: 120
         - description: ''
           include_deferred: true
           name: deferred
           slots: 0
         variables:
         - description: null
           key: variable-name
           value: "variable-value"


#. 
   Run Airflow Helper to load the config

   Required settings: 


   * path to the config file (by default it searches for an ``airflow_settings.yaml`` file)
   * Airflow server address (by default it tries to connect to localhost:8080)
   * Airflow user login (with admin rights, allowed to set up Pools, Variables, and Connections)
   * Airflow user password (for the login above)

   2.1. Run Airflow Helper from the cli

.. code-block:: console


     # to get help

     airflow-helper load -h

     # to load config 
     airflow-helper load path/to/airflow_settings.yaml --host https://your-airflow-host --port 8080 -u airflow-user -p airflow-password

   2.2. Run Airflow Helper from Python Code

.. code-block:: python



     from airflow_helper import ConfigUploader


     # you can provide only url or host & port
     ConfigUploader(
       file_path=file_path, url=url, host=host, port=port, user=user, password=password
       ).upload_config_to_server()

Inheritance (include one config in another)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

I love inheritance. So you can use it too. If you have some base vars/pools/connections for all environments and you don't want to copy-paste the same settings into multiple files - just use the ``include:`` property at the start of your config. 

Note that ``include`` accepts a list of files; they will be inherited one by one, in the order you define them under ``include``, from top to bottom.

Example:


#. Define your 'base' config, for example: airflow_settings_base.yaml

.. code-block:: yaml


     connections:
     - conn_type: fs
       connection_id: fs_default
       host: localhost
       login: fs_default
       port: null
     pools:
     - description: Default pool
       include_deferred: false
       name: default_pool
       slots: 120


#. Now create your dev-env config: airflow_settings_dev.yaml (the names can be anything you want) and use the 'include:' property inside it

.. code-block:: yaml


   include: 
     - "airflow_settings_base.yaml"

   # here put only dev-special variables/connections/pools
   airflow:
       variables:
           pass

This means that the final config uploaded to the server will contain the base settings plus the settings you defined directly in the airflow_settings_dev.yaml config.
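
For intuition, here is a minimal sketch of the include semantics described above (illustrative, not the library's actual implementation): each included file is merged in order, and then the file's own settings are applied on top.

.. code-block:: python

   # illustrative sketch of the include semantics, not the library's code
   import yaml


   def deep_merge(base: dict, override: dict) -> None:
       """Merge 'override' into 'base'; later values win, lists extend."""
       for key, value in override.items():
           if isinstance(value, dict) and isinstance(base.get(key), dict):
               deep_merge(base[key], value)
           elif isinstance(value, list) and isinstance(base.get(key), list):
               base[key] = base[key] + value
           else:
               base[key] = value


   def load_with_includes(path: str) -> dict:
       with open(path) as f:
           config = yaml.safe_load(f) or {}
       merged: dict = {}
       # included files are applied one by one, from top to bottom
       for included in config.pop("include", []):
           deep_merge(merged, load_with_includes(included))
       # the file's own settings win over everything it includes
       deep_merge(merged, config)
       return merged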

Library Configuration
^^^^^^^^^^^^^^^^^^^^^

Airflow Helper uses a bunch of 'default' settings under the hood. Because the library uses pydantic-settings, you can also overwrite those configuration settings with environment variables or by monkey-patching the Python code. 

To get the full list of possible default settings - check the file airflow_helper/settings.py.

If you have never heard of pydantic-settings - check https://docs.pydantic.dev/latest/concepts/pydantic_settings/.

For example, to overwrite the default Airflow host, you should provide an environment variable with the prefix ``AIRFLOW_HELPER_`` and the name ``HOST``\ , so the variable name should look like ``AIRFLOW_HELPER_HOST``
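
A quick sketch of that override from Python (the prefix and variable name come from above; that omitted arguments fall back to the defaults in settings.py is an assumption based on the 'Default settings' section):

.. code-block:: python

   import os

   # set the variable before the helper constructs its settings object;
   # pydantic-settings picks up AIRFLOW_HELPER_-prefixed variables
   os.environ["AIRFLOW_HELPER_HOST"] = "https://my-airflow.example.com"

   from airflow_helper import ConfigUploader

   # host is now taken from the environment instead of the default
   ConfigUploader(
       file_path="airflow_settings.yaml", user="airflow", password="airflow"
   ).upload_config_to_server()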

TODO
^^^^


#. Documentation website
#. Getting Variables, Pools, Connections directly from Airflow DB (currently available only with Airflow REST API)
#. Load configs from S3 and other cloud object storages
#. Load configs from git
#. Create overwrite mode for settings upload

Changelog
---------

*0.2.0*


#. 
   Added a check for variables - now, if a variable already exists on the server, Airflow Helper will raise an error if you try to overwrite it from the config.
   To overwrite existing Variables, Connections, and Pools - use the '--overwrite' flag (or the argument with the same name, if you use Airflow Helper from Python).

#. 
   Added the --skip-existed flag to avoid raising an error if variables/connections/pools already exist on the Airflow server - it will just add the new ones from the config file.

*0.1.2*


#. Do not fail if some sections are missing from the config

*0.1.1*


#. Overwrite option added to the ``airflow-helper load`` command

            
