gke-taskflow 1.0.0

- Author: Recursion Pharmaceuticals <devs@recursionpharma.com>
- Uploaded: 2024-01-11 14:29:33
- Requires Python: >=3.8
- Keywords: gke-taskflow
- Requirements: none recorded
# gke-taskflow

Adds support for the TaskFlow API to the GKEStartPodOperator in Airflow,
allowing us to write cleaner, more Pythonic DAGs.

## Installation
This package needs to be installed into the same environment as
Airflow in order to function correctly. For Cloud Composer,
[follow the docs](https://cloud.google.com/composer/docs/how-to/using/installing-python-dependencies)
to install it as a PyPI dependency of the environment.

### PIP

Execute the following:

```bash
pip install gke-taskflow
```

## Use
After the package is installed, you can define your task using the 
`@task.gke_pod` decorator:

```python
from datetime import datetime
from airflow.decorators import dag, task

@dag(
        schedule=None,
        start_date=datetime(2023, 7, 15),
        tags=["testing"]
)
def example_dag_taskflow():
    @task.gke_pod(
            image="python:3.10-slim",
            task_id="test_flow",
            name="test_flow",
            cluster_name="test-cluster",
            namespace="composer-internal",
            location="us-central1",
            project_id="test-cluster-123abc",
    )
    def hello_world_from_container():
        print("hello world from container")

    hello_world_from_container()

example_dag_taskflow()
```
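If a DAG contains more than one `@task.gke_pod` task, ordering should work the
same way it does for any other TaskFlow task. The sketch below assumes the
decorator follows standard TaskFlow conventions (calling a decorated task
returns an `XComArg` that can be chained with `>>`); the cluster and project
values are placeholders.

```python
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule=None, start_date=datetime(2023, 7, 15), tags=["testing"])
def two_step_dag_taskflow():
    @task.gke_pod(
        image="python:3.10-slim",
        task_id="extract",
        name="extract",
        cluster_name="test-cluster",        # placeholder values
        namespace="composer-internal",
        location="us-central1",
        project_id="test-cluster-123abc",
    )
    def extract():
        print("extracting")

    @task.gke_pod(
        image="python:3.10-slim",
        task_id="load",
        name="load",
        cluster_name="test-cluster",
        namespace="composer-internal",
        location="us-central1",
        project_id="test-cluster-123abc",
    )
    def load():
        print("loading")

    # Standard TaskFlow dependency chaining: run extract before load.
    extract() >> load()

two_step_dag_taskflow()
```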

The keyword arguments supplied to `@task.gke_pod` are identical to those 
supplied to Google's GKEStartPodOperator, on which this work is based.
The [docs for that class](https://airflow.apache.org/docs/apache-airflow-providers-google/stable/operators/cloud/kubernetes_engine.html) 
are scant, but the [source code is available 
online](https://github.com/apache/airflow/blob/main/airflow/providers/google/cloud/operators/kubernetes_engine.py#L395) for review.
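Because the decorator forwards its keyword arguments to GKEStartPodOperator
(which in turn builds on KubernetesPodOperator), pod-level options should pass
straight through it as well. The sketch below is an assumption based on that
operator's parameters, not an exhaustive list: `env_vars`, `get_logs`, and
`startup_timeout_seconds` come from KubernetesPodOperator, while `gcp_conn_id`
and `use_internal_ip` come from GKEStartPodOperator, so check the provider
version you have installed.

```python
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule=None, start_date=datetime(2023, 7, 15), tags=["testing"])
def configured_dag_taskflow():
    @task.gke_pod(
        image="python:3.10-slim",
        task_id="configured_flow",
        name="configured_flow",
        cluster_name="test-cluster",        # placeholder values
        namespace="composer-internal",
        location="us-central1",
        project_id="test-cluster-123abc",
        # KubernetesPodOperator options (pod-level settings):
        env_vars={"ENVIRONMENT": "staging"},
        get_logs=True,
        startup_timeout_seconds=300,
        # GKEStartPodOperator options (GCP connection / networking):
        gcp_conn_id="google_cloud_default",
        use_internal_ip=False,
    )
    def hello_configured_pod():
        # Imports inside the function so the code is self-contained in the pod.
        import os
        print(f"running in {os.environ.get('ENVIRONMENT')}")

    hello_configured_pod()

configured_dag_taskflow()
```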


            
