# Workflow Orchestration

[![test](https://github.com/ddeutils/ddeutil-workflow/actions/workflows/tests.yml/badge.svg?branch=main)](https://github.com/ddeutils/ddeutil-workflow/actions/workflows/tests.yml)
[![codecov](https://codecov.io/gh/ddeutils/ddeutil-workflow/graph/badge.svg?token=3NDPN2I0H9)](https://codecov.io/gh/ddeutils/ddeutil-workflow)
[![pypi version](https://img.shields.io/pypi/v/ddeutil-workflow)](https://pypi.org/project/ddeutil-workflow/)
[![python support version](https://img.shields.io/pypi/pyversions/ddeutil-workflow)](https://pypi.org/project/ddeutil-workflow/)
[![size](https://img.shields.io/github/languages/code-size/ddeutils/ddeutil-workflow)](https://github.com/ddeutils/ddeutil-workflow)
[![gh license](https://img.shields.io/github/license/ddeutils/ddeutil-workflow)](https://github.com/ddeutils/ddeutil-workflow/blob/main/LICENSE)
[![code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)

The **Lightweight Workflow Orchestration** package, with few dependencies, was created
to make building a simple, metadata-driven data workflow easy. It can drive data
operations through a `.yaml` template.

> [!WARNING]
> This package provides only the orchestration workload. That means you should not
> use workflow stages to process large volumes of data that need a lot of compute
> resources :cold_sweat:.

In my opinion, there is no need to duplicate workflow code when I can write one
template workflow with dynamic input parameters and simply change those parameters
per use-case instead.
This way I can handle many logical workflows in our org with metadata
configuration alone. I call this a **Metadata-Driven Data Workflow**.

---

**:pushpin: <u>Rules of This Workflow Engine</u>**:

1. The minimum frequency unit of built-in scheduling is **1 minute** 🕘
2. **Cannot** re-run only a failed stage and its pending downstream ↩️
3. All parallel tasks inside the workflow core engine use a **multi-threading**
   pool (Python 3.13 unlocks the GIL 🐍🔓)

---

**:memo: <u>Workflow Diagrams</u>**:

This diagram shows where this application runs in a production infrastructure.
You will see that this application only runs orchestration code, which keeps it
stress-free; you should set up the data layer separately from this core program
before running the application.

```mermaid
flowchart LR
    A((fa:fa-user User))

    subgraph Docker Container
        direction TB
        G@{ shape: rounded, label: "📡Observe<br>Application" }
    end

    subgraph Docker Container
        direction TB
        B@{ shape: rounded, label: "🏃Workflow<br>Application" }
    end

    A <-->|action &<br>response| B
    B -...-> |response| G
    G -...-> |request| B

    subgraph Data Context
        D@{ shape: processes, label: "Logs" }
        E@{ shape: lin-cyl, label: "Audit<br>Logs" }
    end

    subgraph Config Context
        F@{ shape: tag-rect, label: "YAML<br>files" }
    end

    A ---> |push| H(Repo)
    H -.-> |pull| F

    B <-->|disable &<br>read| F

    B <-->|read &<br>write| E

    B -->|write| D

    D -.->|read| G
    E -.->|read| G
```

> [!WARNING]
> _**Disclaimer**_: The dynamic YAML statements are inspired by [**GitHub Actions**](https://github.com/features/actions),
> and the config patterns by several data orchestration framework tools from
> my data engineering experience. :grimacing:

> [!NOTE]
> Other workflow orchestration tools that I find interesting and have picked as
> inspiration for some parts of this package:
>
> - [Google **Workflows**](https://cloud.google.com/workflows)
> - [AWS **Step Functions**](https://aws.amazon.com/step-functions/)

## 📦 Installation

This project needs the `ddeutil` and `ddeutil-io` extension namespace packages.
If you want to install this package with application add-ons, you should add
the optional extra (such as `api`) to the installation:

| Use-case       | Install Optional         |       Support       |
|----------------|--------------------------|:-------------------:|
| Python         | `ddeutil-workflow`       | :heavy_check_mark:  |
| FastAPI Server | `ddeutil-workflow[api]`  | :heavy_check_mark:  |
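
For example, a minimal install from PyPI, with and without the FastAPI add-on
(package names follow the table above):

```shell
# Core orchestration package only.
pip install ddeutil-workflow

# With the FastAPI server add-on.
pip install "ddeutil-workflow[api]"
```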

## :beers: Usage

These are examples that use a workflow file to run a common Data Engineering
use-case.

> [!IMPORTANT]
> I recommend using the `call` stage for all actions that you want to
> orchestrate as workflow activity, because it can take dynamic input arguments
> with the same call function, which means you spend less time maintaining your
> data workflows.

```yaml
run-py-local:

   # The model type used to validate and parse this template file.
   type: Workflow
   on:
      # If this workflow is deployed to a schedule, it will run every 5 minutes
      # in the Asia/Bangkok timezone.
      - cronjob: '*/5 * * * *'
        timezone: "Asia/Bangkok"
   params:
      # Incoming execution parameters are validated against these types. They
      # allow default values or templating.
      source-extract: str
      run-date: datetime
   jobs:
      getting-api-data:
         runs-on:
            type: local
         stages:
            - name: "Retrieve API Data"
              id: retrieve-api
              uses: tasks/get-api-with-oauth-to-s3@requests
              with:
                 # Arguments for the source data to retrieve.
                 method: post
                 url: https://finances/open-data/currency-pairs/
                 body:
                    resource: ${{ params.source-extract }}

                    # You can use filters similar to Jinja templates, although
                    # this package does not use Jinja itself.
                    filter: ${{ params.run-date | fmt(fmt='%Y%m%d') }}
                 auth:
                    type: bearer
                    keys: ${API_ACCESS_REFRESH_TOKEN}

                 # Arguments for the target data to land.
                 writing_mode: flatten
                 aws_s3_path: my-data/open-data/${{ params.source-extract }}

                 # This authentication should be implemented in your custom call
                 # function. The template allows you to use environment variables.
                 aws_access_client_id: ${AWS_ACCESS_CLIENT_ID}
                 aws_access_client_secret: ${AWS_ACCESS_CLIENT_SECRET}
```
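
The `uses` value above points at a caller function that you register yourself
under the `tasks` prefix. Below is a minimal sketch of such a registration; the
`tag` decorator import and the exact signature are assumptions based on this
package's caller pattern, and the function body is hypothetical:

```python
from ddeutil.workflow import tag


# Hypothetical caller task matching `tasks/get-api-with-oauth-to-s3@requests`.
# The alias maps to the part before `@`; the tag name maps to the part after it.
@tag(name="requests", alias="get-api-with-oauth-to-s3")
def get_api_with_oauth_to_s3(
    method: str,
    url: str,
    body: dict,
    auth: dict,
    writing_mode: str,
    aws_s3_path: str,
    aws_access_client_id: str,
    aws_access_client_secret: str,
) -> dict:
    """Pull data from the source API and land it on S3 (body elided)."""
    # ... request the source API with `auth`, then write to `aws_s3_path` ...
    return {"records": 1000}
```

Each `with:` argument in the YAML stage is passed as a keyword argument to this
function, which is why one template can serve many use-cases.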

The workflow template above is the main executor pipeline. If you want to
schedule this workflow, you will want its parameters to change dynamically based
on execution time; for example, `run-date` should change based on the workflow's
running date.

```python
from ddeutil.workflow import Workflow, Result

workflow: Workflow = Workflow.from_conf('run-py-local')
result: Result = workflow.execute(
   params={"source-extract": "USD-THB", "asat-dt": "2024-01-01"}
)
```

So, this package provides the `Schedule` template for this purpose, and you can
dynamically pass parameters that change in line with the running time via the
`release` prefix.

```yaml
schedule-run-local-wf:

   # The model type used to validate and parse this template file.
   type: Schedule
   workflows:

      # Map an existing workflow that you want to deploy with the scheduler
      # application. It allows you to pass release parameters that change
      # dynamically depending on the current context of the scheduler
      # application at release time.
      - name: run-py-local
        params:
          source-extract: "USD-THB"
          asat-dt: "${{ release.logical_date }}"
```

The main method of the `Schedule` model used for running is `pending`. If you
do not pass a `stop` date to this method, it will use the config value under the
`WORKFLOW_APP_STOP_BOUNDARY_DELTA` key to generate the stop date.

```python
from ddeutil.workflow import Schedule

(
   Schedule
   .from_conf("schedule-run-local-wf")
   .pending(stop=None)
)
```

## :cookie: Configuration

The main configuration for dynamically adjusting this workflow engine to your
objectives uses environment variables only. If any configuration value is not
set, a default value is used and no error is raised.

> [!IMPORTANT]
> The config value that you set in the environment should combine the
> prefix, component, and name as `WORKFLOW_{component}_{name}` (upper case).

| Name                         | Component | Default                                                                                                                         | Description                                                                                                   |
|:-----------------------------|:---------:|:--------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------|
| **ROOT_PATH**                |   Core    | `.`                                                                                                                             | The root path, or project path, for this workflow engine.                                                     |
| **REGISTRY_CALLER**          |   Core    | `.`                                                                                                                             | A list of importable strings for the call stage.                                                              |
| **REGISTRY_FILTER**          |   Core    | `ddeutil.workflow.templates`                                                                                                    | A list of importable strings for the filter template.                                                         |
| **CONF_PATH**                |   Core    | `conf`                                                                                                                          | The config path that keeps all template `.yaml` files.                                                        |
| **TIMEZONE**                 |   Core    | `Asia/Bangkok`                                                                                                                  | A timezone string value that is passed to the `ZoneInfo` object.                                              |
| **STAGE_DEFAULT_ID**         |   Core    | `true`                                                                                                                          | A flag that enables a default stage ID, used to catch execution output.                                       |
| **STAGE_RAISE_ERROR**        |   Core    | `false`                                                                                                                         | A flag that makes all stages raise `StageException` from stage execution.                                     |
| **JOB_DEFAULT_ID**           |   Core    | `false`                                                                                                                         | A flag that enables a default job ID, used to catch execution output. The ID used will be a sequence number.  |
| **JOB_RAISE_ERROR**          |   Core    | `true`                                                                                                                          | A flag that makes all jobs raise `JobException` from job strategy execution.                                  |
| **MAX_CRON_PER_WORKFLOW**    |   Core    | `5`                                                                                                                             |                                                                                                               |
| **MAX_QUEUE_COMPLETE_HIST**  |   Core    | `16`                                                                                                                            |                                                                                                               |
| **GENERATE_ID_SIMPLE_MODE**  |   Core    | `true`                                                                                                                          | A flag that enables generating IDs with the `md5` algorithm.                                                  |
| **DEBUG_MODE**               |    Log    | `true`                                                                                                                          | A flag that enables logging in debug-level mode.                                                              |
| **FORMAT**                   |    Log    | `%(asctime)s.%(msecs)03d (%(name)-10s, %(process)-5d,%(thread)-5d) [%(levelname)-7s] %(message)-120s (%(filename)s:%(lineno)s)` |                                                                                                               |
| **FORMAT_FILE**              |    Log    | `{datetime} ({process:5d}, {thread:5d}) {message:120s} ({filename}:{lineno})`                                                   |                                                                                                               |
| **DATETIME_FORMAT**          |    Log    | `%Y-%m-%d %H:%M:%S`                                                                                                             |                                                                                                               |
| **TRACE_PATH**               |    Log    | `./logs`                                                                                                                        | The path where the workflow saves trace logs.                                                                 |
| **TRACE_ENABLE_WRITE**       |    Log    | `false`                                                                                                                         |                                                                                                               |
| **AUDIT_PATH**               |    Log    | `./audits`                                                                                                                      |                                                                                                               |
| **AUDIT_ENABLE_WRITE**       |    Log    | `true`                                                                                                                          | A flag that enables the audit object to save logs to its destination.                                         |
| **MAX_PROCESS**              |    App    | `2`                                                                                                                             | The maximum number of process workers that run in the scheduler app module.                                   |
| **MAX_SCHEDULE_PER_PROCESS** |    App    | `100`                                                                                                                           | The number of schedules per process that run in parallel.                                                     |
| **STOP_BOUNDARY_DELTA**      |    App    | `'{"minutes": 5, "seconds": 20}'`                                                                                               | A time delta value, in JSON string format, used to stop the scheduler app.                                    |
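
For example, overriding a few of these values before starting the engine (the
values shown are illustrative):

```shell
export WORKFLOW_CORE_TIMEZONE="Etc/UTC"
export WORKFLOW_LOG_DEBUG_MODE=false
export WORKFLOW_APP_MAX_PROCESS=4
```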

**API Application**:

This config section applies only to the workflow application built with
FastAPI.

| Environment                |  Component  | Default | Description                                                                        |
|:---------------------------|:-----------:|---------|:-----------------------------------------------------------------------------------|
| **ENABLE_ROUTE_WORKFLOW**  |     API     | `true`  | A flag that enables the workflow route for manual execution and workflow logging. |
| **ENABLE_ROUTE_SCHEDULE**  |     API     | `true`  | A flag that enables running the scheduler.                                         |
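
These follow the same `WORKFLOW_{component}_{name}` convention, for example:

```shell
export WORKFLOW_API_ENABLE_ROUTE_SCHEDULE=false
```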

## :rocket: Deployment

This package can run as an application service that receives manual triggers
from a master node via REST API, or as a scheduler background service, like a
crontab job but driven via the Python API.

### API Server

```shell
(venv) $ uvicorn ddeutil.workflow.api:app \
  --host 127.0.0.1 \
  --port 80 \
  --no-access-log
```

> [!NOTE]
> If this package is already deployed, it can run with multiple workers:
> `uvicorn ddeutil.workflow.api:app --host 127.0.0.1 --port 80 --workers 4`

### Docker Container

```shell
$ docker build -t ddeutil-workflow:latest -f .container/Dockerfile .
$ docker run -i ddeutil-workflow:latest ddeutil-workflow
```

## :speech_balloon: Contribute

I do not expect this project to spread around the world, because it has a specific
purpose, and you can write your own code without this project as a dependency for
a long-term solution. For now, you can open [a GitHub issue on this project :raised_hands:](https://github.com/ddeutils/ddeutil-workflow/issues)
to report a bug or request a new feature if you want.

            
