# Django-Cog

**Version:** 1.3.10 | **Author:** David W Pettifor | **License:** MIT | **Homepage:** https://github.com/david-pettifor-nd/django_cog.git

A [django-celery-beat](https://github.com/celery/django-celery-beat) extension to build pipelines of chronological stages and parallel tasks.

## About
Using the Django admin, this library allows you to create pipelines of multi-staged tasks.  Each pipeline is launched at a specific time, using the `django-celery-beat` `CrontabSchedule` object to define the launch time.  Once launched, the pipeline looks for the first stage(s) of parallel tasks.  Each task is submitted to a celery worker for completion.  Once all tasks of a stage complete, the stage is considered complete, and any subsequent stage whose required stages have all completed will launch.

### Pipelines
A pipeline is a collection of stages.  It has an optional launch schedule, tied to a `CrontabSchedule`, that defines when the pipeline should be launched.  By default, a pipeline will not launch if it detects that a previous launch has not yet completed.

You can also leave the launch schedule blank should you not want a particular pipeline to run on its own.  Instead, clicking the `Launch Now` button on that pipeline's Django admin page will begin its execution.

_Note:_ If a pipeline was previously scheduled using a `CrontabSchedule` object but needs to be taken off the schedule, simply set the `Schedule` field of the pipeline to the first option of the drop-down menu (`---------`).  This will remove the generated `PeriodicTask` within the database scheduler, preventing further auto-execution of that pipeline.
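
For reference, here is a minimal sketch of scheduling a pipeline from the Django shell instead of the admin.  Only `CrontabSchedule` is a confirmed `django-celery-beat` model; the `Pipeline` import path, field name, and pipeline name below are assumptions based on the admin fields described above, so verify them against your installed version:
```python
from django_celery_beat.models import CrontabSchedule
from django_cog.models import Pipeline  # import path is an assumption

# Run every day at 02:00 (CrontabSchedule is the django-celery-beat model).
schedule, _ = CrontabSchedule.objects.get_or_create(
    minute='0', hour='2', day_of_week='*', day_of_month='*', month_of_year='*'
)

# 'schedule' mirrors the admin's `Schedule` field; the name is hypothetical.
pipeline = Pipeline.objects.get(name='nightly-etl')
pipeline.schedule = schedule
pipeline.save()
```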

### Stages
A stage is a collection of tasks that can all be run independently of (and in parallel with) each other.  A stage can depend on any number of previous stages, and launches once all of its prerequisite stages have completed.

If a stage has no prerequisite stages, it launches as soon as the pipeline starts.  Multiple stages can run at the same time.

Upon launching a stage, each task that belongs to it will be sent to the `celery` broker for execution.

### Tasks and Cogs

#### Cogs:
A `Cog` is a registered Python function that can be used in a task.  To register a function, use the `@cog` function decorator:
```python
from django_cog import cog

@cog
def my_task():
    # do something in a celery worker
    pass
```

**NOTE**: These functions *must* be imported in your application's `__init__.py` file.  Otherwise the auto-discovery process will not find them.
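
For example, if your cogs live in a separate module of your app (the module name below is hypothetical), import that module in the app's `__init__.py` so the decorators run at startup:
```python
# myapp/__init__.py
# Importing the module executes the @cog decorators when Django starts.
from . import cogs  # noqa: F401  -- 'cogs' is a hypothetical module name
```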

Once Django starts, an auto-discovery process finds all functions with this decorator and creates `Cog` records for them in the database.  This allows them to be referenced in the Django admin.

#### Tasks:
Once you have cogs registered, you can create a task.  Tasks are specific execution definitions for a cog, and are tied to a stage.  This allows you to run the same function through multiple stages, if needed.

##### Parameters
If your function requires parameters, you can set them when creating the Task.  For example:

```python
from django_cog import cog

@cog
def add(a, b):
    # add the two numbers together
    return a + b
```

Then, in the Task Django admin page, set the arguments in the `Arguments as JSON:` field; the keys should match the function's parameter names:
```json
{
    "a": 1,
    "b": 2
}
```

## Installation

**IMPORTANT**: The library must be installed and its migrations run PRIOR to registering functions as `cogs`.
First install the library with:

```bash
pip install django-cog
```

Then add it to your Django application's `INSTALLED_APPS` inside your `settings.py` file:
```python
INSTALLED_APPS = [
    ...

    # Django-Cog:
    'django_cog.apps.DjangoCogConfig',

    # Required for nested inline child views in the Django Admin:
    'nested_inline',

    # Optional (recommended):
    'django_celery_beat',
]
```

Lastly, run migrations:
```bash
python manage.py migrate django_cog
```

#### Cog Registration
Once migrations complete, it is safe to register your functions using the `cog` decorator:
```python
from django_cog import cog

@cog
def my_task():
    # do something in a celery worker
    pass
```

##### Manual Registration
If the library doesn't pick up your registered cogs and doesn't create `Cog` records in the admin automatically, you can call the `Cog` record creation function manually in the Django shell:
```python
from django_cog.apps import create_cog_records
create_cog_records()
```

_Note_: This does require the functions to be registered cogs.  You can list the functions `django_cog` has discovered with:
```python
from django_cog import cog
print(cog.all)
```
This will output a dictionary whose keys are the discovered function names and whose values are the functions themselves.
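
Building on that, a quick sanity check for a specific function (the function name below is hypothetical):
```python
from django_cog import cog

# cog.all maps discovered function names to the functions themselves.
if 'my_task' in cog.all:
    print('my_task is registered:', cog.all['my_task'])
else:
    print('my_task was not discovered -- check your __init__.py imports')
```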

__If your function is not listed here:__ the registration was not called.  This is most likely because the function is not imported in your application's `__init__.py`.  Double-check that you are importing the function somewhere that runs on Django's startup (`__init__.py` is the recommended place).

## Docker-Compose

Below is a sample docker-compose.yml segment to add the required services for Celery workers, Celery-Beat, and Redis:

```yml
version: '3'

# 3 named volumes are defined here.
volumes:
    volume_postgresdata:        # Store the postgres database data. Only linked with postgres.
    volume_django_media:        # Store the Django media files. Volume is shared between djangoweb and nginx.
    volume_django_static:       # Store the Django static files. Volume is shared between djangoweb and nginx.

services:
    # Postgresql database settings.
    postgresdb:
        image: postgres:9.6-alpine
        environment:
            - POSTGRES_DB
            - POSTGRES_USER
            - POSTGRES_PASSWORD
        restart: unless-stopped
        volumes:
            - volume_postgresdata:/var/lib/postgresql/data
        ports:
            - "127.0.0.1:5432:5432"
        networks:
            - backend

    # Django settings.
    djangoweb:
        build:
            context: .
            args:
                - DJANGO_ENVIRONMENT=${DJANGO_ENVIRONMENT:-production}
        image: djangoapp:latest
        networks:
            - backend
            - celery
            - frontend
        volumes:
            - .:/app/
            - volume_django_static:/var/staticfiles
            - volume_django_media:/var/mediafiles
        ports: # IMPORTANT: Make sure to use 127.0.0.1 to keep it local. Otherwise, this will be broadcast to the web.
            - 127.0.0.1:8000:8000
        depends_on:
            - postgresdb
        environment:
            - POSTGRES_DB
            - POSTGRES_USER
            - POSTGRES_PASSWORD
            - POSTGRES_HOST=postgresdb # Name of the postgresql service.
            - POSTGRES_PORT
            - DJANGO_SETTINGS_MODULE
            - FORCE_SCRIPT_NAME
            - DJANGO_ENVIRONMENT
            - SECRET_KEY
            - DJANGO_ALLOWED_HOSTS
        links:
            - "postgresdb"

    # add redis as a message broker
    redis:
        image: "redis:alpine"
        networks:
            - celery

    # celery worker process -- launches child celery processes equal to the number of available cores
    celery:
        build:
            context: .
            dockerfile: Dockerfile.celery
            args:
                - DJANGO_ENVIRONMENT=${DJANGO_ENVIRONMENT:-production}
        command: celery -A django_cog worker -l info
        image: djangoapp_celery:latest
        volumes:
            - .:/app/
        environment:
            - POSTGRES_DB
            - POSTGRES_USER
            - POSTGRES_PASSWORD
            - POSTGRES_HOST=postgresdb # Name of the postgresql service.
            - POSTGRES_PORT
            - FORCE_SCRIPT_NAME
            - DJANGO_ENVIRONMENT
            - DJANGO_SETTINGS_MODULE
            - SECRET_KEY
            - DJANGO_ALLOWED_HOSTS
        depends_on:
            - postgresdb
            - redis
        networks:
            - backend
            - celery
        links:
            - "postgresdb"

    # celery beat scheduler process -- reads the database schedule and dispatches pipeline launches
    celerybeat:
        build:
            context: .
            dockerfile: Dockerfile.celery
            args:
                - DJANGO_ENVIRONMENT=${DJANGO_ENVIRONMENT:-production}
        command: celery -A django_cog beat -l info --scheduler django_celery_beat.schedulers:DatabaseScheduler
        image: djangoapp_celerybeat:latest
        volumes:
            - .:/app/
        environment:
            - POSTGRES_DB
            - POSTGRES_USER
            - POSTGRES_PASSWORD
            - POSTGRES_HOST=postgresdb # Name of the postgresql service.
            - POSTGRES_PORT
            - FORCE_SCRIPT_NAME
            - DJANGO_ENVIRONMENT
            - DJANGO_SETTINGS_MODULE
            - SECRET_KEY
            - DJANGO_ALLOWED_HOSTS
        depends_on:
            - celery
        networks:
            - backend
            - celery

networks:
    frontend:
        name: djangocog_frontend
    backend:
        name: djangocog_backend
    celery:
        name: djangocog_celery
```

And the matching `Dockerfile.celery` (which the `celery` and `celerybeat` services build from):
```dockerfile
FROM python:3.8
ENV PYTHONUNBUFFERED 1

ARG DJANGO_ENVIRONMENT

# Make the static/media folder.
RUN mkdir /var/staticfiles
RUN mkdir /var/mediafiles

# Make a location for all of our stuff to go into
RUN mkdir /app

# Set the working directory to this new location
WORKDIR /app

# Add our Django code
ADD . /app/

RUN pip install --upgrade pip

# Install requirements for Django
RUN pip install -r requirements/base.txt
RUN pip install -r requirements/${DJANGO_ENVIRONMENT}.txt
RUN pip install -r requirements/custom.txt

# No need for an entry point as they are defined in the docker-compose.yml services
```
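
The compose file above assumes your Django settings point Celery at the `redis` service.  As a minimal sketch using the standard Celery setting names (adjust to however your project actually configures its Celery app):
```python
# settings.py -- standard Celery settings, not django-cog-specific ones.
# 'redis' is the compose service name, resolvable on the shared 'celery' network.
CELERY_BROKER_URL = 'redis://redis:6379/0'
CELERY_RESULT_BACKEND = 'redis://redis:6379/0'
```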

## Queues

You can create specific queues within celery and assign tasks to them.  Within the `Task` Django admin, you'll find a `queue` field that defaults to `celery` -- the default queue that celery works from.  This queue record is created automatically when you run the django-cog migrations.

#### Adding Queues

If you want to have a separate collection of workers dedicated for certain tasks, create a new `CeleryQueue` record in the Django admin and set your tasks to this queue.
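
As a sketch, the same record could be created from the Django shell; the import path and field name here are assumptions, so check the `CeleryQueue` model in your version:
```python
from django_cog.models import CeleryQueue  # import path is an assumption

# 'queue_name' is a hypothetical field name -- verify against the model.
CeleryQueue.objects.create(queue_name='heavy_tasks')
```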

Additionally, you'll need workers consuming from this queue.  You can do this by adding a `-Q queue_name` parameter to the `command` in the `docker-compose.yml` file's `celery` service:

```yml
services:
    celery:
        command: celery -A django_cog worker -l info -Q queue_name
```


## Task Optimization and Weights

To achieve better runtimes, tasks are queued in order of greatest "weight" first.  This works best when a stage has some `n` long-running tasks plus additional tasks whose combined runtime does not exceed that of the long ones.  The long tasks are launched first, so each stage's runtime tracks its single longest-running task rather than some arbitrary multiple of shorter tasks that happened to be queued ahead of it.

The weight of a task is updated at the end of each of its pipeline runs.  The `weight` field of a `Task` is simply the average number of seconds it took to complete.  (NOTE: this field is a decimal with 11 total digits, 5 of them after the decimal point, which means the maximum runtime supported for a single task is roughly 11.5 days.)

The sample size taken defaults to the past 10 runtimes of that task, but this can be adjusted by adding the following to your Django `settings.py`:
```python
DJANGO_COG_TASK_WEIGHT_SAMPLE_SIZE = 10
```

If you do not want to have the weight auto-calculated, you can disable this calculation by adding the following to your Django `settings.py`:
```python
DJANGO_COG_AUTO_WEIGHT = False
```

Note: Tasks will still be queued in descending weight order, but they all default to a weight of `1` prior to any runs.  You can also adjust this weight manually through the `Task` Django admin to control the queuing order yourself (which requires disabling the auto-weight calculation described above, so your value is not overwritten).
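
As a sketch of the calculation described above (not the library's actual code), the weight works out to the average duration, in seconds, of the task's most recent runs:
```python
def average_weight(run_durations_seconds, sample_size=10):
    """Average of the last `sample_size` runtimes, in seconds.
    Tasks with no runs yet default to a weight of 1."""
    recent = run_durations_seconds[-sample_size:]
    return sum(recent) / len(recent) if recent else 1
```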

## Error Handling

Each task can be marked `Critical` (default `True`).  This causes the pipeline to halt should any critical task fail.
_NOTE_: Currently running tasks will be allowed to complete, but no new tasks of the same stage will launch, and no further stages of that pipeline will begin.

This leaves the pipeline in a `Failed` state, so if the pipeline is marked "Prevent overlapping runs", it cannot be launched again until the incomplete pipeline run object has been deleted.  (See "Failed Pipelines" below.)  Once the last task of a failed pipeline completes, it updates the pipeline's `Completed On` timestamp to reflect when that last task finished.

#### Error Handlers

Each task call is wrapped in a generic `try/except` block.  Within the `except` block, an error handler function is called.  By default this is the `defaultCogErrorHandler`, which records the error in the `CogErrors` table, visible in the Django admin.  It stores the error type, the task run that failed, the date/time of the error, and the traceback.

##### Custom Error Handlers

Each task can be assigned an error handler (if none are assigned, the `defaultCogErrorHandler` is used).  You can write your own error handler function and register it like so:

```python
from django_cog import cog_error_handler

@cog_error_handler
def myErrorHandler(error):
    """
    Do something with the error (Exception type)
    """
    print(error)
```

Any custom error handler must accept _at least_ one positional parameter: the `Exception` instance.

__OPTIONAL:__ You can also include a keyword argument `task_run` to have the failed task run object passed in:

```python
from django_cog import cog_error_handler

@cog_error_handler
def myErrorHandler(error, task_run):
    """
    Do something with the error (Exception type)
    and know where it came from.
    """
    print("The following error came from the task:", task_run.task.name)
    print(error)
```

##### Failed Pipelines

Should a task marked `critical` fail, the task, its stage, and pipeline will fall into a `Failed` state.  By default, new pipeline runs cannot be called if the last run failed.  This is done intentionally so the same error does not keep repeating itself.

However, you _can_ override this safety feature by setting `DJANGO_COG_OVERLAP_FAILED = True` in your Django settings.  Doing this will allow a pipeline to launch even if the last run of that pipeline failed.
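
For example, in your `settings.py`:
```python
DJANGO_COG_OVERLAP_FAILED = True
```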


## Canceling Pipeline Runs

When a pipeline is running, you can visit the `Pipeline Runs` Django admin page to see which runs are in progress (displayed at the top of the list).  Clicking into the details of a running pipeline shows a red `Cancel Run` button at the top.  Clicking it sends the pipeline into a `Canceled` state.  Any currently running tasks are allowed to complete.  As soon as the last running task completes, it updates the stage's `Completed On` field to reflect the true end-of-execution timestamp.  The stage then also falls into a `Canceled` state, and no further stages or tasks will be launched.
