# Peasy Jobs
[![PyPI - Version](https://img.shields.io/pypi/v/peasy-jobs.svg)](https://pypi.org/project/peasy-jobs)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/peasy-jobs.svg)](https://pypi.org/project/peasy-jobs)
![tests status badge](https://github.com/d-flood/peasy-jobs/actions/workflows/tests.yml/badge.svg?branch=main)
---
An _incredibly_ simple database queue and background worker for Django. It is an "easy peasy" option for those who do not have a very busy task queue.
**Table of Contents**
- [Quickstart](#quickstart)
- [Django Optional Settings](#django-optional-settings)
- [Tracking a Job's Status](#tracking-a-jobs-status)
- [Cancelling Jobs and Gracefully Shutting Down the Job Runner](#cancelling-jobs-and-gracefully-shutting-down-the-job-runner)
- [Arguments and Result Data Gotcha](#arguments-and-result-data-gotcha)
- [License](#license)
## Quickstart
Like other task queue runners (e.g., [Celery](https://docs.celeryq.dev/en/stable/index.html), [Huey](https://huey.readthedocs.io/en/latest/), etc.), Peasy runs in a separate process from your main Django application and processes jobs from a queue. It runs the same code as the application but with a different startup command.
1. Install the package, then add `peasy_jobs` to `INSTALLED_APPS`:
```console
> pip install peasy-jobs
```
```Python
# settings.py
INSTALLED_APPS = [
...
"peasy_jobs",
...
]
```
2. Run migrations: `python manage.py migrate`. This adds the job queue table (which also serves as the result backend) to your database.
3. Decorate functions that should be run in the background and outside of the request-response cycle:
```Python
# jobs.py
from peasy_jobs.peasy_jobs import peasy
@peasy.job("export data to s3")
def generate_large_data_export():
    data = gather_data()  # placeholder for your own data-gathering logic
    upload_data_to_s3(data)  # placeholder for your own upload logic
```
Calling this function from your Django application (likely while handling a request) adds the function to the queue, along with its positional and keyword arguments. It remains in the queue until it is processed by the job runner; see the sketch after this list for an example of enqueueing from a view.
4. Start your Django application, e.g. `./manage.py runserver`, `gunicorn myproject.wsgi:application`, etc.
5. Start the Peasy job runner: `python manage.py run_peasy`. Peasy will read jobs from the database queue table, execute them, and store both the status and the return value on the same object.
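For example, here is a minimal sketch of enqueueing the job from a view. The view itself and its response payload are illustrative, not part of Peasy:
```Python
# views.py
from django.http import JsonResponse

from .jobs import generate_large_data_export


def start_export(request):
    # Calling the decorated function does not run it inline; it enqueues
    # the job (with any positional/keyword arguments) for the job runner.
    generate_large_data_export()
    return JsonResponse({"detail": "export enqueued"})
```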
## Django Optional Settings
These settings are optional; the values below are the defaults used when a setting is not defined.
```Python
# settings.py
PEASY_MAX_COMPLETED = 10 # max number of completed job objects to keep in the db
PEASY_MAX_FAILED = 10 # max number of failed job objects to keep in the db
PEASY_MAX_CANCELLED = 10 # max number of cancelled job objects to keep in the db
PEASY_POLLING_INTERVAL = 2 # seconds to wait between checks for enqueued jobs in the db
PEASY_CONCURRENCY = 1 # max number of worker threads or processes
PEASY_WORKER_TYPE = "process" # "thread" or "process"
PEASY_SHUTDOWN_TIMEOUT = 30 # seconds to wait for running jobs to complete after a SIGINT before terminating them
```
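For example, an I/O-bound deployment might favor threads and more frequent polling. These values are illustrative only, not recommendations:
```Python
# settings.py
PEASY_WORKER_TYPE = "thread"  # threads suit I/O-bound jobs (network, disk)
PEASY_CONCURRENCY = 4  # run up to four jobs at once
PEASY_POLLING_INTERVAL = 1  # check the queue every second
```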
## Tracking a Job's Status
When a job is called by the main Django application, a `PeasyJobQueue` object is created and added to the database. You can query this object both to track whether a background job was successful _and_ to manually add status updates to it (progress percentage, for example).
Peasy updates a job's status at the edges of its work, i.e., when the job begins and when it ends, including whether it succeeded, failed, or was cancelled. In between, you can use `PeasyJobQueue.status_msg` and `PeasyJobQueue.extra` to store a status string and an arbitrary dictionary (respectively), updating them throughout a background task.
Peasy will conveniently inject the `PeasyJobQueue` `pk` as an argument to your job function _if_ you add `job_pk` as a parameter (do not supply a value for it yourself from the calling code):
```Python
# jobs.py
from peasy_jobs.peasy_jobs import peasy
@peasy.job("export data to s3")
def generate_large_data_export(job_pk: int): # add job_pk here and it will automatically be injected.
data = gather_data()
peasy.update_status( # a convenience method for updating a peasy job status
job_pk,
        status_msg="Successfully gathered data. Now uploading data to s3.",
extra={"progress": 50}) # use the `extra` field to optionally store an arbitrary dictionary.
upload_data_to_s3(data)
```
If your job returns a value, it will be stored in `PeasyJobQueue.result`.
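You can also check on a job from application code by querying the model directly. A sketch follows; the `status_msg`, `extra`, and `result` fields are described above, but how you obtain `job_pk` (and the exact set of status constants) depends on your code and the model definition:
```Python
from peasy_jobs.models import PeasyJobQueue

job = PeasyJobQueue.objects.get(pk=job_pk)  # assumes you recorded the pk at enqueue time
print(job.status)  # lifecycle status set by the runner
print(job.status_msg)  # latest human-readable message, e.g. from peasy.update_status
print(job.extra)  # arbitrary dict, e.g. {"progress": 50}
print(job.result)  # the job's return value, once completed
```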
You can also view job statuses in the admin with no additional configuration:
![PeasyJobQueue job status Django admin list view showing a tabular layout with job completions and failures clearly indicated.](./.github/images/admin_listview.png)
## Cancelling Jobs and Gracefully Shutting Down the Job Runner
### How to Cancel an Enqueued Job
To cancel an enqueued job before the job runner has picked it up, set its status to cancelled and it will not be processed. **Note:** Peasy Jobs does not provide a way to target and cancel a specific _running_ job.
```Python
from peasy_jobs.models import PeasyJobQueue

PeasyJobQueue.objects.filter(pk=job_pk).update(status=PeasyJobQueue.CANCELLED)
```
### Gracefully Shutting Down
To gracefully shut down the Peasy job runner, send its main process a `SIGINT` (e.g. CTRL+C). It will not shut down immediately; instead, it stops scheduling enqueued jobs for execution and waits for ongoing jobs to complete. Once no jobs are running, it exits promptly. The grace period during which ongoing jobs are given a chance to complete is controlled by the `PEASY_SHUTDOWN_TIMEOUT` [setting](#django-optional-settings); if not defined, it defaults to 30 seconds.
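If the runner is not attached to your terminal, you can send the signal by process id instead; a sketch using standard Unix tools:
```console
> kill -INT $(pgrep -f run_peasy)
```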
The graceful shutdown is intended to make it easier to update your application without killing in-progress jobs. This pairs well with Docker and Docker Compose, which send a stop signal to a container and allow a grace period before killing it. Note that Docker's default stop signal is `SIGTERM`, not `SIGINT`, so you may need to configure the job runner's container accordingly (e.g. with `stop_signal` in a Compose file).
## Arguments and Result Data Gotcha
There is one restriction on positional arguments, keyword arguments, and return values: they must all be picklable. Arguments and return values are stored in the database, which cannot hold arbitrary Python objects except by way of pickling. It is recommended that you keep arguments and return values JSON-serializable, since security concerns could push Peasy Jobs toward JSON, JSONB, or other safer serialization formats in the future.
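One pattern that keeps arguments small, picklable, and JSON-friendly is to pass primitive identifiers and re-fetch rich objects inside the job. A sketch; the model lookup and `send_report` helper are illustrative:
```Python
# jobs.py
from django.contrib.auth import get_user_model

from peasy_jobs.peasy_jobs import peasy


@peasy.job("email report to user")
def email_report(user_pk: int):  # pass a pk, not a User instance
    # Model instances, open files, and connections may not pickle
    # (or may be stale by the time the job runs), so re-fetch here.
    user = get_user_model().objects.get(pk=user_pk)
    send_report(user.email)  # illustrative helper, not part of Peasy
```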
## License
`peasy-jobs` is distributed under the terms of the [MIT](https://spdx.org/licenses/MIT.html) license.