django-task-worker

- Name: django-task-worker
- Version: 0.0.6
- Summary: Task worker for Django using Redis and a database queue
- Author: Yoongi Kim <yoongi@yoongi.kim>
- License: Apache License 2.0
- Requires Python: >=3.6
- Keywords: django, task, async, worker, queue, redis
- Homepage: https://github.com/YoongiKim/django-task-worker
- Uploaded: 2025-02-10 10:39:54
- Requirements: django, redis, shortuuid, stopit, gunicorn, python-dotenv, psycopg2-binary, setuptools

# django-task-worker

A Django-based task worker that uses the database as a persistent queue and Redis for Pub/Sub messaging. This project is designed to solve common issues with traditional task queues like Celery by offering a lightweight, reliable, and cost-effective solution.

---

## **Motivation**

Traditional task queues like [Celery](https://docs.celeryproject.org/) rely on external message brokers (e.g., Redis, RabbitMQ) to persist task queues and results. While this approach is powerful, it comes with significant challenges:

1. **Single Point of Failure**: The message broker (e.g., Redis) becomes a critical dependency. Restarting it can lead to task loss if not properly configured.
2. **Cluster Complexity**: Setting up a high-availability cluster for Redis or RabbitMQ is complex and resource-intensive.
3. **Cost**: Cloud-hosted Redis instances are expensive, especially for small-scale projects that only need basic task queuing.

### **Why django-task-worker?**

This project aims to address these issues by:

- **Persisting the task queue in Django's database**: Tasks are stored reliably in the database, ensuring no data is lost even if Redis is restarted or stopped.
- **Using Redis only for Pub/Sub**: Redis is used exclusively for real-time notifications of task creation and completion. It can be safely flushed or restarted without affecting task data.
- **Simplifying deployment**: By eliminating the need for complex message broker setups, this worker integrates seamlessly with Django projects.

---

## **Features**

- **Database-Backed Queue**: Tasks are stored persistently in a Django model (`DatabaseTask`), ensuring no data loss even if Redis is restarted or flushed. This eliminates the need for Redis persistence.
- **Redis Pub/Sub for Real-Time Notifications**: Redis is used exclusively for lightweight Pub/Sub messaging, sending notifications for task creation and completion. The task queue itself is stored and managed in the database.
- **Task Status Management**: The system uses four statuses to track task progress:
  - **`PENDING`**: Task is waiting to be processed.
  - **`PROGRESS`**: Task is currently being processed by a worker.
  - **`COMPLETED`**: Task has been successfully processed.
  - **`FAILED`**: Task has failed due to an error or timeout.
- **Timeout Handling**: Tasks can have a configurable `timeout` (default: 300 seconds). If a task exceeds its timeout, it is forcefully terminated and marked as `FAILED`, preventing it from hanging indefinitely.
- **Retry Logic**: Failed tasks are retried automatically up to a configurable maximum retry count (`MAX_RETRIES`). Once retries are exhausted, the task is permanently marked as `FAILED`.
- **Stale Task Detection**: If a worker crashes while processing a task (`PROGRESS`), the system detects the stale task and either re-queues it for retry or marks it as `FAILED`, depending on its retry count. This ensures no task is left incomplete.
- **Race Condition Prevention for Clusters**: Multiple workers can run in parallel in a clustered setup, with safeguards to prevent race conditions (see the sketch after this list):
  - Redis-based locks ensure only one worker processes a task at a time.
  - Database `select_for_update()` locks prevent concurrent updates to task rows.
- **Graceful Shutdown**: Workers listen for termination signals (e.g., `SIGINT`, `SIGTERM`) and shut down gracefully. In-flight tasks are allowed to finish before the worker stops, so no task is interrupted mid-processing.
- **Execution Order**: After a worker restart, all **`PENDING`** tasks are processed first, followed by retryable **`FAILED`** tasks. This ensures new tasks receive immediate attention while failed tasks are retried in order.
- **Task Execution Insights**: Each task includes the following timestamps for transparency and debugging:
  - **`created_at`**: When the task was created.
  - **`started_at`**: When the task started processing.
  - **`finished_at`**: When the task finished processing.
  - **`duration`**: Total time (in seconds) spent processing the task.
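
The locking scheme above can be pictured with a short sketch. This is a minimal illustration of the database side only, assuming nothing beyond Django's standard ORM and the documented `DatabaseTask` fields; it is not the package's actual internals:

```python
# Hedged sketch (illustration only, not django_task_worker's real code):
# atomically claim the oldest PENDING task so two workers never pick up
# the same row. skip_locked=True makes peers skip rows that are already
# locked instead of blocking on them.
from django.db import transaction

from django_task_worker.models import DatabaseTask


def claim_next_task():
    with transaction.atomic():
        task = (
            DatabaseTask.objects
            .select_for_update(skip_locked=True)
            .filter(status="PENDING")
            .order_by("created_at")
            .first()
        )
        if task is None:
            return None  # queue is empty (or every row is claimed)
        task.status = "PROGRESS"
        task.save(update_fields=["status", "updated_at"])
    return task
```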

---

## **Installation**

1. Install the package:

    ```bash
    pip install django-task-worker
    ```

2. Add `django_task_worker` to your `INSTALLED_APPS` in `settings.py`:

    ```python
    INSTALLED_APPS = [
       ...,
       "django_task_worker",
    ]
    ```

3. Configure Redis in your `settings.py` (a quick connectivity check is sketched after step 5):

    ```python
    import os
    ...
    # Worker settings
    REDIS_URL = os.environ.get('REDIS_URL', 'redis://localhost:6379/0')
    REDIS_PASSWORD = os.environ.get('REDIS_PASSWORD', None)
    ```

4. Run migrations to create the `DatabaseTask` table:

    ```bash
    python manage.py makemigrations django_task_worker
    python manage.py migrate
    ```

5. Start the worker process using the management command:

    ```bash
    python manage.py run_worker --retry 1 --concurrency 2
    ```
    - `--retry`: Maximum number of retries for failed tasks (default: 0).
    - `--concurrency`: Number of threads to process tasks concurrently (default: 1).
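
Before starting the worker, you can optionally verify that the Redis settings from step 3 are reachable. A minimal check (not part of the package) to run from `python manage.py shell`, so Django's settings are loaded:

```python
# Optional sanity check: ping Redis using the settings from step 3.
# redis.Redis.from_url raises redis.exceptions.ConnectionError if the
# server is unreachable.
import redis
from django.conf import settings

r = redis.Redis.from_url(settings.REDIS_URL, password=settings.REDIS_PASSWORD)
r.ping()  # returns True on success
```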

---
## **Usage**

### **How Task Functions are Executed**

The worker dynamically imports and executes the function named in the task's `name` field. The `name` must be a dotted import path of the form:

```
module_name.function_name
```

The worker assumes all modules are importable from the Django project's root directory.
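
Conceptually, the resolution step looks like the sketch below. This is a minimal illustration of dynamic import, not necessarily the package's exact implementation:

```python
# Hedged sketch: split the dotted name into a module path and a function
# name, import the module, and fetch the callable from it.
import importlib


def resolve_task(name: str):
    module_path, func_name = name.rsplit(".", 1)
    module = importlib.import_module(module_path)
    return getattr(module, func_name)


# resolve_task("your_app.your_tasks.add_numbers")(10, 20)  # -> 30
```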

---

### **Directory Structure Example**

Your Django project should be organized as follows:

```
your_project/
├── config/
│   ├── settings.py          # Django settings file
│   └── wsgi.py
├── manage.py                # Django management script
├── your_app/
│   ├── your_tasks.py        # Define task functions here
│   └── models.py
└── django_task_worker/      # Installed via pip (lives in site-packages)
    ├── models.py            # Includes DatabaseTask
    ├── client.py            # Provides create_task and wait_for_completion
    └── worker.py            # Worker logic
```

Define task functions in a module like `your_app/your_tasks.py`.

#### **Example Task Definition**
```python
# your_app/your_tasks.py

def add_numbers(a, b):
    return a + b
```
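
Task functions are plain callables. Because a task's `args` and `kwargs` are stored as JSON on the task row (see the model fields below), arguments should be JSON-serializable. A second, hypothetical task using a keyword argument:

```python
# your_app/your_tasks.py (hypothetical additional task)

def send_greeting(name, greeting="Hello"):
    # args/kwargs arrive from the JSON stored on the DatabaseTask row
    return f"{greeting}, {name}!"
```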

---

### **How to Create and Run a Task**

#### **1. Create a Task**

Use `create_task` to add a task to the database and notify the worker:

```python
from django_task_worker.client import create_task

task = create_task(
    name="your_app.your_tasks.add_numbers",  # Function path
    args=[10, 20],                           # Positional arguments
    kwargs={},                               # Keyword arguments
    timeout=300                              # Timeout in seconds
)

print(f"Task {task.id} created with status: {task.status}")
```

---

#### **2. Run the Worker**

Start the worker using the Django management command:

```bash
python manage.py run_worker
```

The worker will process tasks in the background.

---

#### **3. Wait for Task Completion**

Use `wait_for_completion` to wait for a task to finish:

```python
from django_task_worker.client import wait_for_completion

result = wait_for_completion(task_id=task.id, timeout=10)

if result:
    print(f"Task {result.id} completed with status: {result.status}")
    print(f"Result: {result.result}")
else:
    print("Task did not complete within the timeout.")
```

---

### **API Reference**

#### **`create_task`**
```python
def create_task(name, args=None, kwargs=None, timeout=300) -> DatabaseTask:
    """
    Create a task in the database and notify the worker via Redis.

    Args:
        name (str): Function to execute (e.g., 'module_name.function_name').
        args (list, optional): Positional arguments for the function. Defaults to an empty list.
        kwargs (dict, optional): Keyword arguments for the function. Defaults to an empty dict.
        timeout (int, optional): Task timeout in seconds. Defaults to 300.

    Returns:
        DatabaseTask: The created task object.
    """
```

#### **`wait_for_completion`**
```python
def wait_for_completion(task_id, timeout=300) -> DatabaseTask | None:
    """
    Wait for a task to complete or fail within the given timeout.

    Args:
        task_id (str): The ID (short UUID) of the task to wait for.
        timeout (int, optional): Maximum time to wait in seconds. Defaults to 300.

    Returns:
        DatabaseTask: The task object once it completes or fails.
        None: If the task does not complete within the timeout.
    """
```
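
Because the task row persists in the database, a `None` return only means the wait timed out, not that the task was lost. A usage sketch relying only on the documented client API and model:

```python
# Hedged example: handle a wait timeout by re-reading the task row later.
from django_task_worker.client import create_task, wait_for_completion

task = create_task("your_app.your_tasks.add_numbers", args=[1, 2])
result = wait_for_completion(task.id, timeout=5)

if result is None:
    # The wait timed out, but the task may still finish; its row remains
    # in the database, so it can be checked again later.
    task.refresh_from_db()
    print(task.status)  # e.g. PENDING, PROGRESS, or COMPLETED by now
```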

---

### **Task Model**

All tasks are stored in the database using the `DatabaseTask` model:

```python
from django_task_worker.models import DatabaseTask
```

#### **DatabaseTask Fields**:
- `id` (str): Short UUID for the task.
- `name` (str): The task function in the format `module_name.function_name`.
- `args` (JSON): Positional arguments for the task.
- `kwargs` (JSON): Keyword arguments for the task.
- `timeout` (int): Time in seconds before the task times out.
- `status` (str): Current status (`PENDING`, `PROGRESS`, `COMPLETED`, or `FAILED`).
- `result` (str): Task result after completion.
- `error` (str): Error message if the task fails.
- `retry_count` (int): Number of times the task has been retried.
- `created_at` (DateTime): Task creation timestamp.
- `updated_at` (DateTime): Task last update timestamp.
- `started_at` (DateTime): Task start timestamp.
- `finished_at` (DateTime): Task finish timestamp.
- `duration` (float): Total time spent processing the task (in seconds).
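
Since `DatabaseTask` is an ordinary Django model, these fields can be queried with the standard ORM. For example, to inspect recent failures (a usage sketch, not a dedicated package API):

```python
# List the ten most recently finished FAILED tasks with their errors.
from django_task_worker.models import DatabaseTask

failed = (
    DatabaseTask.objects
    .filter(status="FAILED")
    .order_by("-finished_at")[:10]
)
for task in failed:
    print(task.id, task.name, task.retry_count, task.error)
```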

---

### **Quick Example**

1. **Define a Task** in `app/tasks.py`:
    ```python
    def multiply_numbers(a, b):
        return a * b
    ```

2. **Create and Run the Task**:
    ```python
    from django_task_worker.client import create_task, wait_for_completion

    # Create a task
    task = create_task("app.tasks.multiply_numbers", args=[2, 3])

    # Wait for completion
    result = wait_for_completion(task.id, timeout=10)
    if result:
        print(f"Task Result: {result.result}")
    ```

3. **Run the Worker**:
    ```bash
    python manage.py run_worker
    ```

4. **Test using Django Shell**:
    ```bash
    python manage.py shell
    ```
    ```python
    from django_task_worker.client import create_task, wait_for_completion
    task = create_task("app.tasks.multiply_numbers", args=[2, 3])
    result = wait_for_completion(task.id, timeout=10)
    print(result)  # Task srzm5AdyjhEGJVeL3WZiWN: app.tasks.multiply_numbers (COMPLETED)
    print(result.result)  # "6"
    ```

---

### **Example docker-compose configuration**

docker-compose.yml
```yaml
services:
  db:
    image: postgres:latest
    restart: always
    environment:
      POSTGRES_USER: ${DB_USER}
      POSTGRES_PASSWORD: ${DB_PASSWORD}
      POSTGRES_DB: ${DB_NAME}
    volumes:
      - postgres_data:/var/lib/postgresql/data
    ports:
      - "5432:5432"
    stop_grace_period: 30s
    logging:
      driver: "json-file"
      options:
        max-size: "10m"
        max-file: "1"

  redis:
    image: redis:latest
    restart: always
    volumes:
      - redis_data:/data
    ports:
      - "6379:6379"
    stop_grace_period: 30s
    logging:
      driver: "json-file"
      options:
        max-size: "10m"
        max-file: "1"

  backend:
    build:
      context: .
      dockerfile: Dockerfile
    restart: always
    environment:
      DB_HOST: db
      DB_PORT: ${DB_PORT}
      DB_USER: ${DB_USER}
      DB_PASSWORD: ${DB_PASSWORD}
      DB_NAME: ${DB_NAME}
      REDIS_URL: ${REDIS_URL}
      SECRET_KEY: ${SECRET_KEY}
    networks:
      - default
    stop_grace_period: 30s
    logging:
      driver: "json-file"
      options:
        max-size: "10m"
        max-file: "1"
    depends_on:
      - db
      - redis

  worker:
    build:
      context: .
      dockerfile: Dockerfile
    command: ["python", "manage.py", "run_worker"]
    restart: always
    environment:
      DB_HOST: db
      DB_PORT: 5432
      DB_USER: ${DB_USER}
      DB_PASSWORD: ${DB_PASSWORD}
      DB_NAME: ${DB_NAME}
      REDIS_URL: ${REDIS_URL}
      SECRET_KEY: ${SECRET_KEY}
    networks:
      - default
    stop_grace_period: 300s
    logging:
      driver: "json-file"
      options:
        max-size: "10m"
        max-file: "1"
    depends_on:
      - db
      - redis

networks:
  default:
    driver: bridge

volumes:
  postgres_data:
  redis_data:
```
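
The compose file reads its credentials from the environment. A hypothetical `.env` file matching the variables referenced above (Docker Compose loads it automatically from the project directory; the values here are placeholders):

```bash
# .env (example values only; replace before deploying)
DB_USER=postgres
DB_PASSWORD=change-me
DB_NAME=app
DB_PORT=5432
REDIS_URL=redis://redis:6379/0
SECRET_KEY=change-me
```

Note that the worker service sets `stop_grace_period: 300s`, matching the default 300-second task timeout, so an in-flight task can finish during a graceful shutdown before Docker force-kills the container.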

Dockerfile
```Dockerfile
FROM python:3.12-slim

WORKDIR /app

RUN apt-get update && apt-get install -y curl nano git

COPY requirements.txt /app/
RUN pip install --no-cache-dir -r requirements.txt

COPY . /app/

EXPOSE 8000

CMD ["gunicorn", "-b", "0.0.0.0:8000", "config.wsgi:application"]
```

### **TODO List**
- [x] Redis Authentication
- [x] Concurrency
- [x] Exponential back-offs
- [ ] Scheduled tasks
- [ ] Advanced Django admin
- [ ] Detailed error logging

            
