| Name | dtpyworker |
| Version | 0.1.2 |
| home_page | None |
| Summary | A package providing an improved API library for a better development experience. |
| upload_time | 2025-02-08 14:03:36 |
| maintainer | None |
| docs_url | None |
| author | Reza Shirazi |
| requires_python | <4.0,>=3.11 |
| license | MIT |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |

# Celery Worker with Redis Integration
This package provides a simple way to register and manage tasks for a Celery worker, with Redis as the broker and result backend. It includes functionality for registering regular tasks and periodic tasks, and for configuring various aspects of the Celery worker and the Redis connection.
## Features
- Register and manage tasks with optional queues.
- Register periodic tasks with customizable schedules using crontab or timedelta.
- Configure Celery worker settings such as time zone, task serializers, result backend, and retry options.
- Set up Redis as the broker and backend with SSL support for secure connections.
- Automatically discover tasks for the Celery worker.
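The difference between the two schedule types above can be sketched in plain Python: a `crontab(minute="*/5")` schedule fires at wall-clock minutes divisible by 5, while a `timedelta(minutes=5)` schedule fires at fixed intervals measured from the previous run. The helper below is illustrative only and is not part of dtpyworker:

```python
from datetime import datetime, timedelta

def next_crontab_minute(now, step):
    """Next strictly-later wall-clock time whose minute is a multiple
    of `step` -- the instant a crontab(minute="*/step") schedule targets."""
    base = now.replace(second=0, microsecond=0)
    return base + timedelta(minutes=step - (base.minute % step))

now = datetime(2025, 2, 8, 14, 3, 36)
print(next_crontab_minute(now, 5))  # 2025-02-08 14:05:00 (wall-clock aligned)
print(now + timedelta(minutes=5))   # 2025-02-08 14:08:36 (fixed interval)
```

In short: use `crontab` when runs must align with the clock (e.g. "on the hour"), and `timedelta` when only the spacing between runs matters.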
## Requirements
- Python 3.11+
- Celery
- dtpyredis
## Installation
Install the package from PyPI:
```bash
pip install dtpyworker
```
## Usage
### Define Tasks
To define tasks, create a Task object and register your routes:
```python
from dtpyworker.task import Task, crontab
task_manager = (
    Task()
    .register(route="my_task_route")
    .register_periodic_task(
        route="my_periodic_task_route",
        schedule=crontab(minute="*/5"),
        queue="default_queue"
    )
)
```
### Set Up Worker
Create a Worker instance, configure it with the Redis instance and the registered tasks, and then create the Celery app:
```python
from dtpyworker.worker import Worker
from dtpyredis.config import RedisConfig
from dtpyredis.connection import RedisInstance
# Initialize the Redis connection
redis_config = RedisConfig()
redis_config.set_redis_host('localhost')
redis_config.set_redis_port(6379)
redis_config.set_redis_db(0)

redis_instance = RedisInstance(redis_config=redis_config)

# Initialize and configure the worker
worker = (
    Worker()
    .set_redis(redis_instance)
    .set_task(task_manager)
    .set_name("my_worker")
    .set_timezone("UTC")
)
# Create Celery app
celery_app = worker.create()
```
### Task Example
Define a simple task that can be executed by the worker:
```python
from dtpyworker.task import shared_task
@shared_task
def my_task():
    print("Executing my task!")
```
### Running the Worker
To start the Celery worker, run the following command in your terminal:
```bash
celery -A your_package.celery_app worker --loglevel=info
```
For periodic tasks, run the Celery Beat scheduler alongside the worker:
```bash
celery -A your_package.celery_app beat --loglevel=info
```
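For local development, the worker and the beat scheduler can also share one process via Celery's `-B` flag (the Celery docs advise against this in production):

```bash
celery -A your_package.celery_app worker -B --loglevel=info
```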