| Field | Value |
| --- | --- |
| Name | asyncio-queue |
| Version | 0.2 |
| Summary | A simple worker task queue with async support |
| Author | Chewer3618 |
| Requires Python | >=3.8 |
| Keywords | async, workers, queue, redis, distributed |
| Upload time | 2023-11-21 03:57:28 |
| Requirements | No requirements were recorded. |
# AsyncQ
A simple worker task queue with async support
## Installation
```bash
pip install asyncio_queue
```
## Features
1. Submit tasks asynchronously
2. Task routing
3. Distributed async worker processes
## Use Case: Asynchronous Task Queue for I/O-Bound Task Processing
### Overview
When you need to execute I/O-bound tasks asynchronously within a single application, the `async_q` library provides a straightforward solution. This use case illustrates how to set up and use the library for that purpose.
### Prerequisites
Before using the `async_q` library, ensure you have Python 3.8 or later and Redis (version 5.0 or later) installed, along with the required dependencies.
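If you do not already have a Redis server running, one quick option (assuming Docker is available) is to start one locally; any reachable Redis 5.0+ instance works just as well:
```bash
# Start a throwaway Redis instance on the default port 6379
docker run -d --name async-q-redis -p 6379:6379 redis:7
```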
### Setting Up the Application
#### 1. Creating the Async Task Queue App
Begin by creating an `AsyncTaskQueue` instance to set up your application. In this example, we name the app `async_q_app` and configure it to use Redis as the message broker.
```python
# main_app.py
from asyncio_queue import AsyncTaskQueue, RedisBuilder
# Define Async Task Queue App
async_q_app = AsyncTaskQueue(
    redis_builder=RedisBuilder(
        port='6379',
    )
)
```
#### 2. Defining the Task Function
Next, define a task function to be submitted to the queue for processing. In this example, the task function `my_task` simulates I/O waiting with a specified delay.
```python
# my_task.py
import logging
import asyncio
from asyncio_queue import submit_task
# For initializing the app
from main_app import async_q_app
async def my_task(idx, delay=2, *args, **kwargs):
    # Simulate I/O waiting
    await asyncio.sleep(delay)

    logging.info(f'{idx} has finished the task. Task ID: {kwargs.get("task_id")}')
```
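For a more realistic I/O-bound task the same pattern applies; the sketch below (hypothetical, not part of the library) offloads a blocking HTTP request to a thread so the worker's event loop stays responsive. It could be submitted with `submit_task` in the same way as `my_task`.
```python
# my_task.py (continued) -- hypothetical helper, not part of the library
import asyncio
import logging
import urllib.request

async def fetch_page(url, *args, **kwargs):
    loop = asyncio.get_running_loop()
    # Run the blocking HTTP request in a worker thread so the event loop stays free
    body = await loop.run_in_executor(
        None, lambda: urllib.request.urlopen(url).read()
    )
    logging.info(f'Fetched {len(body)} bytes from {url}. Task ID: {kwargs.get("task_id")}')
```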
#### 3. Submitting Tasks
To submit tasks for processing, you can use the `submit_task` function. In this example, we submit 20 tasks to be processed by the queue.
```python
if __name__ == '__main__':
    for i in range(20):
        submit_task(my_task, kwargs={'idx': i, 'delay': 10})
```
In the previous example, the `submit_task` function also accepts a `queue` argument, whose default value is `default`. This lets you route a particular task to a different queue; if no queue is given, the `default` queue is used.
```python
if __name__ == '__main__':
    for i in range(20):
        # Explicitly submit the task to the default queue
        submit_task(my_task, kwargs={'idx': i, 'delay': 10}, queue='default')
```
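To route a task to a non-default queue, pass a different `queue` value. The sketch below assumes the `queue` keyword described above; the queue name `mail_ps` is only illustrative and matches the worker example in the next section.
```python
if __name__ == '__main__':
    # Hypothetical routing example: this task goes to the 'mail_ps' queue,
    # so only workers started with `-q mail_ps` will process it.
    submit_task(my_task, kwargs={'idx': 0, 'delay': 5}, queue='mail_ps')
```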
### Running Worker Processes
#### 4. Initializing Worker Processes
To handle incoming tasks efficiently, set up and launch worker processes. The concurrency level is controlled with the `-c` flag. In the following example, we start five workers that target the `async_q_app` instance defined in `main_app.py`. By default, these workers only process tasks from the `default` queue.
```bash
$ python -m asyncio_queue -a main_app.py:async_q_app -c 5
```
You can also assign specific workers to process tasks from a different queue, as demonstrated below:
```bash
$ python -m asyncio_queue -a main_app.py:async_q_app -c 5 -q mail_ps
```
By adjusting these options, you can tailor worker behavior to your application's requirements.
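For instance, a single deployment might run one worker pool on the default queue and a smaller pool dedicated to a `mail_ps` queue (the queue name is illustrative; both commands reuse the flags shown above):
```bash
# Pool of five workers for the 'default' queue
$ python -m asyncio_queue -a main_app.py:async_q_app -c 5

# Smaller pool dedicated to the 'mail_ps' queue
$ python -m asyncio_queue -a main_app.py:async_q_app -c 2 -q mail_ps
```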
#### 5. Submitting Tasks for Processing
With the worker processes running, you can now submit tasks for processing. Use the following command to execute the `my_task.py` script, which submits tasks to the queue.
```bash
$ python my_task.py
```
### Result
The worker processes will process the submitted tasks asynchronously, each with its specified delay. You can monitor task progress and completion through the log messages produced by `my_task`. The `async_q` library is suited to I/O-bound workloads that benefit from asynchronous processing within a single application.
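Note that `my_task` reports completion with `logging.info`, so the process running the tasks needs its root logger configured at `INFO` level (or lower) for those messages to appear. A minimal sketch, assuming you configure logging yourself:
```python
import logging

# Make logging.info(...) calls from tasks visible on the console
logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s %(levelname)s %(message)s',
)
```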
## Todo
1. Test and check backward compatibility
Raw data
{
    "_id": null,
    "home_page": "",
    "name": "asyncio-queue",
    "maintainer": "",
    "docs_url": null,
    "requires_python": ">=3.8",
    "maintainer_email": "",
    "keywords": "async,workers,queue,redis,distributed",
    "author": "Chewer3618",
    "author_email": "Chewer3618 <weciwam372@cabose.com>",
    "download_url": "https://files.pythonhosted.org/packages/17/3b/37ac1bc3e6dca5d3149c94e13220aae34c8e268437e82f5c6ec8ae75045b/asyncio_queue-0.2.tar.gz",
    "platform": null,
"description": "# AsyncQ\r\n\r\nA simple worker task queue with async support\r\n\r\n## Installation\r\n\r\n```bash\r\npip install asyncio_queue\r\n```\r\n\r\n## Features\r\n\r\n1. Submit asynchronously task\r\n2. Task routing\r\n3. Distributed async worker process\r\n\r\n## Use Case: Asynchronous Task Queue for I/O-Bound Task Processing\r\n\r\n### Overview\r\n\r\nIn scenarios where you need to execute I/O-bound tasks asynchronously within a single application, the \"async_q\" library provides a straightforward solution. This use case illustrates how to create and utilize the library for such purposes.\r\n\r\n### Prerequisites\r\n\r\nBefore using the \"async_q\" library, ensure you have Python 3.8 and Redis (Version 5.0 to current) installed, along with the required dependencies.\r\n\r\n### Setting Up the Application\r\n\r\n#### 1. Creating the Async Task Queue App\r\n\r\nBegin by creating an instance of the `AsyncTaskQueue` to set up your application. In this example, we will name our app \"async_q_app\" and configure it to use Redis as the message broker.\r\n\r\n```python\r\n# main_app.py\r\n\r\nfrom asyncio_queue import AsyncTaskQueue, RedisBuilder\r\n\r\n# Define Async Task Queue App\r\nasync_q_app = AsyncTaskQueue(\r\n redis_builder=RedisBuilder(\r\n port='6379',\r\n )\r\n)\r\n```\r\n\r\n#### 2. Defining the Task Function\r\n\r\nNext, define a task function that will be submitted to the queue for processing. In this example, we have a task function named `my_task`. This function simulates I/O waiting with a specified delay.\r\n\r\n```python\r\n# my_task.py\r\n\r\nimport logging\r\nimport asyncio\r\nfrom asyncio_queue import submit_task\r\n\r\n# For initializing the app\r\nfrom main_app import async_q_app\r\n\r\nasync def my_task(idx, delay=2, *args, **kwargs):\r\n # Simulate I/O waiting\r\n await asyncio.sleep(delay)\r\n\r\n logging.info(f'{idx} has finished the task. Task ID: {kwargs.get(\"task_id\")}')\r\n```\r\n\r\n#### 3. Submitting Tasks\r\n\r\nTo submit tasks for processing, you can use the `submit_task` function. In this example, we submit 20 tasks to be processed by the queue.\r\n\r\n```python\r\nif __name__ == '__main__':\r\n for i in range(20):\r\n submit_task(my_task, kwargs={'idx': i, 'delay': 10})\r\n```\r\n\r\nIn the previous example, the `submit_task` function also includes a `queue` argument with the default value set to `default` . This offers flexibility in case you want to specify a different queue for a particular task, but if no queue is provided, it will use the default value of `default`.\r\n\r\n```python\r\nif __name__ == '__main__':\r\n for i in range(20):\r\n # Submit a task with a default queue value\r\n submit_task(my_task, kwargs={'idx': i, 'delay': 10, 'queue': 'default'})\r\n```\r\n\r\n### Running Worker Processes\r\n\r\n#### 4. Initializing Worker Processes\r\n\r\nIn order to efficiently handle incoming tasks, it's crucial to set up and launch worker processes. The level of concurrency can be precisely determined by employing the `-c` flag. In the following example, we will initiate five worker processes that target the `async_q_app` module specified in `main_app.py`. 
By default, these workers will exclusively process tasks from the `default` queue.\r\n\r\n```bash\r\n$ python -m asyncio_queue -a main_app.py:async_q_app -c 5\r\n```\r\n\r\nHowever, you also have the flexibility to assign specific workers to process tasks from different queues, as demonstrated below:\r\n\r\n```bash\r\n$ python -m asyncio_queue -a main_app.py:async_q_app -c 5 -q mail_ps\r\n```\r\n\r\nBy making these adjustments, you can tailor the behavior of your worker processes to suit your application's specific requirements.\r\n\r\n#### 5. Submitting Tasks for Processing\r\n\r\nWith the worker processes running, you can now submit tasks for processing. Use the following command to execute the `my_task.py` script, which submits tasks to the queue.\r\n\r\n```bash\r\n$ python my_task.py\r\n```\r\n\r\n### Result\r\n\r\nThe worker processes will asynchronously process the submitted tasks with the specified delays. You can monitor the progress and completion of tasks through log messages generated by the `my_task` function. The \"Async Queue\" library is suitable for I/O-bound workloads that benefit from asynchronous processing within a single application.\r\n\r\n## Todo\r\n\r\n1. Test and check back compatibility\r\n",
"bugtrack_url": null,
"license": "",
"summary": "A simple worker task queue with async support",
"version": "0.2",
"project_urls": {
"Homepage": "https://github.com/tz01x/async_q"
},
"split_keywords": [
"async",
"workers",
"queue",
"redis",
"distributed"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "173b37ac1bc3e6dca5d3149c94e13220aae34c8e268437e82f5c6ec8ae75045b",
"md5": "570553809f30c264d388dbab5bec451a",
"sha256": "fcdd1e9078214fb015b4d9f1619bba6fd33e6372f3c5a43712df451255ff5e1d"
},
"downloads": -1,
"filename": "asyncio_queue-0.2.tar.gz",
"has_sig": false,
"md5_digest": "570553809f30c264d388dbab5bec451a",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.8",
"size": 4390,
"upload_time": "2023-11-21T03:57:28",
"upload_time_iso_8601": "2023-11-21T03:57:28.339602Z",
"url": "https://files.pythonhosted.org/packages/17/3b/37ac1bc3e6dca5d3149c94e13220aae34c8e268437e82f5c6ec8ae75045b/asyncio_queue-0.2.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2023-11-21 03:57:28",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "tz01x",
"github_project": "async_q",
"travis_ci": false,
"coveralls": false,
"github_actions": false,
"lcname": "asyncio-queue"
}