async-q


Name: async-q
Version: 0.2
Summary: A simple worker task queue with async support
Homepage: https://github.com/tz01x/async_q
Upload time: 2023-09-12 16:34:29
Requires Python: >=3.8
Keywords: async, workers, queue, redis, distributed
# AsyncQ
A simple worker task queue with async support
## Installation
```bash
pip install async_q
```

## Features
1. Submit tasks asynchronously
2. Task routing
3. Distributed async worker processes

## Use Case: Asynchronous Task Queue for I/O-Bound Task Processing

### Overview
When you need to execute I/O-bound tasks asynchronously, the `async_q` library provides a straightforward solution. This use case walks through setting up the library and using it for that purpose.

### Prerequisites
Before using the `async_q` library, ensure you have Python 3.8 or later and Redis (version 5.0 or newer) installed, along with the required dependencies.


### Setting Up the Application

#### 1. Creating the Async Task Queue App
Begin by creating an instance of `AsyncTaskQueue` to set up your application. In this example, we name the app instance `async_q_app` and configure it to use Redis as the message broker.

```python
# main_app.py

from async_q import AsyncTaskQueue, RedisBuilder

# Define Async Task Queue App
async_q_app = AsyncTaskQueue(
    redis_builder=RedisBuilder(
        port='6379',
    )
)
```

#### 2. Defining the Task Function
Next, define a task function that will be submitted to the queue for processing. In this example, we have a task function named `my_task`. This function simulates I/O waiting with a specified delay.

```python
# my_task.py

import logging
import asyncio
from async_q import submit_task

# For initializing the app
from main_app import async_q_app

async def my_task(idx, delay=2, *args, **kwargs):
    # Simulate I/O waiting
    await asyncio.sleep(delay)

    logging.info(f'{idx} has finished the task. Task ID: {kwargs.get("task_id")}')
```
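Before wiring the task into a broker, it can help to see what the worker does conceptually: it simply awaits coroutines like `my_task` concurrently on an event loop. The sketch below is pure `asyncio` (no Redis, no `async_q`), so it runs standalone; the function mirrors `my_task` but returns a value instead of logging.

```python
# standalone sketch (pure asyncio, no broker): running coroutines like
# my_task concurrently, the way an async worker would
import asyncio
import time

async def my_task(idx, delay=1, **kwargs):
    # simulate I/O waiting, as in my_task.py
    await asyncio.sleep(delay)
    return f'{idx} done'

async def main():
    # three 1-second tasks run concurrently: ~1s of wall time, not 3s
    return await asyncio.gather(*(my_task(i) for i in range(3)))

start = time.perf_counter()
results = asyncio.run(main())
elapsed = time.perf_counter() - start
print(results)  # ['0 done', '1 done', '2 done']
```

Because the tasks spend their time in `await asyncio.sleep`, the event loop interleaves them, which is exactly the property that makes this model a good fit for I/O-bound work.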

#### 3. Submitting Tasks
To submit tasks for processing, you can use the `submit_task` function. In this example, we submit 20 tasks to be processed by the queue.

```python
if __name__ == '__main__':
    for i in range(20):
        submit_task(my_task, kwargs={'idx': i, 'delay': 10})
```

The `submit_task` function also accepts a `queue` argument, which defaults to `default`. This lets you route a particular task to a different queue; when no queue is specified, the task goes to the `default` queue.

```python
if __name__ == '__main__':
    for i in range(20):
        # Explicitly route the task to the default queue
        submit_task(my_task, kwargs={'idx': i, 'delay': 10}, queue='default')
```
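Conceptually, queue routing means each named queue holds its own backlog of submitted tasks, and a worker only drains the queue it subscribes to. The following is a hypothetical in-process sketch of that idea using `asyncio.Queue` (the names `submit`, `worker`, and the queue dict are illustrative assumptions, not `async_q` internals):

```python
import asyncio

# hypothetical in-process model of queue routing (not async_q internals):
# each named queue holds submitted tasks; a worker drains exactly one queue
queues = {'default': asyncio.Queue(), 'mail_ps': asyncio.Queue()}

def submit(coro_fn, kwargs, queue='default'):
    queues[queue].put_nowait((coro_fn, kwargs))

async def worker(name, queue_name, done):
    q = queues[queue_name]
    while not q.empty():
        fn, kwargs = q.get_nowait()
        await fn(**kwargs)
        done.append((name, kwargs['idx']))

async def task(idx):
    await asyncio.sleep(0)  # stand-in for I/O work

async def main():
    done = []
    for i in range(4):
        submit(task, {'idx': i})                # goes to 'default'
    submit(task, {'idx': 99}, queue='mail_ps')  # routed to a named queue
    await asyncio.gather(worker('w1', 'default', done),
                         worker('w2', 'mail_ps', done))
    return done

done = asyncio.run(main())
print(done)
```

In the real library the queues live in Redis rather than in-process, so submitters and workers can run in separate processes or on separate machines, but the routing rule is the same: a task lands only on the queue it was submitted to.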

### Running Worker Processes



#### 4. Initializing Worker Processes

To handle incoming tasks, set up and launch worker processes. Concurrency is controlled with the `-c` flag. The following example starts five workers that target the `async_q_app` instance defined in `main_app.py`. By default, these workers only process tasks from the `default` queue.

```bash
$ python -m async_q -a main_app.py:async_q_app -c 5
```

You can also assign workers to a different queue with the `-q` flag:

```bash
$ python -m async_q -a main_app.py:async_q_app -c 5 -q mail_ps
```

These flags let you tailor worker behavior to your application's requirements.
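For reference, the CLI surface shown above can be sketched with `argparse`. This is an assumption about the flags' meaning inferred from the examples (`-a` app path, `-c` concurrency, `-q` queue), not `async_q`'s actual parser:

```python
import argparse

# hedged sketch of the worker CLI shown above; the defaults here are
# assumptions, not necessarily what async_q itself uses
parser = argparse.ArgumentParser(prog='python -m async_q')
parser.add_argument('-a', '--app', required=True,
                    help='path to the app instance, e.g. main_app.py:async_q_app')
parser.add_argument('-c', '--concurrency', type=int, default=1,
                    help='number of worker processes')
parser.add_argument('-q', '--queue', default='default',
                    help='queue the workers consume from')

args = parser.parse_args(['-a', 'main_app.py:async_q_app', '-c', '5', '-q', 'mail_ps'])
print(args.app, args.concurrency, args.queue)
```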

#### 5. Submitting Tasks for Processing
With the worker processes running, you can now submit tasks for processing. Use the following command to execute the `my_task.py` script, which submits tasks to the queue.

```bash
$ python my_task.py
```

### Result
The worker processes will asynchronously process the submitted tasks with the specified delays. You can monitor the progress and completion of tasks through the log messages generated by the `my_task` function. The `async_q` library is suitable for I/O-bound workloads that benefit from asynchronous processing within a single application.
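One practical note: `my_task` reports completion via `logging.info`, and Python's root logger defaults to the WARNING level, so those messages may not appear unless logging is configured. A minimal setup:

```python
import logging

# root logger defaults to WARNING; enable INFO so my_task's messages show
# (force=True resets any handlers configured earlier; available since 3.8)
logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s %(levelname)s %(message)s',
    force=True,
)

logging.info('3 has finished the task. Task ID: example-id')
```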

## Todo
1. Test and verify backward compatibility


            
