narq

Name: narq
Version: 0.2.1
Home page: https://github.com/kita99/narq.git
Summary: A distributed task queue built with asyncio and redis, with built-in web interface
Upload time: 2023-11-12 01:29:19
Author: Adrian
Requires Python: >=3.8
License: Apache-2.0
Keywords: asyncio, task, arq, queue, distributed
<p align="center">
    <a href="https://narq.readthedocs.io/"><img src="https://github.com/kita99/narq/blob/master/images/logo.png?raw=true" width="600px" alt="narq" /></a>
</p>

[![image](https://img.shields.io/pypi/v/narq.svg?style=flat)](https://pypi.python.org/pypi/narq)
[![image](https://img.shields.io/github/license/kita99/narq)](https://github.com/kita99/narq)
[![image](https://github.com/kita99/narq/workflows/pypi/badge.svg)](https://github.com/kita99/narq/actions?query=workflow:pypi)
[![image](https://github.com/kita99/narq/workflows/ci/badge.svg)](https://github.com/kita99/narq/actions?query=workflow:ci)

## Introduction

narq is a distributed task queue built on asyncio and redis. It is based on
[ReArq](https://github.com/long2ice/rearq) (itself a rewrite of [arq](https://github.com/samuelcolvin/arq)).


## Motivations

This project is an independent fork of ReArq because its goals are fundamentally different: narq aims to be a
production-grade task queue that is simple to reason about.

## Features

- AsyncIO support, easy integration with [FastAPI](https://github.com/tiangolo/fastapi).
- Delayed tasks, cron tasks and async task support.
- Full-featured built-in web interface.
- Built-in distributed task lock to ensure only one instance of a given task runs at a time.
- Other powerful features to be discovered.

## Web Interface

![dashboard](./images/dashboard.png)

## Requirements

- Redis >= 5.0


## Quick Start

### Task Definition

```python
# main.py
from narq import Narq

narq = Narq(db_url='mysql://root:123456@127.0.0.1:3306/narq')


@narq.on_shutdown
async def on_shutdown():
    # clean-up work goes here, e.g. closing database connections
    print("shutdown")


@narq.on_startup
async def on_startup():
    # you can do some initialization work here
    print("startup")


@narq.task(queue="q1")
async def add(self, a, b):
    return a + b


@narq.task(cron="*/5 * * * * * *")  # run this task every 5 seconds
async def timer(self):
    return "timer"
```
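
With a worker running (see the next section), these tasks can be enqueued from any asyncio program, not only from a web framework. The script below is a minimal sketch (the file name `enqueue.py` is illustrative) and only reuses calls that appear elsewhere in this README: `narq.init()`, `delay()`, `job.result()` and `narq.close()`.

```python
# enqueue.py -- minimal sketch: enqueue the `add` task from main.py and wait for its result
import asyncio

from main import add, narq


async def run():
    await narq.init()                     # connect to redis before enqueuing
    job = await add.delay(1, 2)           # push the task onto its queue
    result = await job.result(timeout=5)  # wait up to 5 seconds for a worker to finish it
    print(result.result)                  # -> 3
    await narq.close()


asyncio.run(run())
```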

### Run narq worker

```shell
> narq main:narq worker -q q1 -q q2 # consume tasks from q1 and q2 at the same time
```

```log
2021-03-29 09:54:50.464 | INFO     | narq.worker:_main:95 - Started worker successfully on queue: narq:queue:default
2021-03-29 09:54:50.465 | INFO     | narq.worker:_main:96 - Registered tasks: add, sleep, timer_add
2021-03-29 09:54:50.465 | INFO     | narq.worker:log_redis_info:86 - redis_version=6.2.1 mem_usage=1.43M clients_connected=5 db_keys=6
```

### Run narq timer

If you have cron tasks or delayed tasks, you also need to run the timer process:

```shell
> narq main:narq timer
```

```log
2021-03-29 09:54:43.878 | INFO     | narq.worker:_main:275 - Start timer successfully
2021-03-29 09:54:43.887 | INFO     | narq.worker:_main:277 - Registered timer tasks: timer_add
2021-03-29 09:54:43.894 | INFO     | narq.worker:log_redis_info:86 - redis_version=6.2.1 mem_usage=1.25M clients_connected=2 db_keys=6
```

You can also run the timer together with the worker via `narq main:narq worker -t`.

### Integration with FastAPI

```python
from fastapi import FastAPI

from main import add, narq  # the Narq instance and task from the Quick Start above

app = FastAPI()


@app.on_event("startup")
async def startup() -> None:
    await narq.init()


@app.on_event("shutdown")
async def shutdown() -> None:
    await narq.close()


# then run the task in a view
@app.get("/test")
async def test():
    job = await add.delay(args=(1, 2))
    # or
    job = await add.delay(kwargs={"a": 1, "b": 2})
    # or
    job = await add.delay(1, 2)
    # or
    job = await add.delay(a=1, b=2)
    result = await job.result(timeout=5)  # wait up to 5 seconds for the result
    print(result.result)
    return result
```
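
Waiting on `job.result()` blocks the request until a worker finishes the job. If the caller does not need the outcome synchronously, it can fire and forget; narq also supports delayed tasks, so a job can be scheduled for later. Continuing the example above, the first endpoint below only reuses `delay()` as already shown, while the `countdown` parameter is an assumption carried over from narq's ReArq ancestor and should be verified against your installed version.

```python
@app.get("/test-fire-and-forget")
async def test_fire_and_forget():
    # enqueue the job and return immediately instead of waiting for the result
    await add.delay(args=(1, 2))
    return "queued"


@app.get("/test-delayed")
async def test_delayed():
    # ASSUMPTION: `countdown` is taken from narq's ReArq lineage and is not shown
    # in this README -- check the delay() signature in your installed narq version.
    await add.delay(args=(1, 2), countdown=30)  # run roughly 30 seconds from now
    return "scheduled"
```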


## Start web interface

```shell
> narq main:narq server
Usage: narq server [OPTIONS]

  Start rest api server.

Options:
  --host TEXT         Listen host.  [default: 0.0.0.0]
  -p, --port INTEGER  Listen port.  [default: 8000]
  -h, --help          Show this message and exit.
```

After starting the server, open [http://127.0.0.1:8000/docs](http://127.0.0.1:8000/docs) to see all API endpoints and
[http://127.0.0.1:8000](http://127.0.0.1:8000) to use the web interface.

Any other options are passed through to `uvicorn` directly, such as `--root-path`:

```shell
narq main:narq server --host 0.0.0.0 --root-path /narq
```

### Mount as FastAPI sub app

If you already have a FastAPI service, you may want to mount the narq server as a FastAPI sub-application to simplify your deployment.

```python
from fastapi import FastAPI

from examples.tasks import narq
from narq.server.app import app as narq_app

app = FastAPI()

app.mount("/narq", narq_app)
narq_app.set_narq(narq)
```

### Start worker inside app

You can also start the worker inside your app.

```python
@app.on_event("startup")
async def startup():
    await narq.init()
    await narq.start_worker(with_timer=True, block=False)
```
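
For a single-process deployment these pieces can be combined: mount the dashboard and start the worker from the same FastAPI application. The sketch below only reuses calls shown in this README (`app.mount`, `set_narq`, `init`, `start_worker`, `close`); the module path `examples.tasks` follows the mounting example above.

```python
from fastapi import FastAPI

from examples.tasks import narq  # the Narq instance with your task definitions
from narq.server.app import app as narq_app

app = FastAPI()

# serve the built-in web interface and its API under /narq
app.mount("/narq", narq_app)
narq_app.set_narq(narq)


@app.on_event("startup")
async def startup():
    await narq.init()
    # run the worker (and timer) inside this process without blocking the app
    await narq.start_worker(with_timer=True, block=False)


@app.on_event("shutdown")
async def shutdown():
    await narq.close()
```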

## Thanks to

- [arq](https://github.com/samuelcolvin/arq): fast job queuing and RPC in Python with asyncio and redis.
- [ReArq](https://github.com/long2ice/rearq): an improved arq rewrite with an API and web interface.


## License

This project is licensed under the [Apache-2.0](./LICENSE) License.


            
