# rq

- Name: rq
- Version: 1.16.2 (PyPI)
- Summary: RQ is a simple, lightweight library for creating background jobs and processing them.
- Upload time: 2024-05-01 07:13:14
- Maintainer: Selwin Ong
- Requires Python: >=3.7

RQ (_Redis Queue_) is a simple Python library for queueing jobs and processing
them in the background with workers.  It is backed by Redis and designed to
have a low barrier to entry, so it integrates easily into your web stack.

RQ requires Redis >= 3.0.0.

[![Build status](https://github.com/rq/rq/workflows/Test%20rq/badge.svg)](https://github.com/rq/rq/actions?query=workflow%3A%22Test+rq%22)
[![PyPI](https://img.shields.io/pypi/pyversions/rq.svg)](https://pypi.python.org/pypi/rq)
[![Coverage](https://codecov.io/gh/rq/rq/branch/master/graph/badge.svg)](https://codecov.io/gh/rq/rq)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)


Full documentation can be found [here][d].


## Support RQ

If you find RQ useful, please consider supporting this project via [Tidelift](https://tidelift.com/subscription/pkg/pypi-rq?utm_source=pypi-rq&utm_medium=referral&utm_campaign=readme).


## Getting started

First, run a Redis server, of course:

```console
$ redis-server
```
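
If Redis isn't installed locally, a common alternative (an assumption about your setup, not part of RQ itself) is to run it in a container:

```console
$ docker run -d -p 6379:6379 redis
```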

To put jobs on queues, you don't have to do anything special; just define
your typically lengthy or blocking function:

```python
import requests

def count_words_at_url(url):
    """Just an example function that's called async."""
    resp = requests.get(url)
    return len(resp.text.split())
```

You do use the excellent [requests][r] package, don't you?
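
If `requests` isn't already installed in your environment (an assumption; any HTTP client would do for the example), grab it first:

```console
$ pip install requests
```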

Then, create an RQ queue:

```python
from redis import Redis
from rq import Queue

queue = Queue(connection=Redis())
```
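
By default this connects to Redis on localhost. If your Redis server lives elsewhere, or you want a separately named queue, you can pass both explicitly; a quick sketch (the host, port and the queue name `low` are placeholders, not requirements):

```python
from redis import Redis
from rq import Queue

# Connect to a specific Redis instance and use a dedicated queue name
redis_conn = Redis(host='localhost', port=6379, db=0)
low_queue = Queue('low', connection=redis_conn)
```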

And enqueue the function call:

```python
from my_module import count_words_at_url
job = queue.enqueue(count_words_at_url, 'http://nvie.com')
```
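
`enqueue` returns a `Job` instance you can inspect while a worker processes it. A small sketch of the kind of checks you might do (assumes a worker is running so the job can actually finish):

```python
import time

print(job.id)            # unique job ID assigned by RQ
print(job.get_status())  # e.g. 'queued', 'started', 'finished' or 'failed'

time.sleep(2)            # give a running worker a moment
if job.is_finished:
    print(job.result)    # return value of count_words_at_url
```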

Scheduling jobs is just as easy:

```python
from datetime import datetime, timedelta

# say_hello is any function you've defined, like count_words_at_url above

# Schedule job to run at 9:15, October 10th
job = queue.enqueue_at(datetime(2019, 10, 10, 9, 15), say_hello)

# Schedule job to run in 10 seconds
job = queue.enqueue_in(timedelta(seconds=10), say_hello)
```

Retrying failed jobs is also supported:

```python
from rq import Retry

# Retry up to 3 times; the failed job will be requeued immediately
queue.enqueue(say_hello, retry=Retry(max=3))

# Retry up to 3 times, with configurable intervals between retries
queue.enqueue(say_hello, retry=Retry(max=3, interval=[10, 30, 60]))
```
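
Jobs that exhaust their retries end up in the queue's failed job registry, which you can inspect afterwards; a minimal sketch:

```python
from rq.registry import FailedJobRegistry

registry = FailedJobRegistry(queue=queue)
for job_id in registry.get_job_ids():
    print('Failed job:', job_id)
```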

For a more complete example, refer to the [docs][d].  But this is the essence.


### The worker

To start executing enqueued function calls in the background, start a worker
from your project's directory:

```console
$ rq worker --with-scheduler
*** Listening for work on default
Got count_words_at_url('http://nvie.com') from default
Job result = 818
*** Listening for work on default
```
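
Workers can also be started from Python instead of the CLI, which can be handy in containers or tests; a hedged sketch using the same default queue:

```python
from redis import Redis
from rq import Queue, Worker

redis_conn = Redis()
worker = Worker([Queue(connection=redis_conn)], connection=redis_conn)
worker.work(with_scheduler=True)  # blocks and processes jobs until stopped
```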

That's about it.


## Installation

Simply use the following command to install the latest released version:

    pip install rq

If you want the cutting edge version (that may well be broken), use this:

    pip install git+https://github.com/rq/rq.git@master#egg=rq
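
To keep installations reproducible, you can also pin the exact version this page describes:

    pip install rq==1.16.2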


## Related Projects

Check out the repositories below, which might be useful in your RQ-based project:

- [rq-dashboard](https://github.com/Parallels/rq-dashboard)
- [rqmonitor](https://github.com/pranavgupta1234/rqmonitor)
- [django-rq](https://github.com/rq/django-rq)
- [Flask-RQ2](https://github.com/rq/Flask-RQ2)
- [rq-scheduler](https://github.com/rq/rq-scheduler)


## Project history

This project has been inspired by the good parts of [Celery][1], [Resque][2]
and [this snippet][3], and has been created as a lightweight alternative to the
heaviness of Celery or other AMQP-based queueing implementations.


[r]: http://python-requests.org
[d]: http://python-rq.org/
[m]: http://pypi.python.org/pypi/mailer
[p]: http://docs.python.org/library/pickle.html
[1]: http://docs.celeryq.dev/
[2]: https://github.com/resque/resque
[3]: https://github.com/fengsp/flask-snippets/blob/1f65833a4291c5b833b195a09c365aa815baea4e/utilities/rq.py

            
