# RestQ
RestQ is a lightweight, fully async task queue built on top of Redis. It provides a simple yet powerful way to handle background job processing in your Python applications. Think of it as your application's personal assistant that diligently processes tasks whenever you need them done (and Redis is up and running 😅).
I built this for 3 reasons:
1. For fun 🤗
2. To finally land a job... by building a job queue 🎯
3. I needed a way to separate task enqueueing from worker execution, i.e. workers can live anywhere, even in different projects/repos, and don't need your app logic baked in.
## Installation
You can install RestQ using Poetry:
```bash
poetry add restq
```
Or using pip:
```bash
pip install restq
```
## Requirements
- Python >= 3.9
- Redis server
## Quick Start
Here's a simple example of how to use RestQ:
### Define the worker (worker.py)
```python
import asyncio
from restq import task, Worker

REDIS_URL = "redis://localhost:6379/0"

@task(name="MyTask")
async def handler(foo: str) -> None:
    print(f"Sending to ....{foo}")


async def main() -> None:
    worker = Worker(queue_name="your-unique-queue-name", url=REDIS_URL, tasks=[handler])
    await worker.start()


asyncio.run(main())
```
### Define the Queue (queue.py)
```python
from restq import Queue
# Initialize the queue
REDIS_URL = "redis://localhost:6379/0"
queue = Queue(name="your-unique-queue-name", url=REDIS_URL)
# Enqueue a task
queue.add(task_name="MyTask", kwargs={"foo": "bar"}, mode="json")
```
## Advanced Usage
### Task Retries
```python
from restq import task

@task(name="SensitiveOperation", max_retry=3, retry_delay=60)
async def sensitive_operation() -> None:
    # Your code here; a failed run is retried up to 3 times,
    # waiting 60 seconds between attempts
    pass
```
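With `max_retry` and `retry_delay` set as above, retries fire at fixed intervals after a failure. A minimal sketch of the resulting schedule (hypothetical helper, not part of RestQ, assuming a fixed delay between attempts as the options suggest):

```python
from datetime import datetime, timedelta
from typing import List

def retry_schedule(start: datetime, max_retry: int, retry_delay: int) -> List[datetime]:
    """Return the wall-clock times at which retries would fire,
    assuming a fixed `retry_delay` seconds between attempts."""
    return [start + timedelta(seconds=retry_delay * i) for i in range(1, max_retry + 1)]

start = datetime(2025, 1, 1, 12, 0, 0)
print(retry_schedule(start, max_retry=3, retry_delay=60))
# retries at 12:01:00, 12:02:00, 12:03:00
```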
## Configuration
### Queue Configuration
The `Queue` class is the main entry point for adding tasks to your queue. It provides both synchronous and asynchronous implementations through `Queue` and `AsyncQueue` respectively.
```python
from restq import Queue, AsyncQueue

# Synchronous Queue
queue = Queue(
    name="your-queue-name",         # Unique name for your queue
    url="redis://localhost:6379/0"  # Redis connection URL
)

# Asynchronous Queue
async_queue = AsyncQueue(
    name="your-queue-name",
    url="redis://localhost:6379/0"
)
```
### Adding Tasks
The `add` method allows you to enqueue tasks with various options:
```python
from datetime import timedelta

# Basic task addition
queue.add(
    task_name="MyTask",       # Name of the task to execute
    kwargs={"key": "value"},  # Task arguments (optional)
    mode="json",              # Serialization mode: "json" (default) or "pickle"
    delay=None                # Delay execution (optional)
)

# Task with delay (seconds)
queue.add(
    task_name="DelayedTask",
    kwargs={"key": "value"},
    delay=60  # Task will execute after 60 seconds
)

# Task with timedelta delay
queue.add(
    task_name="DelayedTask",
    kwargs={"key": "value"},
    delay=timedelta(minutes=5)  # Task will execute after 5 minutes
)
```
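Since `delay` accepts either a number of seconds or a `timedelta`, a queue implementation needs to normalize the two forms. A minimal sketch of such a conversion (hypothetical helper, not RestQ's actual code):

```python
from datetime import timedelta
from typing import Optional, Union

def normalize_delay(delay: Optional[Union[int, float, timedelta]]) -> float:
    """Convert a delay given as seconds or a timedelta into plain seconds."""
    if delay is None:
        return 0.0
    if isinstance(delay, timedelta):
        return delay.total_seconds()
    return float(delay)

print(normalize_delay(60))                    # 60.0
print(normalize_delay(timedelta(minutes=5)))  # 300.0
```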
### Why kwargs and JSON?
RestQ uses kwargs (keyword arguments) for task data and JSON serialization by default. Here's why:
#### Universal Communication
JSON works everywhere: Python, Node.js, Go, Rust. This means:
- Your Python app can queue tasks today
- Your Node.js service can queue and run tasks tomorrow
- Your Go microservice can queue and run tasks next week
- Workers would process them all the same way!
#### JavaScript
```javascript
await queue.add("process_order", { orderId: "123", amount: 99.99 })
```
#### Go
```go
queue.Add("process_order", map[string]interface{}{"order_id": "123", "amount": 99.99})
```
#### Serialization Modes
- `json` (default): Uses orjson for fast JSON serialization. Best for most use cases and future language clients.
- `pickle`: Allows serialization of complex Python objects. Use with trusted input only, and only when you need Python-specific features.
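The trade-off between the two modes can be seen with the standard library alone (stdlib sketch, independent of RestQ):

```python
import json
import pickle
from datetime import datetime

payload = {"order_id": "123", "amount": 99.99}

# JSON: a plain-text wire format any language can read
wire = json.dumps(payload)
print(wire)  # {"order_id": "123", "amount": 99.99}

# Pickle can serialize Python-specific types JSON cannot...
blob = pickle.dumps({"created": datetime(2025, 1, 1)})
# ...but the bytes are Python-only, and unpickling untrusted
# data can execute arbitrary code -- use with trusted input only.
print(pickle.loads(blob))
```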
### Worker Configuration
Workers are responsible for executing tasks from the queue. They can be configured with various options:
```python
from restq import Worker, task

# Define your task
@task(
    name="MyTask",  # Task name (required)
    max_retry=3,    # Maximum retry attempts (optional)
    retry_delay=5   # Delay between retries in seconds (optional)
)
async def my_task(key: str) -> None:
    print(f"Processing {key}")

# Initialize the worker
worker = Worker(
    queue_name="your-queue-name",    # Queue to listen to
    url="redis://localhost:6379/0",  # Redis connection URL
    tasks=[my_task],                 # List of task handlers
    name="worker-1"                  # Optional worker name
)

# Start the worker
await worker.start(concurrency=1)  # Number of concurrent tasks (default: 1)
```
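`concurrency` bounds how many tasks a single worker has in flight at once. Conceptually this is the familiar semaphore pattern in asyncio (a generic illustrative sketch, not RestQ's internals):

```python
import asyncio

async def run_bounded(coros, concurrency: int):
    """Run coroutines with at most `concurrency` executing at a time."""
    sem = asyncio.Semaphore(concurrency)

    async def guarded(coro):
        async with sem:
            return await coro

    # gather preserves the order of the inputs in its results
    return await asyncio.gather(*(guarded(c) for c in coros))

async def job(i: int) -> int:
    await asyncio.sleep(0.01)  # stand-in for real task work
    return i * 2

results = asyncio.run(run_bounded((job(i) for i in range(5)), concurrency=2))
print(results)  # [0, 2, 4, 6, 8]
```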
#### Worker Features
- Automatic task retries with configurable delay
- Delayed task execution
- Task persistence through Redis streams
- Automatic recovery of pending tasks
- Distributed task processing across multiple workers
## Dependencies
- redis ==5.3.1
- orjson ^3.11.1
- colorama ^0.4.6
- pydantic ^2.11.7
- anyio ^4.10.0
## Development
To set up the development environment:
```bash
# Clone the repository
git clone https://github.com/dakohhh/restq.git
cd restq
# Install dependencies
poetry install
```
## Future Features
- Task status monitoring
- Multi-process handling
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Author
- dakohhh (wiizzydreadmill@gmail.com)
## Support
If you encounter any issues or have questions, please file an issue on the GitHub repository.