# RatelimitIO
A Python library for rate limiting that handles both incoming and outgoing requests efficiently, with support for both synchronous and asynchronous code. Backed by Redis, it provides decorators and straightforward API integration for managing request limits with precision.
#### Project Information
[![Tests & Lint](https://github.com/bagowix/ratelimit-io/actions/workflows/actions.yml/badge.svg)](https://github.com/bagowix/ratelimit-io/actions/workflows/actions.yml)
[![image](https://img.shields.io/pypi/v/ratelimit-io/0.6.3.svg)](https://pypi.python.org/pypi/ratelimit-io)
[![Test Coverage](https://img.shields.io/badge/dynamic/json?color=blueviolet&label=coverage&query=%24.totals.percent_covered_display&suffix=%25&url=https%3A%2F%2Fraw.githubusercontent.com%2Fbagowix%2Fratelimit-io%2Fmain%2Fcoverage.json)](https://github.com/bagowix/ratelimit-io/blob/main/coverage.json)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/ratelimit-io)](https://pypi.org/project/ratelimit-io/)
[![License](https://img.shields.io/pypi/l/ratelimit-io)](LICENSE)
[![Downloads](https://pepy.tech/badge/ratelimit-io)](https://pepy.tech/project/ratelimit-io)
[![Formatter](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/astral-sh/ruff/main/assets/badge/v2.json)](https://github.com/astral-sh/ruff)
---
## Features
- **Incoming and Outgoing Request Management**:
- Handle inbound limits (e.g., API requests) with immediate error raising.
- Manage outbound limits (e.g., client requests to external APIs) with intelligent waiting.
- **Synchronous and Asynchronous Support**:
- Seamlessly integrate with both blocking and async applications.
- **Asynchronous Context Manager**:
- Use `async with` for automatic Lua script loading and resource management.
- **Customizable Rate Limits**:
- Define limits by requests, time periods, or custom keys.
- **Redis Backend**:
- Leverages Redis for fast and scalable rate limiting.
- **Decorators**:
- Apply rate limits easily to functions or methods.
- **Automatic Error Handling**:
- Easily manage 429 Too Many Requests errors in popular frameworks like Flask, Django, and FastAPI.
- **Granular Key Management**:
- Priority-based key resolution for efficient bucket management.
- **Ease of Use**:
- Simple and intuitive integration into Python applications.
---
## Installation
Install via pip:
```bash
pip install ratelimit-io
```
## Quick Start
### Using as a Synchronous Decorator
```python
from ratelimit_io import RatelimitIO, LimitSpec
from redis import Redis
redis_client = Redis(host="localhost", port=6379)
limiter = RatelimitIO(
    backend=redis_client,
    default_limit=LimitSpec(requests=10, seconds=60),
    default_key="global_limit",
)


@limiter
def fetch_data():
    return "Request succeeded!"

# Use the decorated function
fetch_data()
```
### Using as an Asynchronous Decorator
```python
from ratelimit_io import RatelimitIO, LimitSpec
from redis.asyncio import Redis as AsyncRedis
async_redis_client = AsyncRedis(host="localhost", port=6379)
async_limiter = RatelimitIO(
    backend=async_redis_client,
    default_limit=LimitSpec(requests=10, seconds=60),
    default_key="global_limit",
)


@async_limiter
async def fetch_data():
    return "Request succeeded!"

# Use the decorated function
await fetch_data()
```
### Asynchronous Context Manager
Simplify usage with the asynchronous context manager:
```python
from ratelimit_io import RatelimitIO, LimitSpec
from redis.asyncio import Redis as AsyncRedis
async def main():
    redis_client = AsyncRedis(host="localhost", port=6379)

    async with RatelimitIO(
        backend=redis_client,
        default_limit=LimitSpec(requests=5, seconds=10),
    ) as limiter:
        # Lua script is automatically loaded here
        await limiter.a_wait("test_key")
        print("Request processed within limit!")

# The Redis connection remains open after exiting the context
```
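The coroutine above can be driven from a synchronous entry point with the standard library; a minimal sketch:

```python
import asyncio

# Requires a Redis instance reachable at localhost:6379.
asyncio.run(main())
```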
### Incoming vs. Outgoing Request Handling (`is_incoming`)
- The `default_key` or a dynamically generated key (e.g., based on `unique_key` or `kwargs["ip"]`) determines the rate limit bucket.
- When `is_incoming=True`, the rate limiter will immediately raise a `RatelimitIOError` when limits are exceeded.
- When `is_incoming=False`, the rate limiter will wait until a slot becomes available.
```python
# Incoming request example (throws an error on limit breach)
limiter = RatelimitIO(backend=redis_client, is_incoming=True)

@limiter(LimitSpec(requests=5, seconds=10))
def fetch_data():
    return "Request succeeded!"


# Outgoing request example (waits if limits are exceeded)
outgoing_limiter = RatelimitIO(backend=redis_client)


@outgoing_limiter(LimitSpec(requests=5, seconds=10))
def fetch_data_outgoing():
    return "Request succeeded!"
```
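Outside a web framework, the incoming case can be handled with a plain `try`/`except`; a minimal sketch, assuming the exception exposes the `detail` and `status_code` attributes used in the framework handlers below:

```python
from ratelimit_io import RatelimitIOError

try:
    for _ in range(10):
        fetch_data()  # decorated with the incoming limiter above
except RatelimitIOError as exc:
    # Attribute names assumed from the framework examples below.
    print(f"Rate limit exceeded: {exc.detail} (status {exc.status_code})")
```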
## Dynamic `is_incoming` Detection
The limiter automatically determines the request type (incoming or outgoing) from the calling context. Incoming requests (the default) raise `RatelimitIOError` when limits are exceeded, while outgoing requests wait for a slot.
Examples:
```python
# Default behavior (is_incoming=True)
@limiter
def incoming_request():
    return "Handled incoming request!"


for _ in range(5):
    incoming_request()

# Raises RatelimitIOError after 5 requests
incoming_request()


# Outgoing request handling
limiter.is_incoming = False

for _ in range(5):
    limiter.wait("outgoing_request", LimitSpec(requests=5, seconds=1))

# Waits for a slot to become available
limiter.wait("outgoing_request", LimitSpec(requests=5, seconds=1))
```
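The asynchronous side mirrors this with `a_wait`; a hedged sketch, assuming `a_wait` accepts the same `(key, LimitSpec)` arguments as `wait` and that the limiter is backed by `redis.asyncio.Redis`:

```python
import asyncio

from ratelimit_io import LimitSpec, RatelimitIO
from redis.asyncio import Redis as AsyncRedis


async def main():
    limiter = RatelimitIO(
        backend=AsyncRedis(host="localhost", port=6379),
        is_incoming=False,  # outgoing: wait for a slot instead of raising
    )
    for _ in range(5):
        await limiter.a_wait("outgoing_request", LimitSpec(requests=5, seconds=1))
    # The sixth call within the window waits until a slot becomes available.
    await limiter.a_wait("outgoing_request", LimitSpec(requests=5, seconds=1))


asyncio.run(main())
```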
## Error Handling for 429 Responses in Frameworks
### FastAPI Example
```python
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse
from ratelimit_io import RatelimitIO, RatelimitIOError, LimitSpec
from redis.asyncio import Redis as AsyncRedis

app = FastAPI()
redis_client = AsyncRedis(host="localhost", port=6379)
limiter = RatelimitIO(
    backend=redis_client,
    default_limit=LimitSpec(requests=5, seconds=1),
    is_incoming=True,
)


@app.middleware("http")
async def ratelimit_middleware(request: Request, call_next):
    try:
        return await call_next(request)
    except RatelimitIOError as exc:
        return JSONResponse(
            status_code=exc.status_code,
            content={"detail": exc.detail},
        )


@app.get("/fetch")
@limiter
async def fetch_data():
    return {"message": "Request succeeded!"}
```
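If you prefer per-client buckets over the decorator, the limiter can also be called directly inside an endpoint; a sketch, assuming keying by the client IP suits your deployment and that the bucket falls back to the limiter's `default_limit`:

```python
@app.get("/fetch-per-ip")
async def fetch_data_per_ip(request: Request):
    # One bucket per client IP; on breach this raises RatelimitIOError,
    # which the middleware above converts into a JSON error response.
    await limiter.a_wait(f"fetch:{request.client.host}")
    return {"message": "Request succeeded!"}
```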
### Django Middleware Example
```python
from django.http import JsonResponse
from django.utils.deprecation import MiddlewareMixin
from ratelimit_io import RatelimitIO, LimitSpec, RatelimitIOError
from redis import Redis
from rest_framework import status
from rest_framework.response import Response
from rest_framework.views import APIView
redis = Redis("localhost", port=6379)
limit = RatelimitIO(
    backend=redis,
    default_limit=LimitSpec(requests=5, seconds=1),
    is_incoming=True,
)


class RatelimitMiddleware(MiddlewareMixin):
    def process_exception(self, request, exception):
        if isinstance(exception, RatelimitIOError):
            return JsonResponse(
                {"detail": exception.detail},
                status=exception.status_code,
            )
        return None


class Foo(APIView):
    permission_classes = ()

    @limit
    def get(self, request, *args, **kwargs):
        return Response(data={"message": "ok"}, status=status.HTTP_200_OK)
```
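For `process_exception` to run, the middleware must be registered in the Django settings; a sketch assuming the class above lives in a hypothetical `myapp.middleware` module (adjust the dotted path to your project):

```python
# settings.py
MIDDLEWARE = [
    # ... Django's default middleware ...
    "myapp.middleware.RatelimitMiddleware",  # hypothetical path to the class above
]
```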
### Flask Example
```python
from flask import Flask, jsonify
from ratelimit_io import RatelimitIO, RatelimitIOError, LimitSpec
from redis import Redis
app = Flask(__name__)
redis_client = Redis(host="localhost", port=6379)
limiter = RatelimitIO(backend=redis_client, is_incoming=True)
@app.errorhandler(RatelimitIOError)
def handle_ratelimit_error(error):
    return jsonify({"error": error.detail}), error.status_code


@app.route("/fetch")
@limiter
def fetch_data():
    return jsonify({"message": "Request succeeded!"})
```
## Key Handling Priority
The rate limiter determines the key for requests in the following priority:
1. `provided_key` (directly passed to `wait`, `a_wait`, or decorator).
2. `unique_key` (e.g., from a decorator argument).
3. `default_key` (set during `RatelimitIO` initialization).
4. `kwargs.get("ip")` (from additional arguments passed to the decorated function).
5. `"unknown_key"` (fallback if no other key is found).
This flexible approach ensures minimal configuration while maintaining granular control when needed.
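As an illustration of this order, a hedged sketch, assuming the decorator accepts a `unique_key` keyword argument (item 2) alongside a `LimitSpec`:

```python
# 1. provided_key: passed directly to wait()/a_wait().
limiter.wait("explicit-bucket", LimitSpec(requests=5, seconds=1))


# 2. unique_key: assumed decorator keyword (see item 2 above).
@limiter(LimitSpec(requests=5, seconds=1), unique_key="reports-endpoint")
def generate_report():
    return "ok"


# 3.-5. With no explicit key, the limiter uses default_key from construction,
# then kwargs["ip"] passed to the decorated function, then "unknown_key".
@limiter
def per_client_view(ip=None):
    return "ok"


per_client_view(ip="203.0.113.7")  # bucket keyed by the ip kwarg when no default_key is set
```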
## License
[MIT](https://github.com/bagowix/ratelimit-io/blob/main/LICENSE)
## Contribution
Contributions are welcome! Follow the Contribution Guide for details.