| Field | Value |
|---|---|
| Name | rediskit |
| Version | 0.0.8 |
| home_page | None |
| Summary | A comprehensive Redis toolkit for Python with caching, memoization, and utilities |
| upload_time | 2025-07-10 07:28:58 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.12 |
| license | Apache-2.0 |
| keywords | redis, cache, memoization, toolkit, async |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# rediskit
A Python toolkit that provides Redis-backed performance and concurrency primitives for applications. It enables developers to add caching, distributed coordination, and data protection to their Python applications with minimal effort.
## Still work in progress
Many features are still under development, and breaking changes are likely. Use at your own risk.
## Features
- **Function Result Caching**: Use the `@redis_memoize` decorator to cache expensive function calls with automatic serialization, compression, and encryption
- **Distributed Coordination**: Redis-based distributed locks and semaphores for coordinating access across multiple processes/machines
- **Data Protection**: Multi-version encryption keys with automatic key rotation for sensitive cached data
- **Async Support**: Full support for both synchronous and asynchronous applications
- **Flexible Storage**: Choose between string or hash-based Redis storage patterns
- **Modern Type Hints**: Full type safety with Python 3.12+ syntax
## Installation
```bash
uv add rediskit
# or
poetry add rediskit
```
## Quick Start
### Basic Setup
```python
from rediskit import redis_memoize, init_redis_connection_pool

# Initialize Redis connection pool (call once at app startup)
init_redis_connection_pool()


# Cache expensive function results
@redis_memoize(memoize_key="expensive_calc", ttl=300)
def expensive_calculation(tenantId: str, value: int) -> dict:
    # Simulate expensive computation
    import time
    time.sleep(2)
    return {"result": value * 42}


# Usage
result = expensive_calculation("tenant1", 10)  # Takes 2 seconds
result = expensive_calculation("tenant1", 10)  # Returns instantly from cache
```
### Custom Redis Connection
```python
import redis
from rediskit import redis_memoize

# Use your own Redis connection
my_redis = redis.Redis(host='my-redis-host', port=6379, db=1)


@redis_memoize(
    memoize_key="custom_calc",
    ttl=600,
    connection=my_redis
)
def my_function(tenantId: str, data: dict) -> dict:
    return {"processed": data}
```
### Advanced Caching Options
```python
from rediskit import redis_memoize


# Hash-based storage with encryption
@redis_memoize(
    memoize_key=lambda tenantId, user_id: f"user_profile:{tenantId}:{user_id}",
    ttl=3600,
    storage_type="hash",      # Store in a Redis hash for efficient field access
    enable_encryption=True,   # Encrypt sensitive data
    cache_type="zipJson"      # JSON serialization with compression
)
def get_user_profile(tenantId: str, user_id: str) -> dict:
    # Fetch user data from the database
    return {"user_id": user_id, "name": "John Doe", "email": "john@example.com"}


# Dynamic TTL and cache bypass
@redis_memoize(
    memoize_key="dynamic_data",
    ttl=lambda tenantId, priority: 3600 if priority == "high" else 300,
    bypass_cache=lambda tenantId, force_refresh: force_refresh
)
def get_dynamic_data(tenantId: str, priority: str, force_refresh: bool = False) -> dict:
    return {"data": "fresh_data", "priority": priority}
```
### Async Support
```python
import asyncio
from rediskit import redis_memoize, init_async_redis_connection_pool

# Initialize async Redis connection pool
await init_async_redis_connection_pool()


@redis_memoize(memoize_key="async_calc", ttl=300)
async def async_expensive_function(tenantId: str, value: int) -> dict:
    await asyncio.sleep(1)  # Simulate async work
    return {"async_result": value * 100}


# Usage
result = await async_expensive_function("tenant1", 5)
```
### Distributed Locking
```python
from rediskit import get_redis_mutex_lock, get_async_redis_mutex_lock

# Synchronous distributed lock
with get_redis_mutex_lock("critical_section", expire=30) as lock:
    # Only one process can execute this block at a time
    perform_critical_operation()

# Async distributed lock
async with get_async_redis_mutex_lock("async_critical_section", expire=30) as lock:
    await perform_async_critical_operation()
```
### Encryption Management
```python
from rediskit import Encrypter
# Generate new encryption keys
encrypter = Encrypter()
new_key = encrypter.generate_new_hex_key()
# Encrypt/decrypt data manually
encrypted = encrypter.encrypt("sensitive data", useZstd=True)
decrypted = encrypter.decrypt(encrypted)
```
## Configuration
Configure rediskit using environment variables:
```bash
# Redis connection settings
export REDISKIT_REDIS_HOST="localhost"
export REDISKIT_REDIS_PORT="6379"
export REDISKIT_REDIS_PASSWORD=""
# Encryption keys (base64-encoded JSON)
export REDISKIT_ENCRYPTION_SECRET="eyJfX2VuY192MSI6ICI0MGViODJlNWJhNTJiNmQ4..."
# Cache settings
export REDISKIT_REDIS_TOP_NODE="my_app_cache"
export REDISKIT_REDIS_SKIP_CACHING="false"
```
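The same settings can also be provided from application code before the pool is created. A minimal sketch, assuming rediskit reads these variables when `init_redis_connection_pool()` is called:
```python
import os

# Set rediskit configuration before initializing the connection pool
# (assumption: the variables are read at pool-initialization time).
os.environ["REDISKIT_REDIS_HOST"] = "localhost"
os.environ["REDISKIT_REDIS_PORT"] = "6379"
os.environ["REDISKIT_REDIS_TOP_NODE"] = "my_app_cache"

from rediskit import init_redis_connection_pool

init_redis_connection_pool()
```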
## API Reference
### Core Decorators
#### `@redis_memoize`
Cache function results in Redis with configurable options (a usage sketch follows the parameter list).
**Parameters:**
- `memoize_key`: Cache key (string or callable)
- `ttl`: Time to live in seconds (int, callable, or None)
- `bypass_cache`: Skip cache lookup (bool or callable)
- `cache_type`: Serialization method ("zipJson" or "zipPickled")
- `reset_ttl_upon_read`: Refresh the TTL when reading from the cache
- `enable_encryption`: Encrypt cached data
- `storage_type`: Redis storage pattern ("string" or "hash")
- `connection`: Custom Redis connection (optional)
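These options combine freely. A minimal sketch of `reset_ttl_upon_read` and the pickled cache type, assuming the snake_case keyword spellings used in the Quick Start examples and a hypothetical `build_report` function:
```python
from rediskit import redis_memoize


@redis_memoize(
    memoize_key=lambda tenantId, report_id: f"report:{tenantId}:{report_id}",
    ttl=900,
    reset_ttl_upon_read=True,  # keep frequently read entries alive
    cache_type="zipPickled",   # pickle + compression for arbitrary Python objects
)
def build_report(tenantId: str, report_id: str) -> dict:
    # Hypothetical expensive aggregation
    return {"report_id": report_id, "rows": [1, 2, 3]}
```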
### Connection Management
- `init_redis_connection_pool()`: Initialize sync Redis connection pool
- `init_async_redis_connection_pool()`: Initialize async Redis connection pool
- `get_redis_connection()`: Get sync Redis connection
- `get_async_redis_connection()`: Get async Redis connection
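For ad-hoc Redis commands outside the decorator, a minimal sketch, assuming `get_redis_connection()` returns a standard redis-py client (the README does not state this explicitly):
```python
from rediskit import init_redis_connection_pool, get_redis_connection

init_redis_connection_pool()   # create the shared pool once at startup
conn = get_redis_connection()  # borrow a client backed by that pool
conn.ping()                    # standard redis-py health check
```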
### Distributed Locking
- `get_redis_mutex_lock(name, expire, auto_renewal, id)`: Get a sync distributed lock
- `get_async_redis_mutex_lock(name, expire, auto_renewal)`: Get an async distributed lock
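A minimal sketch of the `auto_renewal` option listed above; the renewal behavior (extending the lock while the holder is still running) is an assumption based on the parameter name:
```python
from rediskit import get_redis_mutex_lock


def run_nightly_job() -> None:
    ...  # hypothetical long-running task


# expire=60 bounds how long a crashed holder can block others;
# auto_renewal=True is assumed to keep extending the lock while the block runs.
with get_redis_mutex_lock("nightly_job", expire=60, auto_renewal=True):
    run_nightly_job()
```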
### Encryption
- `Encrypter(keyHexDict)`: Encryption/decryption with key versioning
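The README does not document the shape of `keyHexDict`. A hedged sketch of multi-version keys, assuming a dict mapping version labels to hex keys (the label `__enc_v1` appears in the base64-encoded `REDISKIT_ENCRYPTION_SECRET` example above):
```python
from rediskit import Encrypter

# Assumption: keyHexDict maps version labels to hex-encoded keys, with the
# newest version used for encryption and older versions kept for decryption.
old_key = "..."  # existing key material (elided)
new_key = Encrypter().generate_new_hex_key()

encrypter = Encrypter(keyHexDict={"__enc_v1": old_key, "__enc_v2": new_key})
```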
## Requirements
- Python 3.12+
- Redis server
- Dependencies: redis, redis-lock, nacl, zstd
## License
Apache-2.0 license
Raw data
{
"_id": null,
"home_page": null,
"name": "rediskit",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.12",
"maintainer_email": "Badr Elfarri <badr.elfarri@gmail.com>",
"keywords": "redis, cache, memoization, toolkit, async",
"author": null,
"author_email": "Badr Elfarri <badr.elfarri@gmail.com>",
"download_url": "https://files.pythonhosted.org/packages/3f/06/6b033a02baf1f48dc444326f821abc9cd4d2c82d3ff13645d54549319587/rediskit-0.0.8.tar.gz",
"platform": null,
"description": "# rediskit\n\nA Python toolkit that provides Redis-backed performance and concurrency primitives for applications. It enables developers to add caching, distributed coordination, and data protection to their Python applications with minimal effort.\n\n## Still work in progress\nMany features are still under development, there will be many breaking changes. Please use at your own risk.\n\n\n## Features\n\n- **Function Result Caching**: Use the `@RedisMemoize` decorator to cache expensive function calls with automatic serialization, compression, and encryption\n- **Distributed Coordination**: Redis-based distributed locks and semaphores for coordinating access across multiple processes/machines\n- **Data Protection**: Multi-version encryption keys with automatic key rotation for sensitive cached data\n- **Async Support**: Full support for both synchronous and asynchronous applications\n- **Flexible Storage**: Choose between string or hash-based Redis storage patterns\n- **Modern Type Hints**: Full type safety with Python 3.12+ syntax\n\n## Installation\n\n```bash\nuv add rediskit\n# or\npoetry add rediskit\n```\n\n## Quick Start\n\n### Basic Setup\n\n```python\nfrom rediskit import redis_memoize, init_redis_connection_pool\n\n# Initialize Redis connection pool (call once at app startup)\ninit_redis_connection_pool()\n\n\n# Cache expensive function results\n@redis_memoize(memoize_key=\"expensive_calc\", ttl=300)\ndef expensive_calculation(tenantId: str, value: int) -> dict:\n # Simulate expensive computation\n import time\n time.sleep(2)\n return {\"result\": value * 42}\n\n\n# Usage\nresult = expensive_calculation(\"tenant1\", 10) # Takes 2 seconds\nresult = expensive_calculation(\"tenant1\", 10) # Returns instantly from cache\n```\n\n### Custom Redis Connection\n\n```python\nimport redis\nfrom rediskit import redis_memoize\n\n# Use your own Redis connection\nmy_redis = redis.Redis(host='my-redis-host', port=6379, db=1)\n\n\n@redis_memoize(\n memoize_key=\"custom_calc\",\n ttl=600,\n connection=my_redis\n)\ndef my_function(tenantId: str, data: dict) -> dict:\n return {\"processed\": data}\n```\n\n### Advanced Caching Options\n\n```python\nfrom rediskit import redis_memoize\n\n\n# Hash-based storage with encryption\n@redis_memoize(\n memoize_key=lambda tenantId, user_id: f\"user_profile:{tenantId}:{user_id}\",\n ttl=3600,\n storage_type=\"hash\", # Store in Redis hash for efficient field access\n enable_encryption=True, # Encrypt sensitive data\n cache_type=\"zipJson\" # JSON serialization with compression\n)\ndef get_user_profile(tenantId: str, user_id: str) -> dict:\n # Fetch user data from database\n return {\"user_id\": user_id, \"name\": \"John Doe\", \"email\": \"john@example.com\"}\n\n\n# Dynamic TTL and cache bypass\n@redis_memoize(\n memoize_key=\"dynamic_data\",\n ttl=lambda tenantId, priority: 3600 if priority == \"high\" else 300,\n bypass_cache=lambda tenantId, force_refresh: force_refresh\n)\ndef get_dynamic_data(tenantId: str, priority: str, force_refresh: bool = False) -> dict:\n return {\"data\": \"fresh_data\", \"priority\": priority}\n```\n\n### Async Support\n\n```python\nimport asyncio\nfrom rediskit import redis_memoize, init_async_redis_connection_pool\n\n# Initialize async Redis connection pool\nawait init_async_redis_connection_pool()\n\n\n@redis_memoize(memoize_key=\"async_calc\", ttl=300)\nasync def async_expensive_function(tenantId: str, value: int) -> dict:\n await asyncio.sleep(1) # Simulate async work\n return {\"async_result\": value * 
100}\n\n\n# Usage\nresult = await async_expensive_function(\"tenant1\", 5)\n```\n\n### Distributed Locking\n\n```python\nfrom rediskit import get_redis_mutex_lock, get_async_redis_mutex_lock\n\n# Synchronous distributed lock\nwith get_redis_mutex_lock(\"critical_section\", expire=30) as lock:\n # Only one process can execute this block at a time\n perform_critical_operation()\n\n# Async distributed lock\nasync with get_async_redis_mutex_lock(\"async_critical_section\", expire=30) as lock:\n await perform_async_critical_operation()\n```\n\n### Encryption Management\n\n```python\nfrom rediskit import Encrypter\n\n# Generate new encryption keys\nencrypter = Encrypter()\nnew_key = encrypter.generate_new_hex_key()\n\n# Encrypt/decrypt data manually\nencrypted = encrypter.encrypt(\"sensitive data\", useZstd=True)\ndecrypted = encrypter.decrypt(encrypted)\n```\n\n## Configuration\n\nConfigure rediskit using environment variables:\n\n```bash\n# Redis connection settings\nexport REDISKIT_REDIS_HOST=\"localhost\"\nexport REDISKIT_REDIS_PORT=\"6379\"\nexport REDISKIT_REDIS_PASSWORD=\"\"\n\n# Encryption keys (base64-encoded JSON)\nexport REDISKIT_ENCRYPTION_SECRET=\"eyJfX2VuY192MSI6ICI0MGViODJlNWJhNTJiNmQ4...\"\n\n# Cache settings\nexport REDISKIT_REDIS_TOP_NODE=\"my_app_cache\"\nexport REDISKIT_REDIS_SKIP_CACHING=\"false\"\n```\n\n## API Reference\n\n### Core Decorators\n\n#### `@RedisMemoize`\n\nCache function results in Redis with configurable options.\n\n**Parameters:**\n- `memoizeKey`: Cache key (string or callable)\n- `ttl`: Time to live in seconds (int, callable, or None)\n- `bypassCache`: Skip cache lookup (bool or callable)\n- `cacheType`: Serialization method (\"zipJson\" or \"zipPickled\")\n- `resetTtlUponRead`: Refresh TTL when reading from cache\n- `enableEncryption`: Encrypt cached data\n- `storageType`: Redis storage pattern (\"string\" or \"hash\")\n- `connection`: Custom Redis connection (optional)\n\n### Connection Management\n\n- `init_redis_connection_pool()`: Initialize sync Redis connection pool\n- `init_async_redis_connection_pool()`: Initialize async Redis connection pool\n- `get_redis_connection()`: Get sync Redis connection\n- `get_async_redis_connection()`: Get async Redis connection\n\n### Distributed Locking\n\n- `GetRedisMutexLock(name, expire, auto_renewal, id)`: Get sync distributed lock\n- `GetAsyncRedisMutexLock(name, expire, auto_renewal)`: Get async distributed lock\n\n### Encryption\n\n- `Encrypter(keyHexDict)`: Encryption/decryption with key versioning\n\n## Requirements\n\n- Python 3.12+\n- Redis server\n- Dependencies: redis, redis-lock, nacl, zstd\n\n## License\nApache-2.0 license\n",
"bugtrack_url": null,
"license": "Apache-2.0",
"summary": "A comprehensive Redis toolkit for Python with caching, memoization, and utilities",
"version": "0.0.8",
"project_urls": {
"Bug Tracker": "https://github.com/badrelfarri/rediskit/issues",
"Changelog": "https://github.com/badrelfarri/rediskit/blob/main/CHANGELOG.md",
"Documentation": "https://github.com/badrelfarri/rediskit#readme",
"Homepage": "https://github.com/badrelfarri/rediskit",
"Repository": "https://github.com/badrelfarri/rediskit"
},
"split_keywords": [
"redis",
" cache",
" memoization",
" toolkit",
" async"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "488a2f55c7cabe025179e63857320f8b225de342bb1c924de862b6e471472cbe",
"md5": "be4a23873bbd9b1b4271aa7be80c7338",
"sha256": "d9daf4d183cc9a98bf3d5a2a03880e0fbff99fa1355229698f35f406070fa958"
},
"downloads": -1,
"filename": "rediskit-0.0.8-py3-none-any.whl",
"has_sig": false,
"md5_digest": "be4a23873bbd9b1b4271aa7be80c7338",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.12",
"size": 22291,
"upload_time": "2025-07-10T07:28:56",
"upload_time_iso_8601": "2025-07-10T07:28:56.667334Z",
"url": "https://files.pythonhosted.org/packages/48/8a/2f55c7cabe025179e63857320f8b225de342bb1c924de862b6e471472cbe/rediskit-0.0.8-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "3f066b033a02baf1f48dc444326f821abc9cd4d2c82d3ff13645d54549319587",
"md5": "3e3f059439a66dbc13129509ff5bec91",
"sha256": "6cd48a879448ae4bc4d23fa62a1d6c780056afe4ed5ca2158a75c7598ed7a985"
},
"downloads": -1,
"filename": "rediskit-0.0.8.tar.gz",
"has_sig": false,
"md5_digest": "3e3f059439a66dbc13129509ff5bec91",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.12",
"size": 37605,
"upload_time": "2025-07-10T07:28:58",
"upload_time_iso_8601": "2025-07-10T07:28:58.034227Z",
"url": "https://files.pythonhosted.org/packages/3f/06/6b033a02baf1f48dc444326f821abc9cd4d2c82d3ff13645d54549319587/rediskit-0.0.8.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-07-10 07:28:58",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "badrelfarri",
"github_project": "rediskit",
"travis_ci": false,
"coveralls": false,
"github_actions": true,
"lcname": "rediskit"
}