# base-cacheable-class
A flexible base class for adding caching capabilities to your Python classes.
## Features
- Simple decorator-based caching for class methods
- Support for both in-memory and Redis cache backends
- TTL (Time To Live) support
- Cache invalidation strategies
- Async/await support
- Type hints included
## To come
- Synchronous operation support
## Installation
```bash
uv add base-cacheable-class
```
For Redis support:
```bash
uv add "base-cacheable-class[redis]"
```
## Quick Start
### Basic Usage with In-Memory Cache
```python
import asyncio
from base_cacheable_class import BaseCacheableClass, InMemoryCache, InMemoryCacheDecorator


class UserService(BaseCacheableClass):
    def __init__(self):
        cache = InMemoryCache()
        cache_decorator = InMemoryCacheDecorator(cache, default_ttl=300)  # 5 minutes default
        super().__init__(cache_decorator)

    @BaseCacheableClass.cache(ttl=60)  # Cache for 1 minute
    async def get_user(self, user_id: int):
        # Expensive operation here
        return {"id": user_id, "name": f"User {user_id}"}

    @BaseCacheableClass.invalidate("get_user", param_mapping={"user_id": "user_id"})
    async def update_user(self, user_id: int, name: str):
        # Update user logic here
        return {"id": user_id, "name": name}


async def main():
    service = UserService()

    # First call - will execute the function
    user = await service.get_user(1)
    print(user)  # {"id": 1, "name": "User 1"}

    # Second call - will return the cached result
    user = await service.get_user(1)
    print(user)  # {"id": 1, "name": "User 1"} (from cache)

    # Update user - will invalidate the cached entry
    await service.update_user(1, "Updated User")

    # Next call - will execute the function again
    user = await service.get_user(1)
    print(user)  # {"id": 1, "name": "User 1"}


if __name__ == "__main__":
    asyncio.run(main())
```
### Using Redis Cache
```python
import asyncio
from base_cacheable_class import BaseCacheableClass, RedisCache, RedisCacheDecorator


class ProductService(BaseCacheableClass):
    def __init__(self):
        cache = RedisCache(
            host="localhost",
            port=6379,
            password="your_password",
            db=0,
        )
        cache_decorator = RedisCacheDecorator(cache, default_ttl=3600)  # 1 hour default
        super().__init__(cache_decorator)

    @BaseCacheableClass.cache(ttl=300)  # Cache for 5 minutes
    async def get_product(self, product_id: int):
        # Fetch product from database
        return {"id": product_id, "name": f"Product {product_id}"}

    @BaseCacheableClass.invalidate_all()
    async def refresh_catalog(self):
        # Clear all caches when catalog is refreshed
        return "Catalog refreshed"


async def main():
    service = ProductService()

    # Use the service
    product = await service.get_product(1)
    print(product)

    # Clear all caches
    await service.refresh_catalog()


if __name__ == "__main__":
    asyncio.run(main())
```
Special thanks to: https://github.com/Ilevk/fastapi-tutorial/blob/098d3a05f224220cc2cd5125dea5c5cf7bb810ab/app/core/redis.py#L35
## API Reference
### Decorators
#### `@BaseCacheableClass.cache(ttl=None)`
Cache the decorated method's result. If `ttl` is `None`, the result is cached indefinitely.
#### `@BaseCacheableClass.invalidate(target_func_name, param_mapping=None)`
Invalidate cache for specific function when the decorated method is called.
- `target_func_name`: Name of the function whose cache should be invalidated
- `param_mapping`: Dict mapping current function params to target function params
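For illustration, here is a hypothetical `rename_user` method whose `uid` parameter maps onto `get_user`'s `user_id`. This sketch assumes the mapping is keyed by the decorated method's parameter name and valued by the target method's parameter name, consistent with the Quick Start example:

```python
from base_cacheable_class import BaseCacheableClass, InMemoryCache, InMemoryCacheDecorator


class UserService(BaseCacheableClass):
    def __init__(self):
        super().__init__(InMemoryCacheDecorator(InMemoryCache(), default_ttl=300))

    @BaseCacheableClass.cache(ttl=60)
    async def get_user(self, user_id: int):
        return {"id": user_id, "name": f"User {user_id}"}

    # Hypothetical method: "uid" (this method's parameter) is mapped to
    # "user_id" (get_user's parameter), so only that user's entry is invalidated.
    @BaseCacheableClass.invalidate("get_user", param_mapping={"uid": "user_id"})
    async def rename_user(self, uid: int, new_name: str):
        return {"id": uid, "name": new_name}
```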
#### `@BaseCacheableClass.invalidate_all()`
Clear all caches when the decorated method is called.
### Cache Backends
#### InMemoryCache
Simple in-memory cache backed by a dictionary. The singleton pattern ensures a single shared instance.
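A minimal sketch of what the singleton note implies; this assumes that constructing `InMemoryCache` repeatedly yields the same underlying instance, which the package does not spell out here:

```python
from base_cacheable_class import InMemoryCache

# Assuming the singleton behavior described above, both names refer to the same cache.
cache_a = InMemoryCache()
cache_b = InMemoryCache()
assert cache_a is cache_b
```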
#### RedisCache
Redis-based cache with support for distributed systems.
Constructor parameters:
- `host`: Redis host
- `port`: Redis port
- `password`: Redis password
- `db`: Redis database number (default: 0)
- `username`: Redis username
- `socket_timeout`: Socket timeout in seconds (default: 0.5)
- `socket_connect_timeout`: Connection timeout in seconds (default: 0.5)
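Putting the parameters above together (the host, credentials, and timeout values below are placeholders):

```python
from base_cacheable_class import RedisCache, RedisCacheDecorator

cache = RedisCache(
    host="redis.example.internal",   # placeholder host
    port=6379,
    username="app-user",             # optional
    password="your_password",
    db=0,
    socket_timeout=0.5,              # defaults shown explicitly
    socket_connect_timeout=0.5,
)
cache_decorator = RedisCacheDecorator(cache, default_ttl=3600)
```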
#### Multi-tier Caching (L1/L2) Patterns
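The sketch below assumes `CacheDecoratorInterface` exposes async `get(key)` and `set(key, value, ttl)` methods, as used in the read path: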
```python
from typing import Any, Optional

from base_cacheable_class import CacheDecoratorInterface  # assumed import path


class MultiTierCacheDecorator(CacheDecoratorInterface):
    """Implements L1/L2 caching with memory as L1 and Redis as L2."""

    def __init__(self, l1_cache: CacheDecoratorInterface, l2_cache: CacheDecoratorInterface):
        self.l1_cache = l1_cache
        self.l2_cache = l2_cache

    async def get(self, key: str) -> Optional[Any]:
        # Try L1 first
        value = await self.l1_cache.get(key)
        if value is not None:
            return value

        # Fall back to L2
        value = await self.l2_cache.get(key)
        if value is not None:
            # Populate L1 so the next read is served from memory
            await self.l1_cache.set(key, value, ttl=60)  # Short TTL for L1

        return value
```
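A hypothetical composition of the two documented decorators into the multi-tier one. Note that the sketch above implements only `get`; a complete decorator would also need to delegate the remaining `CacheDecoratorInterface` methods (at minimum `set`, which the read path already relies on) to both tiers.

```python
from base_cacheable_class import (
    InMemoryCache,
    InMemoryCacheDecorator,
    RedisCache,
    RedisCacheDecorator,
)

l1 = InMemoryCacheDecorator(InMemoryCache(), default_ttl=60)  # per-process, short-lived
l2 = RedisCacheDecorator(
    RedisCache(host="localhost", port=6379, password="your_password"),
    default_ttl=3600,                                         # shared, longer-lived
)
multi_tier = MultiTierCacheDecorator(l1, l2)
```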
## License
MIT License