# Pydantic Typed Cache
An async cache library for Pydantic models without FastAPI dependencies. This library provides a simple decorator-based caching mechanism for async functions that return Pydantic models or other Python objects.
> **Note**: This project was inspired by [fastapi-cache](https://github.com/long2ice/fastapi-cache) but designed to work independently of FastAPI/Starlette, making it suitable for any async Python application.
## Features
- 🚀 Simple decorator-based caching for async functions
- 🔄 Support for both async and sync functions (sync functions run in thread pool)
- 📦 Multiple backends: Redis, In-Memory
- 🎯 Type-safe with Pydantic model support
- 🔑 Customizable cache key generation
- 📝 Multiple serialization options (JSON, orjson, Pickle)
- ⚡ Zero FastAPI/Starlette dependencies
- 🔧 Flexible type conversion with `model` parameter
- ✅ Support for nullable/optional types with proper None handling
- 🎭 Support for Union types and complex type hints
## Installation
```bash
# Basic installation
pip install pydantic-typed-cache

# With orjson for faster JSON serialization
pip install "pydantic-typed-cache[orjson]"

# With development dependencies
pip install "pydantic-typed-cache[dev]"
```
## Quick Start
```python
import asyncio

from pydantic import BaseModel

from pydantic_cache import PydanticCache, cache
from pydantic_cache.backends.inmemory import InMemoryBackend

# Define a Pydantic model
class User(BaseModel):
    id: int
    name: str
    email: str

# Initialize the cache
backend = InMemoryBackend()
PydanticCache.init(backend, prefix="myapp", expire=60)

# Cache a function
@cache(expire=120, namespace="users")
async def get_user(user_id: int) -> User:
    # Expensive operation (e.g., a database query)
    return User(id=user_id, name="John", email="john@example.com")

# Use the cached function
async def main():
    user = await get_user(1)  # First call - cache miss
    user = await get_user(1)  # Second call - cache hit

asyncio.run(main())
```
## Advanced Features
### Type Conversion with `model` Parameter
The `model` parameter allows you to force type conversion of cached values. This works with any Python type, not just Pydantic models:
```python
from typing import Any

# Convert string to int
@cache(model=int)
async def get_count() -> str:
    return "42"  # Will be converted to int 42

# Convert dict to a Pydantic model
class UserResponse(BaseModel):
    id: int
    name: str

@cache(model=UserResponse)
async def get_user_data(user_id: int) -> dict:
    # Returns a dict that will be converted to UserResponse
    return {"id": user_id, "name": "Alice"}

# Work with collections
@cache(model=list[int])
async def get_scores() -> list[str]:
    return ["95", "87", "92"]  # Converted to [95, 87, 92]

# Support Union types
@cache(model=int | str)
async def get_flexible_value(use_number: bool) -> Any:
    return 123 if use_number else "hello"

# Support Optional types
@cache(model=int | None)
async def get_optional_value(value: str | None) -> str | None:
    return value  # "123" becomes 123, None stays None
```
### Nullable/Optional Type Support
The library properly handles None values in optional types, distinguishing between cached None values and cache misses:
```python
# Function returning an Optional type
@cache(namespace="users")
async def get_user(user_id: int) -> User | None:
    if user_id < 0:
        return None  # This None will be cached
    return User(id=user_id, name="John", email="john@example.com")

# First call with -1
user = await get_user(-1)  # Returns None (cache miss, stores None)
user = await get_user(-1)  # Returns None (cache hit, retrieves None)

# Pydantic models with optional fields
class Profile(BaseModel):
    id: int
    bio: str | None = None
    age: int | None = None

@cache(namespace="profiles")
async def get_profile(user_id: int) -> Profile:
    return Profile(id=user_id, bio=None, age=None)  # None fields are properly cached
```
### Sync Function Support
Sync functions are automatically wrapped and run in a thread pool:
```python
@cache(namespace="compute")
def expensive_computation(x: int, y: int) -> int:
    # This sync function will run in a thread pool
    import time
    time.sleep(1)
    return x * y

# Must be awaited like an async function
result = await expensive_computation(10, 20)
```
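Under the hood, this kind of wrapping is commonly built on `asyncio.to_thread`, which runs a blocking callable in the default thread pool. A minimal, self-contained sketch of the idea without the caching layer (the `async_wrap` decorator is hypothetical, not this library's API):

```python
import asyncio
import functools
import time

def async_wrap(fn):
    """Turn a blocking function into an awaitable by running it in a worker thread."""
    @functools.wraps(fn)
    async def wrapper(*args, **kwargs):
        return await asyncio.to_thread(fn, *args, **kwargs)
    return wrapper

@async_wrap
def expensive_computation(x: int, y: int) -> int:
    time.sleep(0.01)  # simulate blocking work without stalling the event loop
    return x * y

result = asyncio.run(expensive_computation(10, 20))
print(result)  # 200
```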
## Backends
### In-Memory Backend
Perfect for development and testing:
```python
from pydantic_cache.backends.inmemory import InMemoryBackend

backend = InMemoryBackend()
PydanticCache.init(backend)
```
### Redis Backend
For production use with persistence:
```python
from redis.asyncio import Redis

from pydantic_cache.backends.redis import RedisBackend

redis = Redis(host="localhost", port=6379)
backend = RedisBackend(redis)
PydanticCache.init(backend)
```
## Configuration
### Global Configuration
```python
PydanticCache.init(
    backend=backend,
    prefix="myapp",              # Prefix for all cache keys
    expire=300,                  # Default expiration in seconds
    coder=JsonCoder,             # Serialization method (JsonCoder, OrjsonCoder, or PickleCoder)
    key_builder=my_key_builder,  # Custom key builder function
    enable=True,                 # Enable/disable caching globally
)
```
### Per-Decorator Configuration
```python
@cache(
    expire=120,                      # Override the default expiration
    namespace="users",               # Namespace for this function
    coder=PickleCoder,               # Override the default coder
    key_builder=custom_key_builder,  # Custom key builder
    model=UserModel,                 # Force type conversion
)
async def cached_function():
    pass
```
## Serialization
### JsonCoder (Default)
- Human-readable cache values
- Good for debugging
- Supports most Python types and Pydantic models
- Moderate performance
### OrjsonCoder (Recommended for performance)
- **2-3x faster** than standard JSON
- Efficient datetime handling
- Better performance with large datasets
- Requires: `pip install pydantic-typed-cache[orjson]`
```python
from pydantic_cache import OrjsonCoder

PydanticCache.init(backend, coder=OrjsonCoder)
```
### PickleCoder
- Supports all Python objects
- Fast serialization
- Binary format (not human-readable)
- Better for complex nested structures
```python
from pydantic_cache.coder import JsonCoder, OrjsonCoder, PickleCoder

# Set globally with a default instance
PydanticCache.init(backend, coder=OrjsonCoder())  # Recommended

# Or with custom configuration
custom_coder = OrjsonCoder(default=my_handler)
PydanticCache.init(backend, coder=custom_coder)

# Or per decorator
@cache(coder=JsonCoder())  # Default configuration
async def my_function():
    pass

# Or with custom configuration per function
@cache(coder=JsonCoder(default=my_handler))
async def my_other_function():
    pass
```
### Performance Comparison
| Coder | Speed | Human Readable | Size | Use Case |
|-------|-------|----------------|------|----------|
| JsonCoder | Moderate | ✅ | Small | Debugging, small data |
| OrjsonCoder | Fast | ✅ | Small | Production, large data |
| PickleCoder | Fast | ❌ | Medium | Complex objects |
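The relative numbers depend on your machine and payload, but you can measure the speed/size trade-off yourself. This sketch uses the stdlib `json` and `pickle` modules as stand-ins for the coders (orjson, when installed, is typically faster still):

```python
import json
import pickle
import timeit

payload = {"id": 1, "name": "Alice", "scores": list(range(100))}

json_bytes = json.dumps(payload).encode()
pickle_bytes = pickle.dumps(payload, protocol=pickle.HIGHEST_PROTOCOL)

# Compare serialized sizes and encode times (results vary by machine)
print("json size:", len(json_bytes), "pickle size:", len(pickle_bytes))
print("json  encode:", timeit.timeit(lambda: json.dumps(payload), number=2000))
print("pickle encode:", timeit.timeit(lambda: pickle.dumps(payload), number=2000))

# Whatever the speed, round-trips must be lossless either way
assert json.loads(json_bytes) == payload
assert pickle.loads(pickle_bytes) == payload
```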
### Custom Serialization
All coders support instance-based configuration for custom serialization:
#### OrjsonCoder with Custom Types
```python
from bson import ObjectId  # from pymongo/bson

from pydantic_cache import OrjsonCoder

# Handler for types orjson cannot serialize natively
def handle_objectid(obj):
    if isinstance(obj, ObjectId):
        return str(obj)
    raise TypeError  # Let orjson raise for other unsupported types

# Create a coder instance with the custom handler
custom_coder = OrjsonCoder(default=handle_objectid)

# Use it with the decorator
@cache(coder=custom_coder)
async def get_document(doc_id: str) -> dict:
    return {
        "_id": ObjectId(doc_id),
        "name": "Document",
        "tags": [ObjectId("..."), ObjectId("...")],  # Nested structures are handled automatically
    }
```
#### JsonCoder with Custom Handler
```python
from decimal import Decimal

from bson import ObjectId  # from pymongo/bson

from pydantic_cache import JsonCoder

def handle_custom_types(obj):
    if isinstance(obj, ObjectId):
        return str(obj)
    if isinstance(obj, Decimal):
        return float(obj)  # Convert to float instead of string
    raise TypeError  # Let the default encoder handle other types

# Create a coder with the custom handler (same interface as OrjsonCoder)
custom_coder = JsonCoder(default=handle_custom_types)

@cache(coder=custom_coder)
async def get_data():
    return {"id": ObjectId("..."), "price": Decimal("99.99")}
```
#### PickleCoder with Protocol Version
```python
import pickle

from pydantic_cache import PickleCoder

# Use a specific pickle protocol version
coder = PickleCoder(protocol=pickle.HIGHEST_PROTOCOL)

@cache(coder=coder)
async def get_complex_object():
    return complex_python_object
```
## Cache Management
```python
# Clear a specific key
await PydanticCache.clear(key="specific_key")

# Clear an entire namespace
await PydanticCache.clear(namespace="users")

# Clear all cached entries
await PydanticCache.clear()

# Disable caching temporarily
PydanticCache.set_enable(False)

# Re-enable caching
PydanticCache.set_enable(True)

# Check whether caching is enabled
is_enabled = PydanticCache.get_enable()
```
## Custom Key Builder
Create custom cache key generation logic:
```python
from pydantic_cache.types import KeyBuilder

def custom_key_builder(
    func,
    namespace: str,
    args: tuple,
    kwargs: dict,
) -> str:
    # Custom logic to generate the cache key
    func_name = func.__name__
    args_str = "_".join(str(arg) for arg in args)
    return f"{namespace}:{func_name}:{args_str}"

# Use globally
PydanticCache.init(backend, key_builder=custom_key_builder)

# Or per decorator
@cache(key_builder=custom_key_builder)
async def my_function():
    pass
```
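Key builders that join raw argument strings can produce long or collision-prone keys; a common alternative is to hash the call signature so keys stay short and backend-safe. A hedged sketch of that approach (`hashed_key_builder` is illustrative, not this library's default):

```python
import hashlib

def hashed_key_builder(func, namespace: str, args: tuple, kwargs: dict) -> str:
    """Build a short, stable key by hashing the function identity and arguments."""
    # Sorting kwargs makes the key independent of keyword order
    raw = f"{func.__module__}:{func.__name__}:{args!r}:{sorted(kwargs.items())!r}"
    digest = hashlib.md5(raw.encode()).hexdigest()
    return f"{namespace}:{func.__name__}:{digest}"

def get_user(user_id: int):  # stand-in function for demonstration
    ...

key = hashed_key_builder(get_user, "users", (1,), {})
print(key)  # e.g. "users:get_user:<32-char hex digest>"
```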
## Examples
### Complex Type Conversions
```python
from typing import Any

# Nested structures
@cache(model=list[list[int]])
async def get_matrix() -> list[list[str]]:
    return [["1", "2"], ["3", "4"]]  # Converted to [[1, 2], [3, 4]]

# List of Pydantic models
class Item(BaseModel):
    id: int
    name: str
    price: float

@cache(model=list[Item])
async def get_items() -> list[dict]:
    return [
        {"id": "1", "name": "Item 1", "price": "9.99"},
        {"id": "2", "name": "Item 2", "price": "19.99"},
    ]

# Complex Union types
@cache(model=User | dict | None)
async def get_flexible_data(data_type: str) -> Any:
    if data_type == "user":
        return {"id": 1, "name": "Alice", "email": "alice@example.com"}
    elif data_type == "dict":
        return {"key": "value"}
    else:
        return None
```
### Model-to-Model Conversion
```python
class DetailedUser(BaseModel):
    id: int
    name: str
    email: str
    age: int
    bio: str

class SimpleUser(BaseModel):
    id: int
    name: str

# Convert DetailedUser to SimpleUser
@cache(model=SimpleUser)
async def get_simple_user(user_id: int) -> DetailedUser:
    return DetailedUser(
        id=user_id,
        name="Alice",
        email="alice@example.com",
        age=30,
        bio="Developer",
    )

# The result will be a SimpleUser carrying only id and name
```
### Working with External APIs
```python
class APIResponse(BaseModel):
    status: str
    data: dict
    timestamp: str | None = None

# Force API responses to be validated as Pydantic models
@cache(model=APIResponse, expire=300, namespace="api")
async def fetch_from_api(endpoint: str) -> dict:
    # Make the actual API call here
    return {
        "status": "success",
        "data": {"result": "some data"},
        "timestamp": "2024-01-01T12:00:00Z",
    }

# The result is always validated as an APIResponse model
response = await fetch_from_api("/users")
print(response.status)  # Type-safe access
```
## Testing
```python
import pytest

from pydantic_cache import PydanticCache, cache
from pydantic_cache.backends.inmemory import InMemoryBackend

@pytest.fixture
async def cache_setup():
    backend = InMemoryBackend()
    PydanticCache.init(backend, prefix="test", expire=60)
    yield
    await backend.clear()

async def test_caching(cache_setup):
    call_count = 0

    @cache(namespace="test")
    async def get_value():
        nonlocal call_count
        call_count += 1
        return "result"

    result1 = await get_value()
    result2 = await get_value()

    assert result1 == result2
    assert call_count == 1  # Called only once thanks to caching
```
## Acknowledgments
This project was inspired by [fastapi-cache](https://github.com/long2ice/fastapi-cache) by @long2ice. While fastapi-cache provides excellent caching capabilities for FastAPI applications, pydantic-typed-cache was created to offer similar functionality for general async Python applications without the FastAPI/Starlette dependency.
## License
MIT