| Field | Value |
|-------|-------|
| Name | cacherator |
| Version | 1.0.9 |
| home_page | None |
| Summary | A Python library for persistent JSON-based caching of class state and function results. |
| upload_time | 2025-03-07 08:02:56 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.7 |
| license | MIT |
| keywords | cache, json, data storage |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# Cacherator
Cacherator is a Python package that provides persistent JSON-based caching for class state and function results. It enables significant performance improvements by caching expensive computations and preserving object state between program executions.
## Installation
You can install Cacherator using pip:
```bash
pip install cacherator
```
## Features
- Persistent caching of function results
- Customizable Time-To-Live (TTL) for cached data
- Option to clear cache on demand
- JSON-based storage for easy inspection and portability
- Automatic serialization and deserialization of cached data
- Support for instance methods and properties
## Core Components
### 1. JSONCache (Base Class)
The foundation class that enables persistent caching of object state.
```python
from cacherator import JSONCache

class MyClass(JSONCache):
    def __init__(self, data_id=None):
        super().__init__(data_id=data_id)
        # Your initialization code here
```
#### Constructor Parameters
| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `data_id` | `str` | Class name | Unique identifier for the cache file |
| `directory` | `str` | "json/data" | Directory for storing cache files |
| `clear_cache` | `bool` | `False` | Whether to clear existing cache on initialization |
| `ttl` | `timedelta \| int \| float` | 999 (days) | Default time-to-live for cached items |
| `logging` | `bool` | `True` | Whether to enable logging of cache operations |
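A minimal sketch of passing these options to the constructor, assuming only the parameters listed above; the `ReportBuilder` class and the values used here are illustrative:
```python
from datetime import timedelta
from cacherator import JSONCache

class ReportBuilder(JSONCache):  # hypothetical example class
    def __init__(self, report_id):
        super().__init__(
            data_id=f"report_{report_id}",  # name used for the cache file
            directory="json/reports",       # store cache files outside the default "json/data"
            clear_cache=False,              # keep any previously cached state
            ttl=timedelta(days=7),          # cached items expire after 7 days
            logging=False,                  # silence cache-operation logging
        )
        self.report_id = report_id
```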
#### Key Methods
- `json_cache_save()`: Manually save the current state to the cache file
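A short sketch of saving explicitly after changing state, rather than relying on the automatic save; the `SessionState` class and its attributes are hypothetical:
```python
from cacherator import JSONCache

class SessionState(JSONCache):  # hypothetical example class
    def __init__(self, session_id):
        super().__init__(data_id=f"session_{session_id}")

state = SessionState("abc123")
state.last_page = "/checkout"
state.json_cache_save()  # write the current object state to the cache file now
```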
### 2. Cached Decorator
Decorator for caching results of instance methods.
```python
from cacherator import JSONCache, Cached

class MyClass(JSONCache):
    @Cached(ttl=30, clear_cache=False)
    def expensive_calculation(self, param1, param2):
        # Expensive computation here
        return result
```
#### Parameters
| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `ttl` | `timedelta \| int \| float` | None (uses class ttl) | Time-to-live for cached results |
| `clear_cache` | `bool` | `False` | Whether to clear existing cache for this function |
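Because `ttl` accepts a `timedelta` as well as a number of days, sub-day lifetimes can be spelled out explicitly. A minimal sketch; the `ExchangeRates` class and `fetch_rates` method are hypothetical:
```python
from datetime import timedelta
from cacherator import JSONCache, Cached

class ExchangeRates(JSONCache):  # hypothetical example class
    @Cached(ttl=timedelta(hours=6))  # cached results expire after 6 hours
    def fetch_rates(self, base_currency):
        # Placeholder for a slow API call
        return {"base": base_currency, "EUR": 0.92, "GBP": 0.79}
```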
## Usage Patterns
### Basic Usage
```python
from cacherator import JSONCache, Cached
import time

class DataProcessor(JSONCache):
    def __init__(self, dataset_id):
        super().__init__(data_id=f"processor_{dataset_id}")
        self.dataset_id = dataset_id

    @Cached()
    def process_data(self, threshold=0.5):
        print("Processing data (expensive operation)...")
        time.sleep(2)  # Simulate expensive computation
        return [i for i in range(10) if i / 10 > threshold]

# First run - will execute and cache
processor = DataProcessor("dataset1")
result1 = processor.process_data(0.3)  # Executes the function

# Second run - will use cache
processor2 = DataProcessor("dataset1")
result2 = processor2.process_data(0.3)  # Returns cached result

# Different arguments - new cache entry
result3 = processor2.process_data(0.7)  # Executes the function
```
### Cache Clearing
```python
# Clear specific function cache
processor = DataProcessor("dataset1")
result = processor.process_data(0.3, clear_cache=True) # Force recomputation
# Clear all cache for an object
processor = DataProcessor("dataset1", clear_cache=True) # Clear entire object cache
```
### Custom TTL
```python
from datetime import timedelta

class WeatherService(JSONCache):
    def __init__(self, location):
        # Cache weather data for 1 day by default
        super().__init__(data_id=f"weather_{location}", ttl=1)
        self.location = location

    # Cache forecast for only 6 hours
    @Cached(ttl=0.25)  # 0.25 days = 6 hours
    def get_forecast(self):
        # API call to weather service
        pass

    # Cache historical data for 30 days
    @Cached(ttl=30)
    def get_historical_data(self, start_date, end_date):
        # API call to weather service
        pass
```
### State Persistence
```python
class GameState(JSONCache):
    def __init__(self, game_id):
        super().__init__(data_id=f"game_{game_id}")
        # Default values for new games
        if not hasattr(self, "score"):
            self.score = 0
        if not hasattr(self, "level"):
            self.level = 1

    def increase_score(self, points):
        self.score += points
        self.json_cache_save()  # Explicitly save state

    def level_up(self):
        self.level += 1
        # No explicit save needed, will be saved on garbage collection
```
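Run in two separate program executions, the same `game_id` reloads the previously saved attributes. A hedged usage sketch (the printed value assumes the first run completed and saved its state):
```python
# First run
game = GameState("save42")
game.increase_score(100)  # saves state under the default cache directory ("json/data")

# Later run, in a new process
game = GameState("save42")
print(game.score)  # expected to print 100, restored from the cache file
```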
### Custom Directory
```python
import os

class UserProfile(JSONCache):
    def __init__(self, user_id):
        cache_dir = os.path.join("data", "users", user_id[:2])
        super().__init__(
            data_id=user_id,
            directory=cache_dir
        )
```
### Excluding Variables from Cache
```python
class AnalysisEngine(JSONCache):
    def __init__(self, project_id):
        self._excluded_cache_vars = ["temp_data", "sensitive_info"]
        super().__init__(data_id=project_id)
        self.project_id = project_id
        self.results = {}
        self.temp_data = []       # Will not be cached due to exclusion
        self.sensitive_info = {}  # Will not be cached due to exclusion
```
## Best Practices
### When to Use Cacherator
- **DO** use for expensive computations that are called repeatedly with the same parameters
- **DO** use for preserving application state between runs
- **DO** use for reducing API calls or database queries
- **DO** use when results can be serialized to JSON
### When Not to Use Cacherator
- **DON'T** use for functions with non-deterministic results (e.g., random generators)
- **DON'T** use for time-sensitive operations where fresh data is critical
- **DON'T** use for functions with non-serializable results
- **DON'T** use for very simple or fast operations where caching overhead exceeds benefits
### Performance Considerations
- Set appropriate TTL values based on data freshness requirements
- Be aware of disk I/O overhead for frequent cache saves
- Consider excluding large or frequently changing attributes with `_excluded_cache_vars`
- Use dedicated cache directories for better organization and performance
### Error Handling
Cacherator gracefully handles common errors:
- Missing cache files (creates new cache)
- Permission errors (logs error and continues)
- JSON parsing errors (logs error and continues)
- Non-serializable objects (excludes from cache)
## Common Issues and Solutions
### Issue: Cache Not Being Saved
**Possible causes:**
1. Object is not being garbage collected
2. Errors during serialization
**Solutions:**
1. Explicitly call `json_cache_save()` at key points
2. Check for non-serializable attributes and exclude them with `_excluded_cache_vars`
### Issue: Cache Not Being Used
**Possible causes:**
1. Function arguments differ slightly (e.g., floats vs integers)
2. TTL has expired
3. `clear_cache=True` is being used
**Solutions:**
1. Standardize argument types before passing to cached functions (see the sketch after this list)
2. Increase TTL if appropriate
3. Remove `clear_cache=True` parameter or use conditionally
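A small sketch of solution 1, normalizing argument types before the call so repeated calls map to the same cache entry; the `normalized_threshold` helper is illustrative, and this assumes the cache key is derived from the argument values:
```python
def normalized_threshold(value):
    # Cast to float so 0.3 and "0.3" produce the same argument value
    return float(value)

processor = DataProcessor("dataset1")
result_a = processor.process_data(normalized_threshold(0.3))
result_b = processor.process_data(normalized_threshold("0.3"))  # hits the same cache entry as result_a
```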
### Issue: Large Cache Files
**Possible causes:**
1. Caching large data structures
2. Many function calls with different parameters
**Solutions:**
1. Use `_excluded_cache_vars` for large attributes
2. Create separate cache instances for different data sets
## Security Considerations
1. **Sensitive Data**: Avoid caching sensitive information like passwords or API keys
- Either exclude them with `_excluded_cache_vars`
- Or encrypt them before storing
2. **File Permissions**: Cache files are stored as regular files
- Ensure proper file permissions on cache directories
- Consider using more secure storage for sensitive applications
3. **TTL for Sensitive Operations**: Use shorter TTLs for operations with security implications
- Authentication tokens
- User permissions
- Security settings
## Compatibility Notes
Cacherator is compatible with:
- Python 3.7+
- All major operating systems (Windows, macOS, Linux)
- Common serializable Python data types (dict, list, str, int, float, bool, etc.)
- datetime objects (via DateTimeEncoder; see the sketch after this list)
- Most standard library classes that are JSON-serializable
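For instance, a `datetime` returned by a cached method can be stored in the JSON cache file; a minimal sketch, assuming the DateTimeEncoder support noted above (the `BuildInfo` class is hypothetical):
```python
from datetime import datetime
from cacherator import JSONCache, Cached

class BuildInfo(JSONCache):  # hypothetical example class
    @Cached()
    def last_build_time(self):
        return datetime.now()  # serialized via the package's datetime handling
```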
## License
This project is licensed under the MIT License.
## Dependencies
- python-slugify
- logorator
## Links
- GitHub: https://github.com/Redundando/cacherator