# Simple Caching
A small project that standardizes storing (key, value) data for caching purposes.
## Supported caching methods
1. Disk
 - **NpyFS** - NumPy array export
2. Memory
 - **DictMemory** - keys are stored as dict keys and values are recalled from memory
## Example
```
from simple_caching.storage import DictMemory
import numpy as np
import time

def working_fn(image: np.ndarray) -> np.ndarray:
    """Working function that takes 0.2s on average."""
    if np.random.rand() <= 0.2:
        time.sleep(1)
    return (image - image.min()) / (image.max() - image.min())

data = np.random.randn(32, 240, 420, 3).astype(np.float32)  # 32 images of shape 240x420x3

# Standard version. Takes 0.2s per image on average (~6.4s per epoch) for the same processing.
for i in range(100):
    new_data = [working_fn(image) for image in data]
    # do further processing with the result

# Cached version. We store the results in memory.

def key_encode_fn(image: np.ndarray) -> str:
    """Return a cacheable key for each image. We use a string of mean + std, which should be unique enough."""
    return f"{image.mean()}_{image.std()}"

cache = DictMemory(name="images", key_encode_fn=key_encode_fn)
# alternatively, use: cache.map(working_fn, data)
for image in data:
    cache[image] = working_fn(image)

for i in range(100):
    new_data = [cache[image] for image in data]
    # do further processing with the result
```
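Conceptually, the memory cache behaves like a plain dict keyed by `key_encode_fn(item)`. A minimal stand-in illustrating that idea (not the library's actual `DictMemory` implementation) could look like:

```python
import numpy as np

class MiniDictCache:
    """Illustrative in-memory cache: items are mapped to dict keys via key_encode_fn."""

    def __init__(self, key_encode_fn):
        self.key_encode_fn = key_encode_fn
        self._store = {}

    def __setitem__(self, item, value):
        # store the value under the encoded key, not the (unhashable) array itself
        self._store[self.key_encode_fn(item)] = value

    def __getitem__(self, item):
        return self._store[self.key_encode_fn(item)]

    def __contains__(self, item):
        return self.key_encode_fn(item) in self._store

cache = MiniDictCache(lambda a: f"{a.mean()}_{a.std()}")
img = np.random.randn(4, 4).astype(np.float32)
cache[img] = img - img.min()
assert img in cache
```

The key-encoding step is what makes unhashable inputs like NumPy arrays usable as cache keys.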
## Decorator version
```
from simple_caching import cache_fn  # import path assumed
from simple_caching.storage import NpyFS
import numpy as np
import time

def key_encode_fn(image: np.ndarray) -> str:
    """Return a cacheable key for each image. We use a string of mean + std, which should be unique enough."""
    return f"{image.mean()}_{image.std()}"

@cache_fn(NpyFS, key_encode_fn)
def working_fn(image: np.ndarray) -> np.ndarray:
    """Working function that takes 0.2s on average."""
    if np.random.rand() <= 0.2:
        time.sleep(1)
    return (image - image.min()) / (image.max() - image.min())

data = np.random.randn(32, 240, 420, 3).astype(np.float32)

for i in range(100):
    new_data = [working_fn(image) for image in data]
    # do further processing with the result
```
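The decorator pattern can be mimicked with a plain dict to show what it buys you. A hedged sketch, independent of the library (`memo_cache` and the dict backend are illustrative; `NpyFS` would persist results to disk instead):

```python
import functools
import numpy as np

def memo_cache(key_encode_fn):
    """Illustrative decorator: cache a function's results in a dict, keyed by key_encode_fn(arg)."""
    def wrapper(fn):
        store = {}

        @functools.wraps(fn)
        def cached(arg):
            key = key_encode_fn(arg)
            if key not in store:
                store[key] = fn(arg)  # compute once per distinct key
            return store[key]

        cached.store = store  # exposed only for inspection in this sketch
        return cached
    return wrapper

@memo_cache(lambda a: f"{a.mean()}_{a.std()}")
def normalize(image: np.ndarray) -> np.ndarray:
    return (image - image.min()) / (image.max() - image.min())

data = np.random.randn(4, 8, 8).astype(np.float32)
results = [normalize(image) for image in data]  # first pass computes
again = [normalize(image) for image in data]    # second pass hits the cache
```

After the first pass, every subsequent call with the same input returns the stored result without recomputing.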