cachecache

Name: cachecache
Version: 0.2.2
Home page: https://github.com/m-beau/cachecache
Summary: Python package for easy function caching, allowing users to customize cache behavior at call-time.
Upload time: 2024-10-15 08:35:29
Author: Maxime Beau
Requires Python: >=3.6
License: GPL-3.0
            [![PyPI version](https://badge.fury.io/py/cachecache.svg)](https://badge.fury.io/py/cachecache)
[![License: GPLv3](https://img.shields.io/badge/license-GPLv3-blue)](https://opensource.org/license/gpl-3-0)
[![Downloads](https://static.pepy.tech/badge/cachecache)](https://pepy.tech/project/cachecache)

# cachecache: Python function decorator for runtime-configurable caching <img src="https://raw.githubusercontent.com/m-beau/cachecache/master/images/cachecache.png" width="150" title="cachecache" alt="cachecache logo" align="right" vspace="50">

A Python package that provides a simple way to cache function results while letting you reconfigure caching behavior at each function call.

By "caching behavior" reconfigurable at each function call, we mean:
1) Whether to recompute results and overwrite the cache, which is useful for functions whose results depend on data loaded internally (hidden from the function arguments, and therefore from the cache hash) and that can change on disk;
2) Whether to cache the results at all, which is useful for functions that need caching in context A (e.g. recurrent frontend use) but not in context B (e.g. one-off backend use);
3) Where to save the cached results, which is useful for functions that return voluminous data, as it allows their cache to be distributed across several locations depending on the arguments.
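To make these three knobs concrete, here is a toy, stdlib-only sketch of the mechanism (this is an illustration of the idea, not cachecache's actual implementation, which builds on joblib):

```python
import hashlib
import pickle
import tempfile
from pathlib import Path

DEFAULT_CACHE = Path(tempfile.gettempdir()) / "toy_cache"

def toy_cache(func):
    """Cache results to disk, with caching behavior reconfigurable at each call."""
    def wrapper(*args, again=False, cache_results=True, cache_path=None, **kwargs):
        # 3) cache_path relocates the cache for this call only.
        cache_dir = Path(cache_path) if cache_path else DEFAULT_CACHE
        cache_dir.mkdir(parents=True, exist_ok=True)
        # Hash the function name and arguments into a cache key.
        key = hashlib.sha256(pickle.dumps((func.__name__, args, kwargs))).hexdigest()
        cache_file = cache_dir / f"{key}.pkl"
        # 1) again=True ignores any cached result and recomputes.
        if cache_file.exists() and not again:
            return pickle.loads(cache_file.read_bytes())
        result = func(*args, **kwargs)
        # 2) cache_results=False computes without writing anything to disk.
        if cache_results:
            cache_file.write_bytes(pickle.dumps(result))
        return result
    return wrapper

@toy_cache
def slow_square(x):
    return x ** 2

tmp = tempfile.mkdtemp()
assert slow_square(3, cache_path=tmp) == 9  # computed, then written to tmp
assert slow_square(3, cache_path=tmp) == 9  # loaded back from tmp
assert slow_square(4, cache_path=tmp, cache_results=False) == 16  # never cached
```

Note that results hidden from the arguments (e.g. data read from disk inside the function) never enter the hash, which is exactly why knob 1) exists.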

## Features

- 1-liner caching with a decorator: `@cache`
- Caching behavior can be customized on the fly for specific function calls by passing the following arguments to the cached functions:
    - πŸ”„ `again=True`: recompute and overwrite cached results on demand
    - ⏸️ `cache_results=False`: disable caching for specific function calls, for instance when the computed result would take too much room on disk
    - πŸ“ `cache_path='different/caching/path'`: use custom cache locations for specific function calls
- Built on joblib's [Memory](https://joblib.readthedocs.io/en/latest/generated/joblib.Memory.html) class.

## Installation

You can install `cachecache` using pip:

```bash
pip install cachecache
```

## Usage

Here's a basic example of how to use `cachecache`:

```python
from cachecache import cache, Cacher
```

Cache using the default "~/.cachecache" directory and default maximum cache size:
```python
@cache # behind the scenes, "cache" is simply defined as "cache = Cacher()"
def my_cached_function(*args, again=False, cache_results=True, cache_path=None):
    # complex operations involving args...
    results = ...
    return results

result = my_cached_function(arg)  # potentially slow
result = my_cached_function(arg)  # always fast (results loaded from cache)
```

Cache using a custom directory and maximum cache size:
```python
cacher = Cacher("my/custom/caching/path", 10e9) # size in bytes - 10GB
@cacher
def my_cached_function(...):
    ...
```

Recompute results and overwrite cache:
```python
result = my_cached_function(arg, again=True)
```
This proves useful when the results depend on data on disk that can change: that information is not present in the function's arguments, so the cacher cannot detect the change on its own!

Adjust the caching directory at runtime:
```python
result = my_cached_function(arg, cache_path="somewhere/else")
```
This proves useful if you need to distribute the cached results of a function across several disks.

cachecache also provides `distributed_cacher`, which builds a cacher that stores a function's results at a location specified by a custom argument (such as `datapath`):
```python
from cachecache import Cacher, distributed_cacher

global_cacher = Cacher('~/.global_cache')

# Arguments of distributed_cacher:
# - datapath_arg_name (str, optional): The name of the argument in the decorated function
#     that specifies the datapath for the local cache. Defaults to 'datapath'.
# - local_cache_path (str, optional): The relative path to the local cache directory
#     within the datapath. Defaults to '.local_cache' (and results cached at f'{datapath}/.local_cache').
# - global_cache (cachecache.Cacher instance, optional): The global cacher to use by default
#     for cached functions without 'datapath_arg_name' (or when 'datapath_arg_name' is None).
#     Defaults to a cache at '~/.cachecache' (default instance of Cacher()).
dist_cacher = distributed_cacher(datapath_arg_name='datapath',
                                 local_cache_path='.local_cache',
                                 global_cache=global_cacher)

# You can then decorate a function as follows:
@dist_cacher
def my_distributed_cached_function(datapath, ...):
    """
    A function whose results will be cached at 'datapath/.local_cache'
    unless specified otherwise with the cache_path argument.

    Note: works with args and kwargs
    """
    ...
```
Behind the scenes, this works by substituting the value of the specified argument (`datapath_arg_name`) for Cacher's `cache_path` argument; if `cache_path` is also passed explicitly, it takes precedence over `datapath`.

Of course, you can use a single cacher for multiple functions:
```python
@cacher
def foo1(x):
    return x ** 2

@cacher
def foo2(x):
    return x / 10
```

And both of these syntaxes are possible:
```python
cacher = Cacher("my/custom/caching/path")
@cacher
def my_cached_function(...):
    ...

@Cacher("my/custom/caching/path")
def my_cached_function(...):
    ...
```

## License

This project is licensed under the terms of the [GNU General Public License v3.0](https://opensource.org/license/gpl-3-0). You may copy, distribute and modify the software as long as you track changes/dates in source files. Any modifications to or software including (via compiler) GPL-licensed code must also be made available under the GPL along with build & install instructions.

## Support

If you have any questions, issues, or feature requests, please [open an issue](https://github.com/m-beau/cachecache/issues) so that everybody can benefit from your experience! This package is actively maintained.

            
