conditional-cache

Name: conditional-cache
Version: 1.2
Home page: https://github.com/Eric-Canas/ConditionalCache
Summary: Conditional cache is a wrapper over functools.lru_cache that allows for conditionally caching based on the output of the function.
Upload time: 2024-06-02 08:39:53
Author: Eric-Canas
Requires Python: >=3.6
License: MIT
Requirements: none recorded
            # ConditionalCache
<img alt="ConditionalCache" title="ConditionalCache" src="https://raw.githubusercontent.com/Eric-Canas/ConditionalCache/main/resources/logo.png" width="20%" align="left">

**ConditionalCache** is a set of _decorators_ that provide **conditional function memoization** and **selective cache clearing**.

It works under the same interface as most standard cache decorators, such as [functools.lru_cache](https://docs.python.org/es/3/library/functools.html#functools.lru_cache) or [cachetools.ttl_cache](https://cachetools.readthedocs.io/en/latest/#cachetools.TTLCache), but unlocks a new `condition` parameter that determines whether the function result is _memoized_ or not. This gives more granular control over caching behavior, which is useful when you only want to store the output if certain conditions are met, for example when checking whether an entry exists in a database.
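
To make the idea concrete, here is a minimal conceptual sketch of conditional memoization built on a plain dictionary. This is **not** the library's implementation, only an illustration of what the `condition` parameter controls (positional arguments only, no size limit or eviction):

```python
from functools import wraps

def conditional_memoize(condition):
    """Toy decorator: cache a result only when condition(result) is True."""
    def decorator(func):
        cache = {}

        @wraps(func)
        def wrapper(*args):
            if args in cache:
                return cache[args]
            result = func(*args)
            if condition(result):  # store only results worth remembering
                cache[args] = result
            return result
        return wrapper
    return decorator
```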

## Installation

To install **ConditionalCache** simply run:

```bash
pip install conditional-cache
```

## Usage
Working with **ConditionalCache** is as straightforward as using [functools.lru_cache](https://docs.python.org/es/3/library/functools.html#functools.lru_cache), since it exposes the same interface.

```python
from conditional_cache import lru_cache

# Memoize the returned element only when it is different from "Not Found"
@lru_cache(maxsize=64, condition=lambda db_value: db_value != "Not Found")
def element_exists_in_db(element_id: int) -> str:
    print(f"Asked to DB: {element_id}")
    # For the example, let's consider that even elements exist.
    return "Found" if element_id % 2 == 0 else "Not Found"
```

When we call this function, it will be executed **only once** for even numbers, and every time for odd ones.

```python
# Will be executed, and not memoized
print(f"Returned: {element_exists_in_db(element_id=1)}")
# Will be executed again
print(f"Returned: {element_exists_in_db(element_id=1)}\n")

# Will be executed and memoized
print(f"Returned: {element_exists_in_db(element_id=2)}")
# Will return the memoized result without executing again
print(f"Returned: {element_exists_in_db(element_id=2)}")
```

```bash
>> Asked to DB: 1
>> Returned: Not Found
>> Asked to DB: 1
>> Returned: Not Found

>> Asked to DB: 2
>> Returned: Found
>> Returned: Found
```

If, during execution, you perform an action that invalidates a given function result, you can actively remove that element from the cache:

```python
# Will return the result that was memoized before
print(f"Returned: {element_exists_in_db(element_id=2)}\n")
# Remove the element from the cache
element_exists_in_db.cache_remove(element_id=2)

# Will be executed again and memoized
print(f"Returned: {element_exists_in_db(element_id=2)}")
# Will return the memoized result
print(f"Returned: {element_exists_in_db(element_id=2)}")
```

```bash
>> Returned: Found

>> Asked to DB: 2
>> Returned: Found
>> Returned: Found
```

## API Reference

### conditional_cache.lru_cache(maxsize: int = 128, typed: bool = False, condition: callable = lambda x: True)
A _Least Recently Used_ cache. It works the same way as [functools.lru_cache](https://docs.python.org/es/3/library/functools.html#functools.lru_cache), but accepts **conditional storage** and **selective item removal** through `<decorated_function>.cache_remove(**args)`.

- `maxsize`: **int**. The maximum number of elements to keep cached. Once the cache is full, new entries will evict the least recently used ones.
- `typed`: **bool**. Works the same way as in [functools.lru_cache](https://docs.python.org/es/3/library/functools.html#functools.lru_cache). If `True`, function arguments of different types will be cached separately.
- `condition`: **callable**. A function that receives a single parameter (the output of the _decorated_ function) and returns a `boolean`: `True` if the result should be cached, `False` if it should not (see the example sketch below).
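
For instance, the `condition` below caches only non-`None` results, so failed lookups are retried on the next call. `fetch_user` and its data source are hypothetical, included purely to illustrate the parameter:

```python
from typing import Optional
from conditional_cache import lru_cache

# Hypothetical user store used only for this illustration.
_USERS = {1: "alice", 2: "bob"}

# Cache only successful lookups; None results are re-computed on every call.
@lru_cache(maxsize=128, condition=lambda user: user is not None)
def fetch_user(user_id: int) -> Optional[str]:
    print(f"Looking up user {user_id}")
    return _USERS.get(user_id)

fetch_user(user_id=1)  # executed and memoized
fetch_user(user_id=1)  # served from cache
fetch_user(user_id=3)  # executed, not memoized (returns None)

# If user 1 changes, drop its cached entry so the next call re-executes.
fetch_user.cache_remove(user_id=1)
```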

            
