mppfc

- Version: 1.1.2
- Summary: multi-processing persistent function cache
- Author: Richard Hartmann
- Requires Python: >=3.8,<4.0
- License: MIT
- Uploaded: 2023-06-16 07:13:07
# mppfc - Multi-Processing Persistent Function Cache

[![PyPI version](https://badge.fury.io/py/mppfc.svg)](https://badge.fury.io/py/mppfc)

The `mppfc` module speeds up the evaluation of computationally
expensive functions by
a) processing several arguments in parallel and
b) persistently caching the results to disk.
Persistent caching is enabled by simply decorating a given function.
With no more than two extra lines of code, parallel evaluation is realized.

Here is a [minimal example](https://github.com/richard-hartmann/mppfc/blob/main/examples/minimal.py):

```python
import mppfc

@mppfc.MultiProcCachedFunctionDec()
def slow_function(x):
    # complicated stuff
    return x

slow_function.start_mp()
some_range = range(10)  # placeholder: any iterable of arguments
for x in some_range:
    y = slow_function(x)
slow_function.wait()
```
The first time you run this script, every `y` is `None`, since the evaluation
is done by several background processes.
Once `wait()` returns, the results for all parameters have been cached to disk.
So calling the script a second time yields (almost immediately) the
desired results in `y`.

Evaluating only the `for` loop in a Jupyter notebook cell
gives you partial results while the background processes are still doing some work.
That way you can already display the results that have been retrieved so far
(see the examples [simple.ipynb](https://github.com/richard-hartmann/mppfc/blob/main/examples/simple.ipynb)
and [live_update.ipynb](https://github.com/richard-hartmann/mppfc/blob/main/examples/live_update.ipynb)).
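
A minimal sketch of that pattern (reusing `slow_function` and `some_range` from the example
above, and the fact that a call returns `None` while its result is still being computed):

```python
results = {}
for x in some_range:
    y = slow_function(x)
    if y is not None:  # None means this result is still being computed
        results[x] = y
print(f"{len(results)} results available so far")
```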

For a nearly exhaustive example see [full.py](https://github.com/richard-hartmann/mppfc/blob/main/examples/full.py).

### caching class instantiation

*new in Version 1.1*

When class instantiation, i.e. calling `__init__(...)`, takes very long, you can cache the instantiation
by subclassing `mppfc.CacheInit`.

```python
import time
import mppfc

class SomeClass(mppfc.CacheInit):
    """instantiation is being cached simply by subclassing from `CacheInit`"""
    def __init__(self, a, t=1):
        time.sleep(t)
        self.a = a
```
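
A minimal usage sketch, assuming the behavior described above (the timing numbers are only
illustrative): the second instantiation with the same arguments should return almost
immediately, because the instance is loaded from the persistent cache instead of running
`__init__` again.

```python
import time

tic = time.perf_counter()
s1 = SomeClass("foo", t=2)      # first call: runs __init__ and sleeps for ~2 s
print(f"first instantiation:  {time.perf_counter() - tic:.2f} s")

tic = time.perf_counter()
s2 = SomeClass("foo", t=2)      # same arguments: loaded from the cache
print(f"second instantiation: {time.perf_counter() - tic:.2f} s")
```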

Note that subclassing such a cached class is not supported.
If you try, a `CacheInitSubclassError` is raised.
However, you can circumvent this restriction by creating a dummy class used only for caching, e.g.

```python
class S0:
    s0 = 's0'

class S1(S0):
    s1 = 's1'
    def __init__(self, s):
        self.s = s

class S1Cached(mppfc.CacheInit, S1):
    """dummy 'subclass' of S1 with caching"""
    def __init__(self, s):
        super().__init__(s)

class S2(mppfc.CacheInit, S1):
    """S2 inherits from S1 AND is being cached"""
    s2 = "s2"
    def __init__(self, s):
        super().__init__(s)
```

When subclassing from `CacheInit`, the following extra keyword arguments can be used
to control the cache:

* `_CacheInit_serializer`: a function which serializes an object to binary data
    (default is `binfootprint.dump`).
* `_CacheInit_path`: the path where the cache data is stored (default is `'.CacheInit'`).
* `_CacheInit_include_module_name`: if `True` (default), include the name of the module where the class
   is defined in the path where the instances are cached
   (useful during development, when classes might still be moved around or module names are still
   under debate).

### pitfalls

Note that arguments are distinguished by their binary representation obtained from the 
[binfootprint](https://github.com/richard-hartmann/binfootprint) module.
This implies that the integer `1` and the float `1.0` are treated as different arguments, even though
in many numeric situations the result does not differ.

```python
import mppfc
import math

@mppfc.MultiProcCachedFunctionDec()
def pitfall_1(x):
    return math.sqrt(x)

x = 1
print("pitfall_1(x={}) = {}".format(x, pitfall_1(x=x)))
# pitfall_1(x=1) = 1.0
x = 1.0
print("BUT, x={} in cache: {}".format(x, pitfall_1(x=x, _cache_flag="has_key")))
# BUT, x=1.0 in cache: False
print("and obviously: pitfall_1(x={}) = {}".format(x, pitfall_1(x=x, _cache_flag="no_cache")))
# and obviously: pitfall_1(x=1.0) = 1.0
```

The same holds true for lists and tuples.

```python
import mppfc

@mppfc.MultiProcCachedFunctionDec()
def pitfall_2(arr):
    return sum(arr)

arr = [1, 2, 3]
print("pitfall_2(arr={}) = {}".format(arr, pitfall_2(arr=arr)))
# pitfall_2(arr=[1, 2, 3]) = 6
arr = (1, 2, 3)
print("BUT, arr={} in cache: {}".format(arr, pitfall_2(arr=arr, _cache_flag="has_key")))
# BUT, arr=(1, 2, 3) in cache: False
print("and obviously: pitfall_1(arr={}) = {}".format(arr, pitfall_2(arr=arr, _cache_flag="no_cache")))
# and obviously: pitfall_1(arr=(1, 2, 3)) = 6
```
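
One way to avoid such cache misses (a suggestion, not something `mppfc` does for you) is to
normalize the argument types before calling the cached function, so that equivalent inputs
always map to the same cache key:

```python
x = 1
y = pitfall_1(x=float(x))      # always pass a float, so 1 and 1.0 share one cache entry

arr = [1, 2, 3]
s = pitfall_2(arr=tuple(arr))  # always pass a tuple, so [1, 2, 3] and (1, 2, 3) share one entry
```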

For more details see [binfootprint's README](https://github.com/richard-hartmann/binfootprint).

## ToDo

- Set the signature of the wrapper `_cached_init` to the signature of `cls.__init__` (if possible).
  This probably requires some metaclass programming.
- Create online documentation with mkdocs.


## Installation

### pip

    pip install mppfc

### poetry

Using poetry, you can include this package in your project as a dependency.
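
For example:

    poetry add mppfc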

### git

Check out the code from GitHub:

    git clone https://github.com/richard-hartmann/mppfc.git

## Dependencies

 - requires at least Python 3.8
 - uses [`binfootprint`](https://github.com/richard-hartmann/binfootprint)
   to serialize and hash the arguments of a function (see the sketch below)
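
Conceptually, the cache key for a call is derived from the binary representation of its
arguments. The following is only a sketch of that idea (the exact hashing scheme used
internally by `mppfc` may differ); it assumes `binfootprint` is installed:

```python
import hashlib
import binfootprint as bf

args = ["x", 1.0]                                   # some function arguments
binary_footprint = bf.dump(args)                    # deterministic binary representation
cache_key = hashlib.sha256(binary_footprint).hexdigest()
print(cache_key)
```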

## Licence

### MIT licence
Copyright (c) 2023 Richard Hartmann

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

            
