=====
zeppy
=====
.. image:: https://img.shields.io/pypi/v/zeppy.svg
:target: https://pypi.python.org/pypi/zeppy
.. image:: https://img.shields.io/travis/santoshphilip/zeppy.svg
:target: https://travis-ci.com/santoshphilip/zeppy
.. image:: https://readthedocs.org/projects/zeppy/badge/?version=latest
:target: https://zeppy.readthedocs.io/en/latest/?badge=latest
:alt: Documentation Status
distributed processing for eppy
* Free software: Mozilla Public License 2.0 (MPL-2.0)
* Documentation: https://zeppy.readthedocs.io.
Vision
------
To run eppy on multiple nodes in parallel and collect the results.
So what is a node, and why would you want to do this?
A node can be any or all of the following:
- a process (such as E+ running on a single core of a multi-core computer)

  - so we can do multi-processing and run on many cores of a single computer

- a computer

  - so we can run it on multiple computers that are on the same network

- a group of computers in a local network

  - so we can run multiple groups of machines that may be at different locations on different local networks
  - these can also be computers at different cloud locations
  - a single computer in the local network may act as an access node
Features
--------
Do the distributed processing with a single function call and get all the results back.
Sample code ::
    from zeppy import ppipes

    result = ppipes.ipc_parallelpipe(runfunction,
                                     args_list,
                                     nworkers=None)

    # runfunction is a function you will write,
    # that may run idf.run(),
    # gather the total energy use and return it
    # args_list = {args: [idf1, idf2, idf3, ...]}
    #     list of files to run
    # if nworkers=None:
    #     it will start up as many nodes as there are items in args_list
    # if you don't have enough nodes available, you can set nworkers=n.
    #     it will start up n nodes and queue up the runs evenly across the nodes
For example, the above code can do the following:
- ``runfunction`` will run the *idf* file, and return the *total energy usage*
- ``result`` will be a list of *total energy usage* values, in the same order as the items in ``args_list``
- see the comments in the code above, and the sketch below, for greater clarity
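
As a rough illustration, here is a minimal, self-contained sketch of the call pattern.
It is a toy example built on assumptions: it squares numbers instead of running EnergyPlus,
and it assumes ``args_list`` is a plain list with one argument per run
(check the zeppy documentation for the exact shape ``ipc_parallelpipe`` expects) ::

    from zeppy import ppipes


    def runfunction(number):
        """Toy stand-in for an EnergyPlus run.

        In real use this is where you would call idf.run() and
        extract the total energy use from the output files.
        """
        return number * number


    if __name__ == "__main__":
        args_list = [1, 2, 3, 4]  # one item per parallel run
        # nworkers=2 starts two worker nodes and queues the four runs on them
        result = ppipes.ipc_parallelpipe(runfunction, args_list, nworkers=2)
        print(result)  # results come back in the same order as args_list

The ``if __name__ == "__main__":`` guard is a safe habit here, since the worker
nodes may be started as separate processes.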
Credits
-------
This package was created with Cookiecutter_ and the `audreyr/cookiecutter-pypackage`_ project template.
.. _Cookiecutter: https://github.com/audreyr/cookiecutter
.. _`audreyr/cookiecutter-pypackage`: https://github.com/audreyr/cookiecutter-pypackage
Raw data
--------
{
"_id": null,
"home_page": "https://github.com/santoshphilip/zeppy",
"name": "zeppy",
"maintainer": "",
"docs_url": null,
"requires_python": ">=3.5",
"maintainer_email": "",
"keywords": "zeppy",
"author": "Santosh Philip",
"author_email": "Santosh@example.com",
"download_url": "https://files.pythonhosted.org/packages/25/6d/0fbea990517179a4d4b3e1140b69e1703e88ce35284d38da9e3cea0942e8/zeppy-0.1.4.tar.gz",
"platform": null,
"description": "=====\nzeppy\n=====\n\n\n.. image:: https://img.shields.io/pypi/v/zeppy.svg\n :target: https://pypi.python.org/pypi/zeppy\n\n.. image:: https://img.shields.io/travis/santoshphilip/zeppy.svg\n :target: https://travis-ci.com/santoshphilip/zeppy\n\n.. image:: https://readthedocs.org/projects/zeppy/badge/?version=latest\n :target: https://zeppy.readthedocs.io/en/latest/?badge=latest\n :alt: Documentation Status\n\n\n\n\ndistributed processing for eppyy\n\n\n* Free software: Mozilla Public License 2.0 (MPL-2.0)\n* Documentation: https://zeppy.readthedocs.io.\n\n\nVision\n------\n\nTo run eppy on multiple nodes in parallel and collect the results.\n\nSo what is a node and why would you want to do this ?\n\nA node can be any or all of the following:\n\n- a process (such E+ running on a single core on a multi-core computer)\n - so we can do multi-processing and run it on many cores on a single computer\n- a computer\n - so we can run it on multiple computers that are on the same network\n- a group of computer in a local network \n - So we can run multiple groups of machines that may be at different locations on different local networks\n - This can also be computers at different cloud locations\n - a single computer in the local network may act as an access node \n \nFeatures\n--------\n\nDo the distributed processing with a single function call and get all the results back. \n\nSample code ::\n \n import zeppy import ppipes\n \n result = ppipes.ipc_parallelpipe(runfunction, \n args_list, \n nworkers=None)\n\n # runfunction is a function you will write, \n # that may run idf.run(), \n # gather the total energy use and return it\n # args_list = {args: [idf1, idf2, idf3, ...]}\n # list of files to run\n # if nworkers=None: \n # it will start up as many nodes as there are items in args_list\n # if you don't have enough nodes avaliable, you can set nworkers=n.\n # it will start up n nodes and queue up the runs evenly on the nodes\n \n\nFor example the above code can do the following:\n\n- ``runfunction`` will run the *idf* file, and return the *total energy usage*\n- ``result`` will be a list *total energy usage* in the same order as the items in ``args_list``\n- see the comments in the code for greater clarity\n\n\n\n\n\nCredits\n-------\n\nThis package was created with Cookiecutter_ and the `audreyr/cookiecutter-pypackage`_ project template.\n\n.. _Cookiecutter: https://github.com/audreyr/cookiecutter\n.. _`audreyr/cookiecutter-pypackage`: https://github.com/audreyr/cookiecutter-pypackage\n",
"bugtrack_url": null,
"license": "",
"summary": "distributed processing for eppy",
"version": "0.1.4",
"split_keywords": [
"zeppy"
],
"urls": [
{
"comment_text": "",
"digests": {
"md5": "adbaca5c4b760c9dd039dea75f3497e4",
"sha256": "9f8af5547caa3526ea4f86f4f61549bcbb34323221c6ba629cf87ec3e050a10a"
},
"downloads": -1,
"filename": "zeppy-0.1.4-py2.py3-none-any.whl",
"has_sig": false,
"md5_digest": "adbaca5c4b760c9dd039dea75f3497e4",
"packagetype": "bdist_wheel",
"python_version": "py2.py3",
"requires_python": ">=3.5",
"size": 20093,
"upload_time": "2022-12-02T19:21:49",
"upload_time_iso_8601": "2022-12-02T19:21:49.847072Z",
"url": "https://files.pythonhosted.org/packages/fc/8c/f27bf0ae7c49b1cfbd5d76462ce8559e4dc076a874e9d18bb4676055511e/zeppy-0.1.4-py2.py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"md5": "81bf285542251e84002dc7068011bd56",
"sha256": "8e71a0389364e43e745135560ce7031173ac3215d07a35d6f6902dde3739507b"
},
"downloads": -1,
"filename": "zeppy-0.1.4.tar.gz",
"has_sig": false,
"md5_digest": "81bf285542251e84002dc7068011bd56",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.5",
"size": 25310,
"upload_time": "2022-12-02T19:21:53",
"upload_time_iso_8601": "2022-12-02T19:21:53.063439Z",
"url": "https://files.pythonhosted.org/packages/25/6d/0fbea990517179a4d4b3e1140b69e1703e88ce35284d38da9e3cea0942e8/zeppy-0.1.4.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2022-12-02 19:21:53",
"github": true,
"gitlab": false,
"bitbucket": false,
"github_user": "santoshphilip",
"github_project": "zeppy",
"travis_ci": true,
"coveralls": false,
"github_actions": false,
"requirements": [
{
"name": "pyzmq",
"specs": []
},
{
"name": "eppy",
"specs": []
},
{
"name": "witheppy",
"specs": []
},
{
"name": "nbsphinx",
"specs": []
},
{
"name": "black",
"specs": []
}
],
"tox": true,
"lcname": "zeppy"
}