=======================================
trio-parallel: CPU parallelism for Trio
=======================================

Do you have CPU-bound work that just keeps slowing down your Trio_ event loop no
matter what you try? Do you need to get all those cores humming at once? This is the
library for you!

The aim of trio-parallel is to use the lightest-weight, lowest-overhead, lowest-latency
method to achieve CPU parallelism of arbitrary Python code with a dead-simple API.

Resources
---------

============= =============================
License       |license badge|
Documentation |documentation badge|
Chat          |chat badge|
Forum         |forum badge|
Issues        |issues badge|
Repository    |repository badge|
Tests         |tests badge|
Coverage      |coverage badge|
Style         |style badge|
Distribution  | |version badge|
              | |python versions badge|
              | |python interpreters badge|
============= =============================

Example
-------

.. code-block:: python

    import functools
    import multiprocessing
    import trio
    import trio_parallel


    def loop(n):
        # Arbitrary CPU-bound work
        for _ in range(n):
            pass
        print("Loops completed:", n)


    async def amain():
        t0 = trio.current_time()
        async with trio.open_nursery() as nursery:
            # Do CPU-bound work in parallel
            for i in [6, 7, 8] * 4:
                nursery.start_soon(trio_parallel.run_sync, loop, 10 ** i)
            # Event loop remains responsive
            t1 = trio.current_time()
            await trio.sleep(0)
            print("Scheduling latency:", trio.current_time() - t1)
            # This job could take far too long, make it cancellable!
            nursery.start_soon(
                functools.partial(
                    trio_parallel.run_sync, loop, 10 ** 20, cancellable=True
                )
            )
            await trio.sleep(2)
            # Only explicitly cancellable jobs are killed on cancel
            nursery.cancel_scope.cancel()
            print("Total runtime:", trio.current_time() - t0)


    if __name__ == "__main__":
        multiprocessing.freeze_support()
        trio.run(amain)

Additional examples and the full API are available in the documentation_.

Features
--------

- Bypasses the GIL for CPU-bound work
- Minimal API complexity

  - looks and feels like Trio threads_

- Minimal internal complexity

  - No reliance on ``multiprocessing.Pool``, ``ProcessPoolExecutor``, or any background threads

- Cross-platform
- ``print`` just works
- Seamless interoperation with

  - coverage.py_
  - viztracer_
  - cloudpickle_

- Automatic LIFO caching of subprocesses
- Cancel seriously misbehaving code via SIGKILL/TerminateProcess
- Convert segfaults and other scary things to catchable errors
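
That last point is worth seeing directly: a hard crash in a worker surfaces in the parent as a catchable exception rather than taking down your program. A minimal sketch, assuming trio and trio-parallel are installed (the NULL dereference via ``ctypes`` is just one convenient way to force a crash):

.. code-block:: python

    import ctypes

    import trio
    import trio_parallel


    def crash():
        # Dereference a NULL pointer: this hard-crashes the worker process
        # (SIGSEGV on Unix, access violation on Windows).
        ctypes.string_at(0)


    async def amain():
        try:
            await trio_parallel.run_sync(crash)
        except trio_parallel.BrokenWorkerError:
            # The worker died, but the parent process and event loop survive.
            print("worker crashed; main process is fine")


    if __name__ == "__main__":
        trio.run(amain)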

FAQ
---

How does trio-parallel run Python code in parallel?
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Currently, this project is based on ``multiprocessing`` subprocesses and
has all the usual multiprocessing caveats_ (``freeze_support``, pickleable objects
only, executing the ``__main__`` module).
The case for basing these workers on multiprocessing is that it keeps a lot of
complexity outside of the project while offering a set of quirks that users are
likely already familiar with.

The pickling limitations can be partially alleviated by installing cloudpickle_.
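
To see what that limitation means in practice, here is a stdlib-only sketch (no trio required): module-level functions pickle "by reference" and can cross the process boundary, while lambdas and locally defined closures cannot without cloudpickle.

.. code-block:: python

    import pickle


    def top_level(n):
        # Module-level functions pickle by reference (module + qualified name),
        # so they can be sent to a worker subprocess.
        return n * n


    assert pickle.loads(pickle.dumps(top_level))(3) == 9

    # Lambdas and local closures have no importable name, so the stdlib
    # pickler rejects them; cloudpickle serializes them by value instead.
    try:
        pickle.dumps(lambda n: n * n)
    except (pickle.PicklingError, AttributeError) as exc:
        print("not pickleable:", type(exc).__name__)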

Can I have my workers talk to each other?
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

This is currently possible through the use of ``multiprocessing.Manager``,
but we don't and will not officially support it.

This package focuses on providing a flat hierarchy of worker subprocesses to run
synchronous, CPU-bound functions. If you are looking to create a nested hierarchy
of processes communicating asynchronously with each other, while preserving the
power, safety, and convenience of structured concurrency, look into
`tractor <https://github.com/goodboy/tractor>`_. Or, if you are looking for a more
customized solution, try using ``trio.run_process`` to spawn additional Trio runs
and have them talk to each other over sockets.

Can I let my workers outlive the main Trio process?
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

No. Trio's structured concurrency strictly bounds job runs to within a given
``trio.run`` call, while cached idle workers are shut down, and killed if necessary,
by our ``atexit`` handler, so this use case is not supported.

How should I map a function over a collection of arguments?
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

This is fully possible, but we leave the implementation up to you. Think of us as a
`loky <https://loky.readthedocs.io/en/stable/index.html>`_ for your
`joblib <https://joblib.readthedocs.io/en/latest/>`_, but natively async and Trionic.
We take care of the worker handling so that you can focus on the best concurrency
for your application. That said, some example parallelism patterns can be found in
the documentation_. Also, consider `aiometer <https://github.com/florimondmanca/aiometer>`_.
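
For illustration, one such pattern: fan out one task per argument under a nursery and bound concurrency with a ``trio.CapacityLimiter``. This is a minimal sketch, assuming trio and trio-parallel are installed; the names ``parallel_map`` and ``square`` and the limit of 4 are made up for the example (``run_sync`` also applies its own default limiter sized to your CPU count).

.. code-block:: python

    import trio
    import trio_parallel


    def square(n):
        # Stand-in for real CPU-bound work
        return n * n


    async def parallel_map(fn, args, max_workers=4):
        limiter = trio.CapacityLimiter(max_workers)
        results = [None] * len(args)

        async def worker(i, arg):
            async with limiter:
                results[i] = await trio_parallel.run_sync(fn, arg)

        async with trio.open_nursery() as nursery:
            for i, arg in enumerate(args):
                nursery.start_soon(worker, i, arg)
        return results


    if __name__ == "__main__":
        print(trio.run(parallel_map, square, [1, 2, 3, 4]))

Because each result is written into a preallocated slot, the output order matches the input order even though the jobs finish in whatever order the workers get to them.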

Contributing
------------

If you notice any bugs, need any help, or want to contribute any code, GitHub issues_
and pull requests are very welcome! Please read the `code of conduct`_.

.. _chat: https://gitter.im/python-trio/general
.. |chat badge| image:: https://img.shields.io/badge/chat-join%20now-blue.svg?color=royalblue&logo=Gitter
   :target: `chat`_
   :alt: Chatroom

.. _forum: https://trio.discourse.group
.. |forum badge| image:: https://img.shields.io/badge/forum-join%20now-blue.svg?color=royalblue&logo=Discourse
   :target: `forum`_
   :alt: Forum

.. _documentation: https://trio-parallel.readthedocs.io/
.. |documentation badge| image:: https://img.shields.io/readthedocs/trio-parallel?logo=readthedocs&logoColor=whitesmoke
   :target: `documentation`_
   :alt: Documentation

.. _distribution: https://pypi.org/project/trio-parallel/
.. |version badge| image:: https://img.shields.io/pypi/v/trio-parallel?logo=PyPI&logoColor=whitesmoke
   :target: `distribution`_
   :alt: Latest Pypi version

.. _pypistats: https://pypistats.org/packages/trio-parallel
.. |pypistats badge| image:: https://img.shields.io/pypi/dm/trio-parallel?logo=pypi&logoColor=whitesmoke
   :target: `pypistats`_
   :alt: Pypi monthly downloads

.. _pepy: https://pepy.tech/project/trio-parallel
.. |pepy badge| image:: https://pepy.tech/badge/trio-parallel/month
   :target: `pepy`_
   :alt: Pypi monthly downloads

.. |python versions badge| image:: https://img.shields.io/pypi/pyversions/trio-parallel.svg?logo=PyPI&logoColor=whitesmoke
   :alt: Supported Python versions
   :target: `distribution`_

.. |python interpreters badge| image:: https://img.shields.io/pypi/implementation/trio-parallel.svg?logo=PyPI&logoColor=whitesmoke
   :alt: Supported Python interpreters
   :target: `distribution`_

.. _issues: https://github.com/richardsheridan/trio-parallel/issues
.. |issues badge| image:: https://img.shields.io/github/issues-raw/richardsheridan/trio-parallel?logo=github
   :target: `issues`_
   :alt: Issues

.. _repository: https://github.com/richardsheridan/trio-parallel
.. |repository badge| image:: https://img.shields.io/github/last-commit/richardsheridan/trio-parallel?logo=github
   :target: `repository`_
   :alt: Repository

.. _tests: https://github.com/richardsheridan/trio-parallel/actions?query=branch%3Amain
.. |tests badge| image:: https://img.shields.io/github/actions/workflow/status/richardsheridan/trio-parallel/ci.yml?branch=main&logo=Github-Actions&logoColor=whitesmoke
   :target: `tests`_
   :alt: Tests

.. _coverage: https://codecov.io/gh/richardsheridan/trio-parallel
.. |coverage badge| image:: https://codecov.io/gh/richardsheridan/trio-parallel/branch/main/graph/badge.svg?token=EQqs2abxxG
   :target: `coverage`_
   :alt: Test coverage

.. _style: https://github.com/psf/black
.. |style badge| image:: https://img.shields.io/badge/code%20style-Black-black
   :target: `style`_
   :alt: Code style

.. _license: https://github.com/richardsheridan/trio-parallel/blob/main/LICENSE
.. |license badge| image:: https://img.shields.io/pypi/l/trio-parallel?color=informational
   :target: `license`_
   :alt: MIT -or- Apache License 2.0

.. _coverage.py: https://coverage.readthedocs.io/
.. _viztracer: https://viztracer.readthedocs.io/
.. _cloudpickle: https://github.com/cloudpipe/cloudpickle
.. _threads: https://trio.readthedocs.io/en/stable/reference-core.html#trio.to_thread.run_sync
.. _caveats: https://docs.python.org/3/library/multiprocessing.html#programming-guidelines
.. _Trio: https://github.com/python-trio/trio
.. _code of conduct: https://trio.readthedocs.io/en/stable/code-of-conduct.html