mpipartition 1.4.0 (PyPI)
=========================

* Summary: MPI volume decomposition and particle distribution tools
* Author: Michael Buehlmann
* Homepage: https://github.com/ArgonneCPAC/MPIPartition
* Requires-Python: >=3.9
* License: MIT
* Keywords: MPI, mpi4py, scientific computing, parallel computing
* Uploaded: 2024-11-15 22:48:13

MPIPartition
============


.. image:: https://img.shields.io/pypi/v/mpipartition.svg
   :target: https://pypi.python.org/pypi/mpipartition

.. image:: https://github.com/ArgonneCPAC/MPIPartition/actions/workflows/pypi.yml/badge.svg
   :target: https://github.com/ArgonneCPAC/MPIPartition/actions/workflows/pypi.yml

.. image:: https://github.com/ArgonneCPAC/MPIPartition/actions/workflows/sphinx.yml/badge.svg
   :target: https://github.com/ArgonneCPAC/MPIPartition/actions/workflows/sphinx.yml

A Python module for MPI volume decomposition and particle distribution.


* Free software: MIT license
* Documentation: https://argonnecpac.github.io/MPIPartition
* Repository: https://github.com/ArgonneCPAC/MPIPartition


Features
--------

* Cartesian partitioning of a cubic volume (arbitrary dimensions) among MPI ranks
* Equal-area decomposition of a spherical shell (S2) among MPI ranks
* Distribution of particle data to the ranks owning the corresponding subvolume / surface segment
* Overloading of particle data at rank boundaries ("ghost particles")
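
Conceptually, the Cartesian partition splits the MPI ranks into a near-cubic
process grid and assigns each rank one rectangular subvolume. The serial sketch
below (no MPI; ``balanced_dims`` and ``rank_to_coords`` are illustrative helper
names, not part of the mpipartition API) shows how such a grid and a rank's
subvolume origin could be derived:

```python
import itertools
import numpy as np

def balanced_dims(nranks, ndims=3):
    """Factor nranks into ndims grid dimensions that are as close to
    cubic as possible (similar in spirit to MPI_Dims_create)."""
    best = None
    for dims in itertools.product(range(1, nranks + 1), repeat=ndims):
        if np.prod(dims) != nranks:
            continue
        spread = max(dims) - min(dims)  # smaller spread = more cubic
        if best is None or spread < best[0]:
            best = (spread, tuple(sorted(dims, reverse=True)))
    return best[1]

def rank_to_coords(rank, dims):
    """Map a flat rank index to Cartesian grid coordinates (C order)."""
    return tuple(np.unravel_index(rank, dims))

dims = balanced_dims(10)                      # 10 ranks -> (5, 2, 1) grid
coords = rank_to_coords(3, dims)              # grid position of rank 3
origin = np.array(coords) / np.array(dims)    # subvolume origin in a unit box
```

The brute-force search is fine for typical rank counts; the point is only that
each rank ends up with a grid coordinate and hence a well-defined subvolume.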



Installation
------------

Installing from the PyPI repository:

.. code-block:: bash

   pip install mpipartition

Installing the development version from the Git repository:

.. code-block:: bash

   git clone https://github.com/ArgonneCPAC/mpipartition.git
   cd mpipartition
   pip install -e .


Requirements
------------

These packages will be automatically installed if they are not already present:

* Python >= 3.9
* `mpi4py <https://mpi4py.readthedocs.io/en/stable/>`_: MPI for Python
* `numpy <https://numpy.org/>`_: Python array library
* `numba <https://numba.pydata.org/>`_: Python JIT compiler

Basic Usage
-----------
Check the `documentation <https://argonnecpac.github.io/MPIPartition>`_ for
an in-depth explanation.

.. code-block:: python

   # this code goes into mpipartition_example.py

   from mpipartition import Partition, distribute, overload
   import numpy as np

   # create a partition of the unit cube with available MPI ranks
   box_size = 1.
   partition = Partition()

   if partition.rank == 0:
       print(f"Number of ranks: {partition.nranks}")
       print(f"Volume decomposition: {partition.decomposition}")

   # create random data
   nparticles_local = 1000
   data = {
       "x": np.random.uniform(0, 1, nparticles_local),
       "y": np.random.uniform(0, 1, nparticles_local),
       "z": np.random.uniform(0, 1, nparticles_local)
   }

   # distribute data to ranks assigned to corresponding subvolume
   data = distribute(partition, box_size, data, ('x', 'y', 'z'))

   # overload "edge" of each subvolume by 0.05
   data = overload(partition, box_size, data, 0.05, ('x', 'y', 'z'))

This code can then be executed with ``mpirun``:

.. code-block:: bash

   mpirun -n 10 python mpipartition_example.py
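
The ``overload`` step replicates particles that lie within the overload length
of a subvolume face so that neighboring ranks receive them as ghosts, with
positions wrapped periodically at the box boundary. A serial 1D sketch of that
idea (illustrative only, not the library's implementation; ``ghost_copies_1d``
is a hypothetical helper):

```python
import numpy as np

def ghost_copies_1d(x, lo, hi, ol, box_size):
    """Return ghost copies of positions x owned by the subvolume [lo, hi):
    points within `ol` of either face are replicated for the neighbor,
    shifted across the periodic boundary when the face is a box edge."""
    left = x[x < lo + ol]         # near the lower face
    if lo == 0.0:                 # lower neighbor is across the box boundary
        left = left + box_size
    right = x[x >= hi - ol]       # near the upper face
    if hi == box_size:            # upper neighbor is across the box boundary
        right = right - box_size
    return np.concatenate([left, right])

# subvolume [0, 0.5) in a unit box, overload length 0.05
ghosts = ghost_copies_1d(np.array([0.02, 0.2, 0.48]),
                         lo=0.0, hi=0.5, ol=0.05, box_size=1.0)
# 0.02 is wrapped to 1.02 for the rank at the top of the box;
# 0.48 is handed to the neighbor above unchanged; 0.2 is interior.
```

In the library the same test is applied per dimension and the copies are
exchanged over MPI; this sketch only shows the selection and wrapping rule.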

--------

A more applied example, using halo catalogs from a
`HACC <https://cpac.hep.anl.gov/projects/hacc/>`_ cosmological simulation (in
the `GenericIO <https://git.cels.anl.gov/hacc/genericio>`_ data format):

.. code-block:: python

   from mpipartition import Partition, distribute, overload
   import numpy as np
   import pygio

   # create a partition with available MPI ranks
   box_size = 64.  # box size in Mpc/h
   partition = Partition(3)  # by default, the dimension is 3

   # read GenericIO data in parallel
   data = pygio.read_genericio("m000p-499.haloproperties")

   # distribute
   data = distribute(partition, box_size, data, [f"fof_halo_center_{x}" for x in "xyz"])

   # mark "owned" data with rank (allows differentiating owned and overloaded data)
   data["status"] = partition.rank * np.ones(len(data["fof_halo_center_x"]), dtype=np.uint16)

   # overload by 4 Mpc/h
   data = overload(partition, box_size, data, 4., [f"fof_halo_center_{x}" for x in "xyz"])

   # now we can do analysis such as 2pt correlation functions (up to 4 Mpc/h)
   # or neighbor finding, etc.
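
With the ``status`` column, each rank can run a purely local neighbor search:
iterate over *owned* halos only, but search against owned plus ghost halos, so
pairs that straddle rank boundaries are still found (out to the overload
length). A brute-force sketch of that pattern (illustrative; a KD-tree would
be used in practice, and ``owned_mask`` would be ``data["status"] ==
partition.rank``):

```python
import numpy as np

def count_neighbors(pos, owned_mask, radius):
    """Count neighbors within `radius` for each owned point, searching
    against all local points (owned + ghosts)."""
    counts = np.zeros(int(owned_mask.sum()), dtype=np.int64)
    for i, p in enumerate(pos[owned_mask]):
        d2 = np.sum((pos - p) ** 2, axis=1)
        counts[i] = np.count_nonzero(d2 <= radius**2) - 1  # exclude self
    return counts

# two owned points and one distant ghost; only the owned pair is close
pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [10.0, 0.0, 0.0]])
owned = np.array([True, True, False])
counts = count_neighbors(pos, owned, radius=2.0)
```

Because ghosts are searched but never iterated over, no boundary pair is
double-counted across ranks, and no global communication is needed during the
search itself.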

            
