| Field | Value |
| --- | --- |
| Name | PyMPDATA-MPI |
| Version | 0.0.9 |
| Summary | PyMPDATA + numba-mpi coupler sandbox |
| Upload time | 2024-04-08 23:38:22 |
| Home page | None |
| Author | None |
| Maintainer | None |
| Requires Python | >=3.10 |
| License | GPL-3.0 |
| Keywords | mpi, mpdata, numba, pympdata |

# PyMPDATA-MPI
[![Python 3](https://img.shields.io/static/v1?label=Python&logo=Python&color=3776AB&message=3)](https://www.python.org/)
[![LLVM](https://img.shields.io/static/v1?label=LLVM&logo=LLVM&color=gold&message=Numba)](https://numba.pydata.org)
[![Linux OK](https://img.shields.io/static/v1?label=Linux&logo=Linux&color=yellow&message=%E2%9C%93)](https://en.wikipedia.org/wiki/Linux)
[![macOS OK](https://img.shields.io/static/v1?label=macOS&logo=Apple&color=silver&message=%E2%9C%93)](https://en.wikipedia.org/wiki/macOS)
[![Maintenance](https://img.shields.io/badge/Maintained%3F-yes-green.svg)](https://GitHub.com/open-atmos/PyMPDATA-MPI/graphs/commit-activity)
[![PL Funding](https://img.shields.io/static/v1?label=PL%20Funding%20by&color=d21132&message=NCN&logoWidth=25&logo=image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAANCAYAAACpUE5eAAAABmJLR0QA/wD/AP+gvaeTAAAAKUlEQVQ4jWP8////fwYqAiZqGjZqIHUAy4dJS6lqIOMdEZvRZDPcDQQAb3cIaY1Sbi4AAAAASUVORK5CYII=)](https://www.ncn.gov.pl/?language=en)
[![License: GPL v3](https://img.shields.io/badge/License-GPL%20v3-blue.svg)](https://www.gnu.org/licenses/gpl-3.0.html)
[![Copyright](https://img.shields.io/static/v1?label=Copyright&color=249fe2&message=Jagiellonian%20University&)](https://en.uj.edu.pl/)
[![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.10866521.svg)](https://doi.org/10.5281/zenodo.10866521)
[![GitHub issues](https://img.shields.io/github/issues-pr/open-atmos/PyMPDATA-MPI.svg?logo=github&logoColor=white)](https://github.com/open-atmos/PyMPDATA-MPI/pulls?q=)
[![GitHub issues](https://img.shields.io/github/issues-pr-closed/open-atmos/PyMPDATA-MPI.svg?logo=github&logoColor=white)](https://github.com/open-atmos/PyMPDATA-MPI/pulls?q=is:closed)
[![GitHub issues](https://img.shields.io/github/issues/open-atmos/PyMPDATA-MPI.svg?logo=github&logoColor=white)](https://github.com/open-atmos/PyMPDATA-MPI/issues?q=)
[![GitHub issues](https://img.shields.io/github/issues-closed/open-atmos/PyMPDATA-MPI.svg?logo=github&logoColor=white)](https://github.com/open-atmos/PyMPDATA-MPI/issues?q=is:closed)
[![Github Actions Build Status](https://github.com/open-atmos/PyMPDATA-MPI/workflows/tests+pypi/badge.svg?branch=main)](https://github.com/open-atmos/PyMPDATA-MPI/actions)
[![PyPI version](https://badge.fury.io/py/PyMPDATA-MPI.svg)](https://pypi.org/project/PyMPDATA-MPI)
[![API docs](https://img.shields.io/badge/API_docs-pdoc3-blue.svg)](https://open-atmos.github.io/PyMPDATA-MPI/)
[![Coverage Status](https://codecov.io/gh/open-atmos/PyMPDATA-MPI/branch/main/graph/badge.svg)](https://app.codecov.io/gh/open-atmos/PyMPDATA-MPI)
PyMPDATA-MPI is a [PyMPDATA](https://github.com/open-atmos/PyMPDATA) +
[numba-mpi](https://github.com/numba-mpi/numba-mpi) coupler enabling numerical solutions
of transport equations with the MPDATA numerical scheme in a
hybrid parallelisation model combining multi-threading with MPI distributed-memory communication.
PyMPDATA-MPI follows the PyMPDATA API and adds domain-decomposition logic on top of it.
## Hello world examples
In a minimal setup, PyMPDATA-MPI can be used to solve the following transport equation:
$$\partial_t (G \psi) + \nabla \cdot (Gu \psi)= 0$$
in an environment with multiple nodes.
Every node (process) is responsible for computing its part of the decomposed domain.
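The snippet below is a minimal, untested sketch of how such a setup could look with a 2D Cartesian grid, periodic boundaries and $G=1$. The import paths, the `MPIPeriodic(size=...)` argument and the manual slab decomposition are assumptions based on the API docs; the scenario classes shipped with the package are the authoritative, tested examples.

```python
# minimal sketch (not a tested example): each MPI rank advects its slab of a 2D field;
# the MPIPeriodic import path and constructor signature are assumptions
import numpy as np
import numba_mpi as mpi
from PyMPDATA import Options, ScalarField, Solver, Stepper, VectorField
from PyMPDATA.boundary_conditions import Periodic
from PyMPDATA_MPI.mpi_periodic import MPIPeriodic

rank, size = mpi.rank(), mpi.size()
nx, ny = 24, 24                # global grid
nx_local = nx // size          # assumed even slab decomposition along the outer dimension

options = Options(n_iters=2)
halo = options.n_halo
bcs = (MPIPeriodic(size=size), Periodic())  # MPI-decomposed outer dim, threaded inner dim

advectee = ScalarField(
    data=np.zeros((nx_local, ny)), halo=halo, boundary_conditions=bcs
)
advector = VectorField(
    data=(np.full((nx_local + 1, ny), 0.5), np.full((nx_local, ny + 1), 0.25)),
    halo=halo,
    boundary_conditions=bcs,
)
stepper = Stepper(options=options, grid=(nx_local, ny))
solver = Solver(stepper=stepper, advectee=advectee, advector=advector)
solver.advance(100)            # run with, e.g., `mpiexec -n 2 python hello.py`
```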
### Spherical scenario (2D)
In spherical geometry, the $G$ factor represents the Jacobian of the coordinate transformation.
In this example (based on a test case from [Williamson & Rasch 1989](https://doi.org/10.1175/1520-0493(1989)117<0102:TDSLTW>2.0.CO;2)),
domain decomposition is done by cutting the sphere along meridians.
The inner dimension uses the [`MPIPolar`](https://open-atmos.github.io/PyMPDATA-MPI/mpi_polar.html)
boundary condition class, while the outer dimension uses
[`MPIPeriodic`](https://open-atmos.github.io/PyMPDATA-MPI/mpi_periodic.html).
Note that the spherical animations below depict simulations without MPDATA corrective iterations,
i.e. only the plain first-order upwind scheme is used.
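The per-dimension pairing of the two boundary-condition classes could look roughly as follows; this is a sketch only, and the `MPIPolar` constructor arguments shown are assumptions (the bundled spherical scenario is the authoritative setup):

```python
# sketch: pairing boundary conditions per dimension in the spherical scenario
# (MPIPolar arguments are assumptions based on the API docs)
import numba_mpi as mpi
from PyMPDATA_MPI.mpi_periodic import MPIPeriodic
from PyMPDATA_MPI.mpi_polar import MPIPolar

nlon, nlat = 64, 32                  # global longitude x latitude grid
nlon_local = nlon // mpi.size()      # decomposition cuts the sphere along meridians

boundary_conditions = (
    MPIPeriodic(size=mpi.size()),                               # outer dim: longitude
    MPIPolar(mpi_grid=(nlon_local, nlat), grid=(nlon, nlat)),   # inner dim: latitude (poles)
)
```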
### 1 worker (n_threads = 1)
<p align="middle">
<img src="https://github.com/open-atmos/PyMPDATA-MPI/releases/download/latest-generated-plots/n_iters.1_rank_0_size_1_c_field_.0.5.0.25._mpi_dim_0_n_threads_1-SphericalScenario-anim.gif" width="49%" />
</p>
### 2 workers (MPI_DIM = 0, n_threads = 1)
<p align="middle">
<img src="https://github.com/open-atmos/PyMPDATA-MPI/releases/download/latest-generated-plots/n_iters.1_rank_1_size_2_c_field_.0.5.0.25._mpi_dim_0_n_threads_1-SphericalScenario-anim.gif" width="49%" />
<img src="https://github.com/open-atmos/PyMPDATA-MPI/releases/download/latest-generated-plots/n_iters.1_rank_0_size_2_c_field_.0.5.0.25._mpi_dim_0_n_threads_1-SphericalScenario-anim.gif" width="49%" />
</p>
### Cartesian scenario (2D)
In the Cartesian example below (based on a test case from [Arabas et al. 2014](https://doi.org/10.3233/SPR-140379)),
a constant advector field $u$ is used (and $G=1$).
MPI (Message Passing Interface) is used
for handling data transfers and synchronisation, with the domain decomposition
across MPI workers done in either the inner or the outer dimension (a user setting).
Multi-threading (using, e.g., OpenMP via Numba) is used for shared-memory parallelisation
within subdomains (indicated by dotted lines in the animations below), with the threading
split done across the inner dimension (internal PyMPDATA logic).
In this example, two corrective MPDATA iterations are employed.
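In terms of PyMPDATA settings, this corresponds to requesting three MPDATA passes (one upwind plus two corrective) and three threads per MPI rank, along the lines of the sketch below (assuming PyMPDATA's `Options` and `Stepper` keyword arguments):

```python
# sketch: 1 upwind + 2 corrective MPDATA passes, 3 threads per MPI rank
from PyMPDATA import Options, Stepper

options = Options(n_iters=3)   # n_iters=1 would correspond to plain upwind
stepper = Stepper(options=options, grid=(16, 16), n_threads=3)  # grid = local (per-rank) size
```

Which dimension the MPI decomposition acts on (the `MPI_DIM = 0` vs `MPI_DIM = -1` captions below) is a choice made when setting up the scenario, while the threading split always happens along the inner dimension.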
### 1 worker (n_threads=3)
<p align="middle">
<img src="https://github.com/open-atmos/PyMPDATA-MPI/releases/download/latest-generated-plots/n_iters.3_rank_0_size_1_c_field_.0.5.0.25._mpi_dim_0_n_threads_3-CartesianScenario-anim.gif" width="49%" />
</p>
### 2 workers (MPI_DIM = 0, n_threads = 3)
<p align="middle">
<img src="https://github.com/open-atmos/PyMPDATA-MPI/releases/download/latest-generated-plots/n_iters.3_rank_0_size_2_c_field_.0.5.0.25._mpi_dim_0_n_threads_3-CartesianScenario-anim.gif" width="49%" />
<img src="https://github.com/open-atmos/PyMPDATA-MPI/releases/download/latest-generated-plots/n_iters.3_rank_1_size_2_c_field_.0.5.0.25._mpi_dim_0_n_threads_3-CartesianScenario-anim.gif" width="49%" />
</p>
### 2 workers (MPI_DIM = -1, n_threads = 3)
<p align="middle">
<img src="https://github.com/open-atmos/PyMPDATA-MPI/releases/download/latest-generated-plots/n_iters.3_rank_0_size_2_c_field_.0.5.0.25._mpi_dim_-1_n_threads_3-CartesianScenario-anim.gif" width="49%" />
<img src="https://github.com/open-atmos/PyMPDATA-MPI/releases/download/latest-generated-plots/n_iters.3_rank_1_size_2_c_field_.0.5.0.25._mpi_dim_-1_n_threads_3-CartesianScenario-anim.gif" width="49%" />
</p>
### 3 workers (MPI_DIM = 0, n_threads = 3)
<p align="middle">
<img src="https://github.com/open-atmos/PyMPDATA-MPI/releases/download/latest-generated-plots/n_iters.3_rank_0_size_3_c_field_.0.5.0.25._mpi_dim_0_n_threads_3-CartesianScenario-anim.gif" width="32%" />
<img src="https://github.com/open-atmos/PyMPDATA-MPI/releases/download/latest-generated-plots/n_iters.3_rank_1_size_3_c_field_.0.5.0.25._mpi_dim_0_n_threads_3-CartesianScenario-anim.gif" width="32%" />
<img src="https://github.com/open-atmos/PyMPDATA-MPI/releases/download/latest-generated-plots/n_iters.3_rank_2_size_3_c_field_.0.5.0.25._mpi_dim_0_n_threads_3-CartesianScenario-anim.gif" width="32%" />
</p>
### 3 workers (MPI_DIM = -1, n_threads = 3)
<p align="middle">
<img src="https://github.com/open-atmos/PyMPDATA-MPI/releases/download/latest-generated-plots/n_iters.3_rank_0_size_3_c_field_.0.5.0.25._mpi_dim_-1_n_threads_3-CartesianScenario-anim.gif" width="32%" />
<img src="https://github.com/open-atmos/PyMPDATA-MPI/releases/download/latest-generated-plots/n_iters.3_rank_1_size_3_c_field_.0.5.0.25._mpi_dim_-1_n_threads_3-CartesianScenario-anim.gif" width="32%" />
<img src="https://github.com/open-atmos/PyMPDATA-MPI/releases/download/latest-generated-plots/n_iters.3_rank_2_size_3_c_field_.0.5.0.25._mpi_dim_-1_n_threads_3-CartesianScenario-anim.gif" width="32%" />
</p>
## Package architecture
```mermaid
flowchart BT
H5PY ---> HDF{{HDF5}}
subgraph pythonic-dependencies [Python]
TESTS --> H[pytest-mpi]
subgraph PyMPDATA-MPI ["PyMPDATA-MPI"]
TESTS["PyMPDATA-MPI[tests]"] --> CASES(simulation scenarios)
A1["PyMPDATA-MPI[examples]"] --> CASES
CASES --> D[PyMPDATA-MPI]
end
A1 ---> C[py-modelrunner]
CASES ---> H5PY[h5py]
D --> E[numba-mpi]
H --> X[pytest]
E --> N
F --> N[Numba]
D --> F[PyMPDATA]
end
H ---> MPI
C ---> slurm{{slurm}}
N --> OMPI{{OpenMP}}
N --> L{{LLVM}}
E ---> MPI{{MPI}}
HDF --> MPI
slurm --> MPI
style D fill:#7ae7ff,stroke-width:2px,color:#2B2B2B
click H "https://pypi.org/p/pytest-mpi"
click X "https://pypi.org/p/pytest"
click F "https://pypi.org/p/PyMPDATA"
click N "https://pypi.org/p/numba"
click C "https://pypi.org/p/py-modelrunner"
click H5PY "https://pypi.org/p/h5py"
click E "https://pypi.org/p/numba-mpi"
click A1 "https://pypi.org/p/PyMPDATA-MPI"
click D "https://pypi.org/p/PyMPDATA-MPI"
click TESTS "https://pypi.org/p/PyMPDATA-MPI"
```
Rectangular boxes indicate pip-installable Python packages (click to go to pypi.org package site).
## Credits:
Development of PyMPDATA-MPI has been supported by [Poland's National Science Centre](https://www.ncn.gov.pl/?language=en)
(grant no. 2020/39/D/ST10/01220).
We acknowledge Poland's high-performance computing infrastructure [PLGrid](https://plgrid.pl/) (HPC Centers: [ACK Cyfronet AGH](https://www.cyfronet.pl/en/))
for providing computer facilities and support within computational grant no. PLG/2023/016369.
copyright: [Jagiellonian University](https://en.uj.edu.pl/en) & [AGH University of Krakow](https://agh.edu.pl/en)
licence: [GPL v3](https://www.gnu.org/licenses/gpl-3.0.html)
## Design goals
- MPI support for [PyMPDATA](https://pypi.org/project/PyMPDATA/) implemented externally (i.e., not incurring any overhead or additional dependencies for PyMPDATA users)
- MPI calls within [Numba njitted code](https://numba.pydata.org/numba-doc/dev/reference/jit-compilation.html) (hence not using [`mpi4py`](https://mpi4py.readthedocs.io/), but rather [`numba-mpi`](https://pypi.org/p/numba-mpi/); see the sketch after this list)
- hybrid domain-decomposition parallelism: threading (internal in PyMPDATA, in the inner dimension) + MPI (either inner or outer dimension)
- example simulation scenarios featuring HDF5/MPI-IO output storage (using [h5py](https://www.h5py.org/))
- [py-modelrunner](https://github.com/zwicker-group/py-modelrunner) simulation orchestration
- portability across Linux & macOS (no Windows support as of now due to [challenges in getting HDF5/MPI-IO to work there](https://docs.h5py.org/en/stable/build.html#source-installation-on-windows))
- Continuous Integration (CI) with different OSes and different MPI implementations (leveraging mpi4py's [setup-mpi GitHub Action](https://github.com/mpi4py/setup-mpi/))
- full test coverage, including CI builds asserting that multi-node and single-node computations give the same results (with the help of [pytest-mpi](https://pypi.org/p/pytest-mpi/))
- ships as a [pip-installable package](https://pypi.org/project/PyMPDATA-MPI), intended as a dependency of domain-specific packages
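As an illustration of the numba-mpi point above, MPI primitives can be called directly from within `@numba.njit`-compiled code. The following is a minimal standalone sketch, not part of PyMPDATA-MPI; it assumes that numba-mpi's `allreduce(send, recv)` defaults to a sum reduction:

```python
# standalone sketch: calling MPI from inside Numba-compiled code via numba-mpi
import numba
import numpy as np
import numba_mpi as mpi


@numba.njit
def total_across_ranks(value):
    send = np.full(1, value)
    recv = np.empty(1, dtype=np.float64)
    mpi.allreduce(send, recv)  # assumed to default to a sum over all ranks
    return recv[0]


if __name__ == "__main__":
    # run with, e.g.: mpiexec -n 4 python example.py
    print(mpi.rank(), total_across_ranks(float(mpi.rank())))
```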
## Related resources
### open-source Large-Eddy-Simulation and related software
#### Julia
- https://github.com/CliMA/ClimateMachine.jl/
#### C++
- https://github.com/microhh/microhh
- https://github.com/igfuw/UWLCM
#### C/CUDA
- https://github.com/NCAR/FastEddy-model
#### FORTRAN
- https://github.com/dalesteam/dales
- https://github.com/uclales/uclales
- https://github.com/UCLALES-SALSA/UCLALES-SALSA
- https://github.com/igfuw/bE_SDs
- https://github.com/pencil-code/pencil-code
- https://github.com/AtmosFOAM/AtmosFOAM
- https://github.com/scale-met/scale
#### Python (incl. Cython)
- https://github.com/CliMA/pycles
- https://github.com/pnnl/pinacles