itwinai


- Name: itwinai
- Version: 0.2.1
- Summary: AI and ML workflows module for scientific digital twins.
- Author/maintainer email: Matteo Bunino <matteo.bunino@cern.ch>, Rakesh Sarma <r.sarma@fz-juelich.de>, Mario Ruettgers <m.ruettgers@fz-juelich.de>, Kalliopi Tsolaki <kalliopi.tsolaki@cern.ch>
- Project URLs: [Documentation](https://itwinai.readthedocs.io/), [Homepage](https://www.intertwin.eu/), [Repository](https://github.com/interTwin-eu/itwinai)
- Upload time: 2024-06-12 07:31:17
- Requires Python: >=3.10
- Keywords: ml, ai, hpc
- License: MIT License. Copyright (c) 2023 interTwin Community. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
# itwinai

[![GitHub Super-Linter](https://github.com/interTwin-eu/T6.5-AI-and-ML/actions/workflows/lint.yml/badge.svg)](https://github.com/marketplace/actions/super-linter)
[![Markdown Link Check](https://github.com/interTwin-eu/T6.5-AI-and-ML/actions/workflows/check-links.yml/badge.svg)](https://github.com/marketplace/actions/markdown-link-check)
[![SQAaaS source code](https://github.com/EOSC-synergy/itwinai.assess.sqaaas/raw/dev/.badge/status_shields.svg)](https://sqaaas.eosc-synergy.eu/#/full-assessment/report/https://raw.githubusercontent.com/eosc-synergy/itwinai.assess.sqaaas/dev/.report/assessment_output.json)

See the latest version of our [docs](https://itwinai.readthedocs.io/)
for a quick overview of this platform for advanced AI/ML workflows in digital twin applications.

## Installation

Requirements:

- Linux environment. Windows and macOS have not been tested.

### Python virtual environment

Depending on your environment, there are different ways to
select a specific Python version.

#### Laptop or GPU node

If you are working on a laptop
or on a simple on-prem setup, consider using
[pyenv](https://github.com/pyenv/pyenv). See the
[installation instructions](https://github.com/pyenv/pyenv?tab=readme-ov-file#installation). If you are using pyenv,
make sure to read the [suggested build environment](https://github.com/pyenv/pyenv/wiki#suggested-build-environment) notes.
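
For example, once pyenv is installed, you can install and select a compatible Python version (the patch version below is only an example):

```bash
# Install a Python version compatible with itwinai (requires Python >= 3.10)
pyenv install 3.10.13

# Select it for the current project directory (writes a .python-version file)
pyenv local 3.10.13

# Verify which interpreter is now picked up
python --version
```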

#### HPC environment

On HPC systems, dependencies are more commonly loaded with
Environment Modules or Lmod. Contact the system administrator
to learn how to select the proper Python modules.

On JSC, we activate the required modules in this way:

```bash
ml --force purge
ml Stages/2024 GCC OpenMPI CUDA/12 cuDNN MPI-settings/CUDA
ml Python CMake HDF5 PnetCDF libaio mpi4py
```
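
As an optional sanity check (assuming Lmod is available on your system), you can verify that the expected toolchain is visible after loading the modules:

```bash
# List the currently loaded modules (equivalent to `module list`)
ml

# Check that the Python and CUDA/MPI toolchains are picked up
python --version
nvcc --version
which mpirun
```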

### Install itwinai

Install itwinai and its dependencies with the
following commands, and follow the instructions:

```bash
# Create a python virtual environment and activate it
$ python -m venv ENV_NAME
$ source ENV_NAME/bin/activate

# Install itwinai inside the environment
(ENV_NAME) $ export ML_FRAMEWORK="pytorch" # or "tensorflow"
(ENV_NAME) $ curl -fsSL https://github.com/interTwin-eu/itwinai/raw/main/env-files/itwinai-installer.sh | bash
```

The `ML_FRAMEWORK` environment variable controls whether you are installing
itwinai for PyTorch or TensorFlow.

> [!WARNING]  
> itwinai depends on Horovod, which requires `CMake>=3.13` and
> [other packages](https://horovod.readthedocs.io/en/latest/install_include.html#requirements).
> Make sure to have them installed in your environment before proceeding.
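
As optional sanity checks (not part of the official installer), you can confirm the main prerequisites and the resulting installation:

```bash
# Check that the Horovod build prerequisites are available
cmake --version
mpirun --version

# After the installer finishes, confirm that itwinai is installed in the environment
pip show itwinai
```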

## Installation for developers

If you are contributing to this repository, please continue below for
more advanced instructions.

> [!WARNING]
> Branch protection rules are applied to all branches whose names
> match this regex: `[dm][ea][vi]*`. When creating new branches,
> please avoid names that match that regex; otherwise branch
> protection rules will block direct pushes to that branch.
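
If in doubt, you can check a candidate branch name against that regex before creating it (a small sketch using grep; the branch name is only an example):

```bash
# Prints "protected" if the candidate name matches the branch protection regex
branch="feature/my-change"
echo "$branch" | grep -qE '[dm][ea][vi]*' && echo "protected" || echo "ok"
```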

### Install itwinai environment

Regardless of how you loaded your environment, you can create the
Python virtual environments with the following commands.
Once the correct Python version is loaded, create the virtual
environments using the provided Makefile:

```bash
make torch-env # or make torch-env-cpu
make tensorflow-env # or make tensorflow-env-cpu

# Juelich supercomputer
make torch-gpu-jsc
make tf-gpu-jsc
```

#### TensorFlow

Installation:

```bash
# Install TensorFlow 2.13
make tensorflow-env

# Activate env
source .venv-tf/bin/activate
```

A CPU-only version is available at the target `tensorflow-env-cpu`.
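
To quickly verify the environment (an optional check; GPU detection only succeeds on nodes with CUDA-capable GPUs):

```bash
source .venv-tf/bin/activate
python -c "import tensorflow as tf; print(tf.__version__); print(tf.config.list_physical_devices('GPU'))"
```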

#### PyTorch (+ Lightning)

Installation:

```bash
# Install PyTorch + lightning
make torch-env

# Activate env
source .venv-pytorch/bin/activate
```

A CPU-only version is available at the target `torch-env-cpu`.
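
Similarly, to verify the PyTorch environment (an optional check; CUDA availability is expected to be `False` for the CPU-only target):

```bash
source .venv-pytorch/bin/activate
python -c "import torch; print(torch.__version__); print('CUDA available:', torch.cuda.is_available())"
```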

### Development environment

This is for developers only. To set it up, install the `itwinai` package
in editable mode with the `dev` extra:

```bash
pip install -e .[dev]
```

#### Test with `pytest`

Do this only if you are a developer wanting to test your code with pytest.

First, you need to create virtual environments for both torch and tensorflow.
For instance, you can use:

```bash
make torch-env-cpu
make tensorflow-env-cpu
```

To select the names of the torch and tf environments, you can set the following
environment variables. They allow you to run the tests in environments with
custom names, different from the default `.venv-pytorch` and `.venv-tf`.

```bash
export TORCH_ENV="my_torch_env"
export TF_ENV="my_tf_env"
```

Functional tests (marked with `pytest.mark.functional`) are executed under
`/tmp/pytest` to guarantee they run in a clean environment.

To run only the functional tests, use:

```bash
pytest -v tests/ -m "functional"
```
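
Conversely, to exclude the functional tests and run only the rest of the suite, negate the marker expression (standard pytest marker syntax):

```bash
pytest -v tests/ -m "not functional"
```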

To run all tests of the itwinai package:

```bash
make test
```

To run the tests in the JSC virtual environments:

```bash
make test-jsc
```

### Micromamba installation (deprecated)

To manage Conda environments we use Micromamba, a lightweight version of Conda.

We suggest following the
[manual installation guide](https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html#manual-installation).

Note that Micromamba can consume a lot of disk space when building environments, because downloaded
packages are cached on the local filesystem. To clear the cache, run `micromamba clean -a`.
By default, Micromamba data are kept under `$HOME`. However, on some systems `$HOME` has limited
storage space, so it is better to install Micromamba in another location with more space
by changing the `$MAMBA_ROOT_PREFIX` variable. See a complete installation example for Linux below, where the
default `$MAMBA_ROOT_PREFIX` is overridden:

```bash
cd $HOME

# Download micromamba (This command is for Linux Intel (x86_64) systems. Find the right one for your system!)
curl -Ls https://micro.mamba.pm/api/micromamba/linux-64/latest | tar -xvj bin/micromamba

# Install micromamba in a custom directory
MAMBA_ROOT_PREFIX='my-mamba-root'
./bin/micromamba shell init $MAMBA_ROOT_PREFIX

# To invoke micromamba from the Makefile, add it explicitly to $PATH
echo 'PATH="$(dirname $MAMBA_EXE):$PATH"' >> ~/.bashrc
```
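
For completeness, a minimal usage sketch after reloading the shell configuration (the environment name below is only an example):

```bash
# Reload the shell configuration so micromamba is on $PATH
source ~/.bashrc

# Create and activate an example environment
micromamba create -n example-env python=3.10 -c conda-forge
micromamba activate example-env

# Remove cached packages to save space
micromamba clean -a
```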

**Reference**: [Micromamba installation guide](https://mamba.readthedocs.io/en/latest/installation/micromamba-installation.html).

            
