| Name | scimba |
| Version | 0.5 |
| home_page | None |
| Summary | This library implements some common tools for scientific machine learning |
| upload_time | 2024-10-25 09:58:43 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.9 |
| license | MIT |
| keywords | |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# ScimBa
[](https://gitlab.inria.fr/scimba/scimba/-/commits/main)
[](https://sciml.gitlabpages.inria.fr/scimba/coverage)
[](https://gitlab.inria.fr/scimba/scimba/-/releases)
[](https://scimba.gitlabpages.inria.fr/scimba/)
This library implements various SciML methods for various PDE problems, as well as tools to build hybrid numerical methods.
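To give a flavor of the physics-informed methods the library targets, here is a minimal hand-rolled sketch (plain NumPy, hypothetical names, not ScimBa code) of the collocation loss a PINN minimizes, for the 1D Poisson problem u''(x) = -sin(x) on (0, π) with u(0) = u(π) = 0, whose exact solution is u(x) = sin(x). A real PINN evaluates the residual with automatic differentiation on a neural network; here a central finite difference stands in for the second derivative:

```python
import numpy as np

def pinn_loss(u, x, h=1e-4):
    """Collocation loss for u''(x) = -sin(x) on (0, pi), u(0) = u(pi) = 0.

    The second derivative is approximated by central finite differences;
    a real PINN would use automatic differentiation instead.
    """
    u_xx = (u(x + h) - 2 * u(x) + u(x - h)) / h**2
    residual = u_xx + np.sin(x)            # PDE residual at collocation points
    boundary = u(0.0)**2 + u(np.pi)**2     # penalize boundary mismatch
    return np.mean(residual**2) + boundary

x = np.linspace(0.1, np.pi - 0.1, 50)          # interior collocation points
print(pinn_loss(np.sin, x))                    # exact solution: loss ~ 0
print(pinn_loss(lambda t: t * (np.pi - t), x)) # wrong candidate: loss of order 1
```

Training then amounts to minimizing this loss over the parameters of a network `u`; the library's trainers automate that loop for each PDE family.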
## Current Content
- **Nets**: MLP networks, discontinuous MLPs, RBF networks, several activation functions, and a basic trainer.
- **Sampling and domains**: general uniform sampling methods for PINNs and neural operators, plus sampling based on an approximated signed distance function for general geometries.
- **PDEs**: the library implements different types of models: ODEs, spatial PDEs, time-space PDEs, stationary kinetic PDEs, and kinetic PDEs.
- **Specific networks for PINNs**: for all equations we implement PINN networks based on MLPs, discontinuous MLPs, and nonlinear radial basis functions.
We also implement the Fourier network with general features (Fourier but also Gaussian, etc.).
- **Trainer**: a specific trainer for each type of PDE.
- **Generative Nets**: normalizing flows and Gaussian mixtures, in both classical and conditional variants, with a trainer based on the maximum likelihood principle.
- **Neural Operator**: a continuous auto-encoder with a PointNet encoder and a coordinate-based decoder, and physics-informed DeepONets for ODEs, spatial PDEs, and time-space PDEs.
- **Neural Galerkin**: the Neural Galerkin method for time-dependent PDEs, based on the same networks as PINNs.
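The signed-distance-based sampling mentioned above can be sketched as follows (a generic NumPy illustration under the usual SDF convention, negative inside the domain; `sample_in_domain` and `disk_sdf` are hypothetical names, not ScimBa's API):

```python
import numpy as np

def sample_in_domain(sdf, bbox, n, seed=None):
    """Rejection-sample n uniform points where sdf(x) < 0.

    sdf  : vectorized signed distance function, negative inside the domain
    bbox : ((xmin, xmax), (ymin, ymax)) bounding box enclosing the domain
    """
    rng = np.random.default_rng(seed)
    (xmin, xmax), (ymin, ymax) = bbox
    points = np.empty((0, 2))
    while len(points) < n:
        # draw candidates uniformly in the bounding box, keep those inside
        cand = rng.uniform([xmin, ymin], [xmax, ymax], size=(2 * n, 2))
        points = np.vstack([points, cand[sdf(cand) < 0.0]])
    return points[:n]

# Example: the unit disk, whose exact SDF is |x| - 1.
disk_sdf = lambda p: np.linalg.norm(p, axis=-1) - 1.0
pts = sample_in_domain(disk_sdf, ((-1, 1), (-1, 1)), 1000, seed=0)
```

Because rejection sampling only needs the sign of the SDF, an approximated (e.g. learned) signed distance function is enough to generate collocation points in general geometries.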
## Ongoing work for 2024
- **Nets**: new activation functions used for implicit representations, symbolic models, SINDy.
- **Sampling and domains**: learning signed distance functions using PINNs, adaptive sampling.
- **Specific networks for PINNs**: multiscale architectures, spectral architectures for kinetic equations, other specific architectures.
- **Trainer**: trainers with sparsity constraints and globalization methods; loss balancing.
- **Generative Nets**: energy models, score matching, more complex normalizing flows, continuous VAEs.
- **Neural Operator**: physics-informed DeepGreen operator, FNO, GINO based on FNO, neural operators with neural implicit representations; the kinetic case.
- **Neural Galerkin**: adaptive sampling, randomization, least-squares solvers, implicit schemes; CROM space-time reduced Galerkin models; greedy bases.
## References
### PINNs and MLP
- https://www.sciencedirect.com/science/article/abs/pii/S0021999118307125
- https://arxiv.org/abs/2209.03984
- https://arxiv.org/abs/1912.00873
- https://arxiv.org/abs/2109.01050
- https://arxiv.org/abs/2203.01360
- https://arxiv.org/abs/2103.09959
- https://openreview.net/forum?id=vsMyHUq_C1c
### Neural Galerkin
- https://arxiv.org/abs/2306.15630
- https://arxiv.org/abs/2306.03749
- https://arxiv.org/abs/2207.13828
- https://arxiv.org/abs/2201.07953
- https://arxiv.org/abs/2104.13515
### DeepOnet
- https://www.nature.com/articles/s42256-021-00302-5
- https://www.science.org/doi/10.1126/sciadv.abi8605
- https://arxiv.org/abs/2205.11404
- https://arxiv.org/abs/2206.03551
### FNO and diverse geometry
- https://openreview.net/forum?id=c8P9NQVtmnO
- https://arxiv.org/abs/2207.05209
- https://arxiv.org/abs/2212.04689
- https://arxiv.org/abs/2305.00478
- https://arxiv.org/abs/2306.05697
- https://arxiv.org/abs/2305.19663
### Other NO
- https://openreview.net/forum?id=LZDiWaC9CGL
- https://arxiv.org/abs/2205.10573
- https://arxiv.org/abs/2205.02191
- https://openreview.net/forum?id=kIo_C6QmMOM
- https://arxiv.org/abs/2303.10528
- https://arxiv.org/abs/2302.05925
## Install the project
```bash
git clone https://gitlab.inria.fr/sciml/scimba.git
cd scimba
```
## Install the basic package
```bash
pip install -e .
```
If you want the differential physics features, run:
```bash
pip install -e ".[diff_physic]"
```
## Full install
```bash
pip install -e ".[all]"
```
## Launch tests
```bash
pip install -e ".[test]"
pytest
```
## Generate documentation
```bash
pip install -e ".[doc]"
cd docs
env PYTORCH_JIT=0 make html
```
The HTML docs are generated in `_build/html`.
Raw data
{
"_id": null,
"home_page": null,
"name": "scimba",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.9",
"maintainer_email": null,
"keywords": null,
"author": null,
"author_email": "Emmanuel Franck <emmanuel.franck@inria.fr>, Victor Michel-Dansac <victor.michel-dansac@inria.fr>",
"download_url": "https://files.pythonhosted.org/packages/8e/89/f529798426f1bec76095627a520f77bf61545cae3b7c84ddca9544a1272f/scimba-0.5.tar.gz",
"platform": null,
"description": "# ScimBa\n\n[](https://gitlab.inria.fr/scimba/scimba/-/commits/main)\n[](https://sciml.gitlabpages.inria.fr/scimba/coverage)\n[](https://gitlab.inria.fr/scimba/scimba/-/releases)\n[](hhttps://scimba.gitlabpages.inria.fr/scimba/)\n\nThis librairies impliment varying SciML methods for varying PDE problem and also tools to build hybrid numerical methods.\n\n## Current Content\n\n- **Nets**: MLP networks, Discontinuous MLP, RBF networks, some activations functions and a basic trainer\n- **Sampling and domain**: general uniform sampling methods for PINNs and Neural Operators. Sampling based on approximated signed distance function for general geometries.\n- **PDEs**: the librairiy implement diff\u00e9rent type of model: ODE, spatial pde, times-apce pde, stationary kinetic PDE and kinetic PDE.\n- **Specific networks for PINNs**: For all the equations we implement PINNs networks based on: MLP, Discontinuous MLP and nonlinear radial basis function.\nWe implement also the Fourier network with general features (Fourier but also Gaussian etc)\n- **Trainer**: for each type of PDE we gives a specific trainer.\n- **Generative Nets**: Normalized flows, Gaussian mixture. The classical and conditional approaches are proposed. Trainer based on the maximum likelihood principle.\n- **Neural Operator**: Continuous Auto-encoder based on PointNet encoder and coordinate based decoder. Physic informed DeepOnet for ODE, spatial and time space PDE.\n- **Neural Galerkin**: Method Neural Galerkin for time PDE based on the same network than PINNs.\n\n\n## Ongoing work for 2024\n- **Nets**: New activation function used for implicit representation, Symbolic models, Sindy\n- **Sampling and domain**: learning of signed distance function using PINNs, adaptive sampling\n- **Specific networks for PINNs**: Multiscale architecture, spectral architecture for kinetic, specific architecture.\n- **Trainer**: Trainer with sparsity constraints and globalization method. 
Loss Balancing\n- **Generative Nets**: Energy models, score matching, more complex normalized flow, Continuous VAE\n- **Neural Operator**: physic informed DeepGreen operator, FNO, GINO based on FNO, NO with neural implicit representation. Kinetic case\n- **Neural Galerkin**: Adaptive sampling, randomization, Least Square solver, implicit scheme. CROM Space time reduced Galerkin model. Greedy basis.\n\n\n## References\n\n### PINNs and MLP\n\n- https://www.sciencedirect.com/science/article/abs/pii/S0021999118307125\n- https://arxiv.org/abs/2209.03984\n- https://arxiv.org/abs/1912.00873\n- https://arxiv.org/abs/1912.00873\n- https://arxiv.org/abs/2109.01050\n- https://arxiv.org/abs/2203.01360\n- https://arxiv.org/abs/2103.09959\n- https://openreview.net/forum?id=vsMyHUq_C1c\n\n### Neural Galerkin\n\n- https://arxiv.org/abs/2306.15630\n- https://arxiv.org/abs/2306.03749\n- https://arxiv.org/abs/2207.13828\n- https://arxiv.org/abs/2201.07953\n- https://arxiv.org/abs/2104.13515\n\n### DeepOnet\n\n- https://www.nature.com/articles/s42256-021-00302-5\n- https://www.science.org/doi/10.1126/sciadv.abi8605\n- https://arxiv.org/abs/2205.11404\n- https://arxiv.org/abs/2206.03551\n\n### FNO and diverse geometry\n\n- https://openreview.net/forum?id=c8P9NQVtmnO\n- https://arxiv.org/abs/2207.05209\n- https://arxiv.org/abs/2212.04689\n- https://arxiv.org/abs/2305.00478\n- https://arxiv.org/abs/2306.05697\n- https://arxiv.org/abs/2305.19663\n\n### Other NO\n\n- https://openreview.net/forum?id=LZDiWaC9CGL\n- https://arxiv.org/abs/2205.10573\n- https://arxiv.org/abs/2205.02191\n- https://openreview.net/forum?id=kIo_C6QmMOM\n- https://arxiv.org/abs/2303.10528\n- https://arxiv.org/abs/2302.05925\n\n## Install the project\n\n```bash\ngit clone https://gitlab.inria.fr/sciml/scimba.git\ncd scimba\n```\n\n## Install the basic package\n\n```bash\npip install -e .\n```\n\nif you want the differential physic aspect we must run:\n\n```bash\npip install -e \".[diff_physic]\"\n```\n\n## Full 
install\n\n```bash\npip install -e \".[all]\"\n```\n\n## Launch tests\n\n```bash\npip install -e \".[test]\"\npytest\n```\n\n## Generate documentation\n\n```bash\npip install -e \".[doc]\"\ncd docs\nenv PYTORCH_JIT=0 make html\n```\nhtml docs are generated in \\_build/html\n",
"bugtrack_url": null,
"license": "MIT",
"summary": "This library implements some common tools for scientific machine learning",
"version": "0.5",
"project_urls": {
"Homepage": "https://gitlab.inria.fr/scimba/scimba"
},
"split_keywords": [],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "1d9691befdbb065a5d5cbd8e0d095262710fbdf451460e1b8f5231852aa3d576",
"md5": "e8824dfd52c3abf794df69059b0e9e02",
"sha256": "b23c017146d60814ce5c474cc20afaf408d29d70d5f7dbb945da26f9ea3273cd"
},
"downloads": -1,
"filename": "scimba-0.5-py3-none-any.whl",
"has_sig": false,
"md5_digest": "e8824dfd52c3abf794df69059b0e9e02",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.9",
"size": 150425,
"upload_time": "2024-10-25T09:58:41",
"upload_time_iso_8601": "2024-10-25T09:58:41.755141Z",
"url": "https://files.pythonhosted.org/packages/1d/96/91befdbb065a5d5cbd8e0d095262710fbdf451460e1b8f5231852aa3d576/scimba-0.5-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "8e89f529798426f1bec76095627a520f77bf61545cae3b7c84ddca9544a1272f",
"md5": "5847a19cb19fc7eece8487af5e96323a",
"sha256": "5aba855b7df89510d709f8777c00b8fe4446a4bb691935f0cdaccd4d8926cec6"
},
"downloads": -1,
"filename": "scimba-0.5.tar.gz",
"has_sig": false,
"md5_digest": "5847a19cb19fc7eece8487af5e96323a",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.9",
"size": 102551,
"upload_time": "2024-10-25T09:58:43",
"upload_time_iso_8601": "2024-10-25T09:58:43.637234Z",
"url": "https://files.pythonhosted.org/packages/8e/89/f529798426f1bec76095627a520f77bf61545cae3b7c84ddca9544a1272f/scimba-0.5.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-10-25 09:58:43",
"github": false,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"lcname": "scimba"
}