folax

Name: folax
Version: 0.0.2
Home page: None
Summary: A framework for solving and optimizing PDEs by integrating machine learning with numerical methods in computational mechanics.
Upload time: 2025-08-20 08:26:52
Maintainer: None
Docs URL: None
Author: Reza Najian Asl <reza.najian-asl@tum.de>
Requires Python: >=3.10
License: BSD 4-Clause License. Copyright (c) 2024, Reza Najian Asl, Shahed Rezaei. All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 3. All advertising materials mentioning features or use of this software must display the following acknowledgement: This product includes software developed by Reza Najian Asl, Shahed Rezaei. 4. Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY COPYRIGHT HOLDER "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL COPYRIGHT HOLDER BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
Keywords: numerical-simulation-optimization, pde-solver, physics-informed-neural-networks (PINNs), scientific machine learning (SciML)
Requirements: No requirements were recorded.
<p align=center><img height="54.125%" width="54.125%" src="https://github.com/RezaNajian/eFOL/assets/62375973/0e1ca4e0-0658-4f5d-aad9-1ae7c9f67574"></p>

[![License][license-image]][license] 
[![CI](https://github.com/RezaNajian/folax/actions/workflows/CI.yml/badge.svg)](https://github.com/RezaNajian/folax/actions/workflows/CI.yml)
[![PyPI version](https://badge.fury.io/py/folax.svg)](https://pypi.org/project/folax/)


[license-image]: https://img.shields.io/badge/license-BSD-green.svg?style=flat
[license]: https://github.com/RezaNajian/FOL/LICENSE

# Folax: Solution and Optimization of Parameterized PDEs
**F**inite **O**perator **L**earning (FOL) with [**JAX**](https://github.com/jax-ml/jax) constitutes a unified numerical framework that seamlessly integrates established numerical methods with advanced scientific machine learning techniques for solving and optimizing parametrized partial differential equations (PDEs). To construct a physics-informed operator-learning approach, FOL formulates a purely physics-based loss function derived from the Method of Weighted Residuals, allowing discrete residuals—computed using classical PDE solution techniques—to be directly incorporated into backpropagation during network training. This ensures that the learned operators rigorously satisfy the underlying governing equations while maintaining consistency with established numerical discretizations. Importantly, this loss formulation is agnostic to the network architecture and has been successfully applied to architectures such as Conditional Neural Fields, Fourier Neural Operators (FNO), and DeepONets.
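
To make the loss formulation concrete, the sketch below shows—in plain JAX, not the Folax API—how a discrete residual from a classical discretization can be fed into a network's training loss. The assembled system, the tiny stand-in "network", and all names (`assemble_system`, `params`, etc.) are illustrative assumptions, not Folax code.

```python
import jax
import jax.numpy as jnp

# Illustrative sketch only. Suppose a classical discretization (e.g. finite
# elements) yields a parameter-dependent system K(theta) u = f.
def assemble_system(theta):
    K = jnp.array([[2.0 + theta, -1.0], [-1.0, 2.0 + theta]])
    f = jnp.array([1.0, 0.0])
    return K, f

def discrete_residual(u, theta):
    # Discrete weighted-residual vector r(u; theta) = K(theta) u - f
    K, f = assemble_system(theta)
    return K @ u - f

def physics_loss(params, theta):
    # Stand-in "network": a linear map from the parameter theta to the nodal solution u.
    u_pred = params["W"] * theta + params["b"]
    # Purely physics-based loss: squared norm of the discrete residual.
    return jnp.sum(discrete_residual(u_pred, theta) ** 2)

params = {"W": jnp.zeros(2), "b": jnp.zeros(2)}
# Gradients flow through the residual assembly into the network parameters.
grads = jax.grad(physics_loss)(params, 0.5)
```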

FOL has been applied in the following scientific studies:
- A Physics-Informed Meta-Learning Framework for the Continuous Solution of Parametric PDEs on Arbitrary Geometries [[arXiv](https://arxiv.org/abs/2504.02459)].
- Finite Operator Learning: Bridging Neural Operators and Numerical Methods for Efficient Parametric Solution and Optimization of PDEs [[arXiv](https://arxiv.org/abs/2407.04157)].
- Digitalizing metallic materials from image segmentation to multiscale solutions via physics informed operator learning [[npj Computational Materials](https://www.nature.com/articles/s41524-025-01718-y)].
- A Finite Operator Learning Technique for Mapping the Elastic Properties of Microstructures to Their Mechanical Deformations [[Numerical Methods in Eng.](https://onlinelibrary.wiley.com/doi/full/10.1002/nme.7637)].
- SPiFOL: A Spectral-based physics-informed finite operator learning for prediction of mechanical behavior of microstructures [[J. Mechanics and Physics of Solids](https://www.sciencedirect.com/science/article/pii/S0022509625001954)].

Folax builds upon several widely adopted Python packages, including [JAX](https://github.com/jax-ml/jax) for high-performance array computations on CPUs and GPUs, [PETSc](https://petsc.org/release/) for the efficient solution of large-scale linear systems, [Metis](https://github.com/KarypisLab/METIS) for mesh partitioning (integration forthcoming), [Flax](https://github.com/google/flax?tab=readme-ov-file) for constructing modular and flexible neural networks, [Optax](https://github.com/google-deepmind/optax) for applying state-of-the-art gradient-based optimization algorithms, and [Orbax](https://github.com/google/orbax) for efficient checkpointing and serialization. This foundation ensures scalability, computational efficiency, and ease of use in large-scale training and simulation workflows.
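
As a rough picture of how such a stack fits together (a minimal sketch, not Folax's own model or training definitions; the layer sizes and input dimension are assumptions):

```python
import jax
import jax.numpy as jnp
import flax.linen as nn
import optax

# Illustrative MLP; in Folax the network could be a conditional neural field,
# FNO, or DeepONet as described above.
class MLP(nn.Module):
    features: tuple = (64, 64, 1)

    @nn.compact
    def __call__(self, x):
        for width in self.features[:-1]:
            x = nn.tanh(nn.Dense(width)(x))
        return nn.Dense(self.features[-1])(x)

model = MLP()
params = model.init(jax.random.PRNGKey(0), jnp.ones((1, 3)))  # 3 input features assumed

# Optax supplies the gradient-based optimizer; Orbax (not shown) would
# checkpoint the parameters and optimizer state during long trainings.
optimizer = optax.adam(learning_rate=1e-3)
opt_state = optimizer.init(params)
```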

## Installation
### CPU installation 
To install folax with pip (recommended) for CPU usage, run:

``pip install folax[cpu]``

### GPU installation
To install folax with pip (recommended) for GPU usage, run:

``pip install folax[cuda]``
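
After either installation, you can confirm which backend JAX picked up (this is generic JAX usage, not a Folax-specific command):

```python
import jax

# Expect CPU devices for the [cpu] install and CUDA/GPU devices for [cuda].
print(jax.devices())
```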

### Developer installation
If you would like to develop folax, first clone the repository and, from the folax folder, run:

``pip install -e .[cuda,dev]``

## Contributing
If you would like to contribute to the project, please open a pull request for small changes. For larger changes to the source code, please open an issue or discussion first so we can start a conversation.


            
