JaxSSO


Name: JaxSSO
Version: 0.0.5
Home page: https://github.com/GaoyuanWu/JaxSSO
Summary: A framework for structural shape optimization based on automatic differentiation (AD) and the adjoint method, enabled by JAX
Upload time: 2023-04-04 14:39:29
Author: Gaoyuan Wu
Requires Python: >=3.7
Keywords: jax, automatic-differentiation, shape optimization, form-finding, structural optimization
# JaxSSO
A framework for structural shape optimization based on automatic differentiation (AD) and the adjoint method, enabled by [JAX](https://github.com/google/jax).

Developed by [Gaoyuan Wu](https://gaoyuanwu.github.io/) @ Princeton.


We have a [preprint](https://doi.org/10.48550/arXiv.2211.15409) under review with full details of this framework.
If you find the project interesting and helpful, please share it with others and cite us:
```bibtex
@misc{https://doi.org/10.48550/arxiv.2211.15409,
  doi = {10.48550/ARXIV.2211.15409},
  url = {https://arxiv.org/abs/2211.15409},
  author = {Wu, Gaoyuan},
  title = {A framework for structural shape optimization based on automatic differentiation, the adjoint method and accelerated linear algebra},
  publisher = {arXiv},
  year = {2022},
}

```
## Features
* Automatic differentiation (AD): an easy and accurate way to evaluate gradients. AD avoids deriving derivatives by hand and the truncation errors of numerical differentiation.
* Accelerated linear algebra (XLA) and just-in-time (JIT) compilation: these JAX features speed up gradient evaluation.
* Hardware acceleration: runs on GPUs and TPUs for a **faster** experience.
* Form finding based on finite element analysis (FEA) and optimization theory.

Here is an example of using JaxSSO to form-find a structure inspired by [Mannheim Multihalle](https://mannheim-multihalle.de/en/architecture/) with simple gradient descent. (First photo credit: Daniel Lukac.)
![alt text](https://github.com/GaoyuanWu/JaxSSO/blob/main/data/images/MannheimMultihalle.jpg)
![alt text](https://github.com/GaoyuanWu/JaxSSO/blob/main/data/images/MM_opt.jpg)
## Background: shape optimization
We consider minimizing the ***strain energy*** by changing the **shape** of the structure, which is equivalent to maximizing its stiffness and reducing bending. The mathematical formulation of this problem, with no additional constraints, is as follows:
$$\text{minimize} \quad C(\mathbf{x}) = \frac{1}{2}\int\sigma\epsilon \mathrm{d}V = \frac{1}{2}\mathbf{f}^\mathrm{T}\mathbf{u}(\mathbf{x}) $$
$$\text{subject to: } \quad \mathbf{K}(\mathbf{x})\mathbf{u}(\mathbf{x}) =\mathbf{f}$$
where $C$ is the compliance, equal to the work done by the external load; $\mathbf{x} \in \mathbb{R}^{n_d}$ is a vector of $n_d$ design variables that determine the shape of the structure; $\sigma$, $\epsilon$ and $V$ are the stress, strain and volume, respectively; $\mathbf{f} \in \mathbb{R}^{6n}$ and $\mathbf{u}(\mathbf{x}) \in \mathbb{R}^{6n}$ are the generalized load vector and nodal displacements of the $n$ structural nodes (six degrees of freedom per node); $\mathbf{K} \in \mathbb{R}^{6n\times6n}$ is the stiffness matrix. The constraint is the governing equation of finite element analysis (FEA).
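As a sanity check on these definitions, the identity $C = \frac{1}{2}\mathbf{f}^\mathrm{T}\mathbf{u} = \frac{1}{2}\mathbf{u}^\mathrm{T}\mathbf{K}\mathbf{u}$ can be verified on a toy system (the 2×2 stiffness matrix and load below are made up for illustration and are not from JaxSSO):

```python
import numpy as np

# Toy 2-DOF system (illustrative values only).
K = np.array([[3.0, -1.0],
              [-1.0, 2.0]])   # symmetric positive-definite stiffness matrix
f = np.array([1.0, 0.5])      # generalized load vector

u = np.linalg.solve(K, f)     # FEA: solve K u = f
C = 0.5 * f @ u               # compliance = half the external work

print(u)  # [0.5 0.5]
print(C)  # 0.375, identical to the strain energy 0.5 * u @ K @ u
```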

To implement **gradient-based optimization**, one needs $\nabla C$. Applying the ***adjoint method***, each entry of $\nabla C$ is
$$\frac{\partial C}{\partial x_i}=-\frac{1}{2}\mathbf{u}^\mathrm{T}\frac{\partial \mathbf{K}}{\partial x_i}\mathbf{u}$$
The adjoint method i) reduces the computational complexity and ii) decouples the FEA from the derivative calculation of the stiffness matrix $\mathbf K$.
To get $\nabla C$:
1. Conduct FEA to get $\mathbf u$.
2. Conduct sensitivity analysis to get $\frac{\partial \mathbf{K}}{\partial x_i}$.
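The two steps above can be checked on a toy system. In this sketch the stiffness is given a hypothetical linear parameterization $\mathbf{K}(x) = \mathbf{K}_0 + x\,\mathbf{K}_1$ (so $\partial\mathbf{K}/\partial x = \mathbf{K}_1$), and the adjoint gradient is compared against central finite differences of $C(x)$; this is not JaxSSO's API, just the formula itself:

```python
import numpy as np

# Hypothetical parameterization K(x) = K0 + x*K1, so dK/dx = K1.
K0 = np.array([[3.0, -1.0],
               [-1.0, 2.0]])
K1 = np.eye(2)
f = np.array([1.0, 0.5])

def compliance(x):
    u = np.linalg.solve(K0 + x * K1, f)  # step 1: FEA gives u
    return 0.5 * f @ u, u

x = 0.7
C, u = compliance(x)
dC_adjoint = -0.5 * u @ K1 @ u           # step 2: adjoint formula

# Cross-check against central finite differences of C(x).
h = 1e-6
dC_fd = (compliance(x + h)[0] - compliance(x - h)[0]) / (2 * h)
print(dC_adjoint, dC_fd)                 # the two values agree
```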

## Usage

### Installation
Install it with pip: `pip install JaxSSO`

### Dependencies
JaxSSO is written in Python and requires:
* [numpy](https://numpy.org/doc/stable/index.html) >= 1.22.0.
* [JAX](https://jax.readthedocs.io/en/latest/index.html): "JAX is [Autograd](https://github.com/hips/autograd) and [XLA](https://www.tensorflow.org/xla), brought together for high-performance machine learning research." Please refer to [this link](https://github.com/google/jax#installation) for the installation of JAX.
* [Nlopt](https://nlopt.readthedocs.io/en/latest/): Nlopt is a library for nonlinear optimization. It has a Python interface, which is used here. Refer to [this link](https://nlopt.readthedocs.io/en/latest/NLopt_Installation/) for the installation of Nlopt. Alternatively, you can use `pip install nlopt`; please refer to [nlopt-python](https://pypi.org/project/nlopt/).


### Quickstart
The project provides interactive Google Colab examples for a quick start:
* [2D arch](https://github.com/GaoyuanWu/JaxSSO/blob/main/Examples/Arch_2D.ipynb): form-finding of a 2D arch
* [3D arch](https://github.com/GaoyuanWu/JaxSSO/blob/main/Examples/Arch_3D.ipynb): form-finding of a 3D arch
* [Mannheim Multihalle](https://github.com/GaoyuanWu/JaxSSO/blob/main/Examples/Mannheim_Multihalle.ipynb): form-finding of Mannheim Multihalle
* [Four-point supported gridshell](https://github.com/GaoyuanWu/JaxSSO/blob/main/Examples/FourPt_FreeForm.ipynb): form-finding of a gridshell with four corner nodes pinned. The geometry is parameterized by a Bézier surface.
* [Two-edge supported canopy, unconstrained](https://github.com/GaoyuanWu/JaxSSO/blob/main/Examples/TwoEdge_FreeForm_Unconstrained.ipynb): form-finding of a canopy. The geometry is parameterized by a Bézier surface.
* [Two-edge supported canopy, constrained](https://github.com/GaoyuanWu/JaxSSO/blob/main/Examples/TwoEdge_FreeForm_Constrained.ipynb): form-finding of a canopy with height constraints. The geometry is parameterized by a Bézier surface.
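To connect these notebooks back to the theory, here is a hedged, self-contained toy of the same workflow: a two-bar truss with supports at (0, 0) and (2, 0), a free node at (1, z) under a unit vertical load, and the node height z as the single design variable, form-found by simple gradient descent with the adjoint gradient. The model and numbers are illustrative and are not JaxSSO's API; for this truss the optimal height is analytically z = √2.

```python
import numpy as np

EA, P = 1.0, 1.0  # illustrative axial stiffness and load magnitude

def stiffness(z):
    # Two-bar truss: supports at (0,0) and (2,0), free node at (1,z).
    # Assembling (EA/L) n n^T for both bars gives a diagonal 2x2 matrix.
    L = np.hypot(1.0, z)
    return (2.0 * EA / L**3) * np.diag([1.0, z**2])

f = np.array([0.0, -P])  # unit vertical load at the free node

def grad_C(z, h=1e-6):
    u = np.linalg.solve(stiffness(z), f)                  # step 1: FEA
    dK = (stiffness(z + h) - stiffness(z - h)) / (2 * h)  # step 2: dK/dz
    return -0.5 * u @ dK @ u                              # adjoint formula

z, lr = 0.5, 0.05
for _ in range(500):
    z -= lr * grad_C(z)  # simple gradient descent on the node height
print(z)  # approaches the analytic optimum z = sqrt(2)
```

Form-finding in the notebooks follows the same loop, with many nodal coordinates as design variables and JAX evaluating the stiffness sensitivities.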



            
