# JaxSSO
A framework for structural optimization based on automatic differentiation (AD) and the adjoint method, enabled by [JAX](https://github.com/google/jax).
Developed by [Gaoyuan Wu](https://gaoyuanwu.github.io/) @ Princeton.
## Features
* Automatic differentiation (AD): an easy and accurate way to evaluate gradients. AD avoids both manual derivation of derivatives and the truncation errors of numerical differentiation.
* Accelerated linear algebra (XLA) and just-in-time (JIT) compilation: these JAX features speed up gradient evaluation.
* Hardware acceleration: runs on GPUs and TPUs for a **faster** experience.
* Supports beam-column elements and MITC-4 quadrilateral shell elements
* Shape optimization, size optimization and topology optimization
* Seamless integration with machine learning (ML) libraries
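
To illustrate the first feature, here is a minimal sketch of the AD mechanism JaxSSO builds on: `jax.grad` returns the exact derivative of a function, with no hand-derived formulas and no finite-difference truncation error. The toy strain-energy function below is purely illustrative and is not part of the JaxSSO API.

```python
import jax
import jax.numpy as jnp

def strain_energy(u, k=10.0):
    # Quadratic energy of a linear spring: 0.5 * k * u^2 (illustrative toy)
    return 0.5 * k * u ** 2

# jax.grad yields the exact derivative dE/du = k * u; jax.jit compiles it via XLA.
grad_energy = jax.jit(jax.grad(strain_energy))
print(grad_energy(0.3))  # k * u = 10.0 * 0.3 = 3.0
```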
## Usage
### Installation
Install it with pip: `pip install JaxSSO`
### Dependencies
JaxSSO is written in Python and requires:
* [numpy](https://numpy.org/doc/stable/index.html) >= 1.22.0.
* [JAX](https://jax.readthedocs.io/en/latest/index.html): "JAX is [Autograd](https://github.com/hips/autograd) and [XLA](https://www.tensorflow.org/xla), brought together for high-performance machine learning research." Please refer to [this link](https://github.com/google/jax#installation) for the installation of JAX.
* [Nlopt](https://nlopt.readthedocs.io/en/latest/): a library for nonlinear optimization; JaxSSO uses its Python interface. Refer to [this link](https://nlopt.readthedocs.io/en/latest/NLopt_Installation/) for installation instructions, or install it with `pip install nlopt` (see [nlopt-python](https://pypi.org/project/nlopt/)).
* [scipy](https://scipy.org/).
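
As a hedged sketch of how these optimization dependencies are used, the snippet below feeds an objective and its exact gradient to a gradient-based SciPy optimizer; in JaxSSO the gradient in this role would come from AD (and Nlopt could be used in place of SciPy). The quadratic objective is illustrative only.

```python
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Toy objective with minimum at x = [2, -1]
    return float((x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2)

def gradient(x):
    # Exact gradient supplied to the optimizer; AD would provide this in practice
    return np.array([2.0 * (x[0] - 2.0), 2.0 * (x[1] + 1.0)])

res = minimize(objective, x0=np.zeros(2), jac=gradient, method="L-BFGS-B")
print(res.x)  # converges to approximately [2, -1]
```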
### Quickstart
Interactive Google Colab examples for a quick start are coming soon.
Please share our project with others and cite us if you find it interesting and helpful.
The details of our current work are under review.
Our previous work can be seen in this [paper](https://link.springer.com/article/10.1007/s00158-023-03601-0) in [Structural and Multidisciplinary Optimization](https://www.springer.com/journal/158).
Cite our previous work using:
```bibtex
@article{wu_framework_2023,
title = {A framework for structural shape optimization based on automatic differentiation, the adjoint method and accelerated linear algebra},
volume = {66},
issn = {1615-1488},
url = {https://doi.org/10.1007/s00158-023-03601-0},
doi = {10.1007/s00158-023-03601-0},
language = {en},
number = {7},
urldate = {2023-06-21},
journal = {Structural and Multidisciplinary Optimization},
author = {Wu, Gaoyuan},
month = jun,
year = {2023},
keywords = {Adjoint method, Automatic differentiation, Bézier surface, Form finding, JAX, Shape optimization, Shell structure},
pages = {151},
}
```