# So3krates-torch
> [!IMPORTANT]
> The code is a work in progress! There may be breaking changes!
Lightweight implementation of the So3krates model in PyTorch. This package is mostly intended for [aims-PAX](https://github.com/tohenkes/aims-PAX), but it is a functional implementation of [So3krates](https://github.com/thorben-frank/mlff) and [SO3LR](https://github.com/general-molecular-simulations/so3lr) in PyTorch. For now it uses (modified) source code from the [MACE](https://github.com/ACEsuit/mace) package and follows its style, so many functions are actually compatible.
#### Installation
1. activate your environment
2. clone this repository
3. move to the cloned repository
4. `pip install -r requirements.txt`
5. `pip install .`
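The steps above can be run as follows (assuming `git` and `pip` are available and your environment is already active):

```shell
# clone and install So3krates-torch into the active environment
git clone https://github.com/tohenkes/So3krates-torch.git
cd So3krates-torch
pip install -r requirements.txt
pip install .
```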
#### Implemented features:
1. ASE calculator for MD (including pre-trained SO3LR)
2. Inference over ASE-readable datasets: `torchkrates-eval`
3. Error metrics over ASE-readable datasets: `torchkrates-test`
4. Converting parameters between PyTorch and JAX formats: `torchkrates-jax2torch` or `torchkrates-torch2jax`
5. Training is WIP, but `train` in `tools.train` already works, so you can easily build your own script
> [!IMPORTANT]
> Item 4 means that you can convert the weights from this PyTorch version to the JAX version and vice versa. Inference and training are much faster (*at least one order of magnitude*) in the JAX version. This implementation is mostly for prototyping and compatibility with other packages.
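For illustration, moving weights between the two frameworks boils down to mapping tensors to arrays. The sketch below is *not* the `torchkrates-torch2jax` implementation, just the general idea; the function name `state_dict_to_numpy` is made up:

```python
import torch

def state_dict_to_numpy(state_dict):
    """Convert a PyTorch state_dict into a flat dict of NumPy arrays.

    JAX consumes NumPy arrays directly, so this is the usual interchange
    step before renaming/reshaping parameters into a JAX pytree.
    """
    return {name: t.detach().cpu().numpy() for name, t in state_dict.items()}

layer = torch.nn.Linear(4, 2)
params = state_dict_to_numpy(layer.state_dict())
print(params["weight"].shape)  # (2, 4)
```

The reverse direction works the same way: NumPy arrays are wrapped back into `torch.Tensor`s via `torch.from_numpy` and loaded with `load_state_dict`.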
## TODO
- [ ] training
  - [x] Hirshfeld loss
  - [ ] load training params from YAML
  - [ ] script
- [ ] finetuning
- [ ] save and load hyperparameter JSON from torchkrates
- [ ] enable TorchScript (important for OpenMM)
## Cite
If you are using the models implemented here, please cite:
```bibtex
@article{kabylda2024molecular,
  title={Molecular Simulations with a Pretrained Neural Network and Universal Pairwise Force Fields},
  author={Kabylda, A. and Frank, J. T. and Dou, S. S. and Khabibrakhmanov, A. and Sandonas, L. M.
          and Unke, O. T. and Chmiela, S. and M{\"u}ller, K.R. and Tkatchenko, A.},
  journal={ChemRxiv},
  year={2024},
  doi={10.26434/chemrxiv-2024-bdfr0-v2}
}

@article{frank2024euclidean,
  title={A Euclidean transformer for fast and stable machine learned force fields},
  author={Frank, Thorben and Unke, Oliver and M{\"u}ller, Klaus-Robert and Chmiela, Stefan},
  journal={Nature Communications},
  volume={15},
  number={1},
  pages={6539},
  year={2024}
}
```
Also consider citing MACE, as this software heavily leans on or uses its code:
```bibtex
@inproceedings{Batatia2022mace,
  title={{MACE}: Higher Order Equivariant Message Passing Neural Networks for Fast and Accurate Force Fields},
  author={Ilyes Batatia and David Peter Kovacs and Gregor N. C. Simm and Christoph Ortner and Gabor Csanyi},
  booktitle={Advances in Neural Information Processing Systems},
  editor={Alice H. Oh and Alekh Agarwal and Danielle Belgrave and Kyunghyun Cho},
  year={2022},
  url={https://openreview.net/forum?id=YPpSngE-ZU}
}

@misc{Batatia2022Design,
  title={The Design Space of E(3)-Equivariant Atom-Centered Interatomic Potentials},
  author={Batatia, Ilyes and Batzner, Simon and Kov{\'a}cs, D{\'a}vid P{\'e}ter and Musaelian, Albert and Simm, Gregor N. C. and Drautz, Ralf and Ortner, Christoph and Kozinsky, Boris and Cs{\'a}nyi, G{\'a}bor},
  year={2022},
  number={arXiv:2205.06643},
  eprint={2205.06643},
  eprinttype={arxiv},
  doi={10.48550/arXiv.2205.06643},
  archiveprefix={arXiv}
}
```
## Contact
If you have questions, you can reach me at: tobias.henkes@uni.lu
For bugs or feature requests, please use [GitHub Issues](https://github.com/tohenkes/So3krates-torch/issues).
## License
The code is published and distributed under the [MIT License](MIT.md).