| Field | Value |
| --- | --- |
| Name | quests |
| Version | 2024.6.21 |
| Summary | Quick Uncertainty and Entropy from STructural Similarity |
| home_page | None |
| upload_time | 2024-06-21 16:31:54 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.8 |
| license | BSD 3-Clause License Copyright (c) 2024, Lawrence Livermore National Security, LLC All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. * Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. |
| keywords | feed, reader, tutorial |
| requirements | No requirements were recorded. |
# QUESTS: Quick Uncertainty and Entropy via STructural Similarity
QUESTS provides model-free uncertainty and entropy estimation methods for interatomic potentials.
Among the methods, we propose a structural descriptor based on k-nearest neighbors that:
1. Is fast to compute, as it uses only distances between atoms within an environment.
Because the computation of descriptors is efficiently parallelized, generating descriptors for 1.5M environments takes about 3 seconds on 56 threads (tested on Intel Xeon CLX-8276L CPUs).
2. Can be used to generate distributions and, in combination with information theory, gives rise to entropy values.
3. Is shown to recover thermodynamic quantities, order parameters, and many useful properties of information theory.
This package also contains metrics to quantify the diversity of a dataset using this descriptor, and tools to interface with other representations and packages.
## Installation
### From pip
```bash
pip install quests
```
### From repository
To install the `quests` package directly from the repository, clone it from GitHub and use `pip` to install it into your virtual environment:
```bash
git clone https://github.com/dskoda/quests.git
cd quests
pip install .
```
## Usage
### Command line
Once installed, you can use the `quests` command to perform different analyses. For example, to compute the entropy of any dataset (the input can be anything that ASE reads, including xyz files), you can use the `quests entropy` command:
```bash
quests entropy dump.lammpstrj --bandwidth 0.015
```
To estimate the entropy from a subsample instead of the entire dataset, use the `entropy_sampler` command:
```bash
quests entropy_sampler dataset.xyz --batch_size 20000 -s 100000 -n 3
```
`-s` specifies the number of sampled environments; `-n` specifies how many runs are computed (for statistics).
For additional help with these commands, use `quests --help`, `quests entropy --help`, and so on.
### API
#### Computing descriptors and dataset entropy
To use the QUESTS package to create descriptors and compute entropies, you can use the [descriptor](quests/descriptor.py) and [entropy](quests/entropy.py) submodules:
```python
from ase.io import read
from quests.descriptor import get_descriptors
from quests.entropy import perfect_entropy, diversity
dset = read("dataset.xyz", index=":")
x = get_descriptors(dset, k=32, cutoff=5.0)
h = 0.015
batch_size = 10000
H = perfect_entropy(x, h=h, batch_size=batch_size)
D = diversity(x, h=h, batch_size=batch_size)
```
In this example, descriptors are created using 32 nearest neighbors and a 5.0 Å cutoff.
The entropy and diversity are computed using a Gaussian kernel (the default) with a bandwidth of 0.015 1/Å and a batch size of 10,000.
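Because descriptor generation and entropy estimation are separate steps, the descriptors can be cached and reused across runs. The snippet below is a minimal sketch, assuming `get_descriptors` returns a NumPy array (as suggested by the `torch.tensor` conversion further below); the file name is a placeholder:

```python
import numpy as np
from ase.io import read
from quests.descriptor import get_descriptors

dset = read("dataset.xyz", index=":")
x = get_descriptors(dset, k=32, cutoff=5.0)

# Cache the descriptors so later entropy/diversity runs can skip this step
np.save("descriptors.npy", x)

# ... later, reload them without touching the original dataset
x = np.load("descriptors.npy")
```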
#### Computing differential entropies
```python
from ase.io import read
from quests.descriptor import get_descriptors
from quests.entropy import delta_entropy
dset_x = read("reference.xyz", index=":")
dset_y = read("test.xyz", index=":")
k, cutoff = 32, 5.0
x = get_descriptors(dset_x, k=k, cutoff=cutoff)
y = get_descriptors(dset_y, k=k, cutoff=cutoff)
# computes dH (Y | X)
dH = delta_entropy(y, x, h=0.015)
```
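The resulting dH(Y|X) can serve as an uncertainty signal for the test set. Continuing from the snippet above, the sketch below assumes `delta_entropy` returns one value per environment of `y`; the threshold is a hypothetical placeholder, not a recommended value:

```python
import numpy as np

# Hypothetical cutoff in the same units as dH; tune for your application
threshold = 1.0

dH = np.asarray(dH)
novel = dH > threshold  # environments poorly represented by the reference set
print(f"{novel.sum()} of {novel.size} test environments exceed the threshold")
```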
#### Computing approximate differential entropies
```python
from ase.io import read
from quests.descriptor import get_descriptors
from quests.entropy import approx_delta_entropy
dset_x = read("reference.xyz", index=":")
dset_y = read("test.xyz", index=":")
k, cutoff = 32, 5.0
x = get_descriptors(dset_x, k=k, cutoff=cutoff)
y = get_descriptors(dset_y, k=k, cutoff=cutoff)
# approximates dH (Y | X)
# n = 5 and graph_neighbors = 10 are arguments for
# pynndescent, which performs an approximate nearest
# neighbor search for dH
dH = approx_delta_entropy(y, x, h=0.015, n=5, graph_neighbors=10)
```
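To gauge how well the approximation behaves on a given dataset, the approximate values can be compared against the exact ones. This is only a sketch under the assumption that both functions return one dH value per environment of `y`:

```python
import numpy as np
from ase.io import read
from quests.descriptor import get_descriptors
from quests.entropy import delta_entropy, approx_delta_entropy

dset_x = read("reference.xyz", index=":")
dset_y = read("test.xyz", index=":")
x = get_descriptors(dset_x, k=32, cutoff=5.0)
y = get_descriptors(dset_y, k=32, cutoff=5.0)

dH_exact = delta_entropy(y, x, h=0.015)
dH_approx = approx_delta_entropy(y, x, h=0.015, n=5, graph_neighbors=10)

# Largest disagreement between the exact and approximate estimates
print("max abs. deviation:", np.max(np.abs(dH_exact - dH_approx)))
```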
#### Computing the dataset entropy using PyTorch
To accelerate the entropy computation for large datasets, PyTorch can be used.
First install the optional dependencies for this repository:
```bash
pip install quests[gpu]
```
The syntax for computing the entropy with PyTorch is identical to the one above.
Instead of importing the functions from [quests.entropy](quests/entropy.py), however, import them from [quests.gpu.entropy](quests/gpu/entropy.py).
The descriptors remain the same; as of now, creating descriptors on GPUs is not supported.
This means the descriptors must be generated through the standard route and then converted into a `torch.tensor`.
```python
import torch
from ase.io import read
from quests.descriptor import get_descriptors
from quests.gpu.entropy import perfect_entropy
dset = read("dataset.xyz", index=":")
x = get_descriptors(dset, k=32, cutoff=5.0)
x = torch.tensor(x, device="cuda")
h = 0.015
batch_size = 10000
H = perfect_entropy(x, h=h, batch_size=batch_size)
```
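If the same script may also run on machines without a GPU, a small device guard (plain PyTorch, not specific to QUESTS) keeps the workflow identical on the CPU. Continuing from the example above:

```python
import torch

# Fall back to the CPU when no CUDA device is available
device = "cuda" if torch.cuda.is_available() else "cpu"
x = torch.tensor(x, device=device)
```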
### Citing
If you use QUESTS in a publication, please cite the following paper:
```bibtex
@article{schwalbekoda2024information,
title = {Information theory unifies atomistic machine learning, uncertainty quantification, and materials thermodynamics},
author = {Schwalbe-Koda, Daniel and Hamel, Sebastien and Sadigh, Babak and Zhou, Fei and Lordi, Vincenzo},
year = {2024},
journal = {arXiv:2404.12367},
url = {https://arxiv.org/abs/2404.12367},
}
```
## License
The QUESTS software is distributed under the following license: BSD-3
All new contributions must be made under this license.
SPDX: BSD-3-Clause
## Acknowledgements
This work was produced under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344, with support from LLNL's LDRD program under tracking codes 22-ERD-055 and 23-SI-006.
Code released as LLNL-CODE-858914