dxtb

- **Name:** dxtb
- **Version:** 0.0.1
- **Summary:** Fully Differentiable Approach to Extended Tight Binding
- **Home page:** https://github.com/grimme-lab/dxtb
- **Author:** Sebastian Ehlert, Marvin Friede, Christian Hölzer
- **License:** Apache-2.0
- **Requires Python:** >=3.8, <3.12
- **Keywords:** pytorch, autograd, tight-binding, xtb, computational chemistry, quantum chemistry, machine learning
- **Upload time:** 2024-06-30 10:17:50
<h1 align="center"><img src="assets/logo.png" width="300"></h1>

<h3 align="center">Fully Differentiable Extended Tight-Binding</h3>
<p align="center">- Combining semi-empirical quantum chemistry with machine learning in PyTorch -</p>

<p align="center">
  <a href="https://github.com/grimme-lab/dxtb/releases/latest">
    <img src="https://img.shields.io/github/v/release/grimme-lab/dxtb?color=orange" alt="Release"/>
  </a>
  <a href="http://www.apache.org/licenses/LICENSE-2.0">
    <img src="https://img.shields.io/badge/License-Apache%202.0-orange.svg" alt="Apache-2.0"/>
  </a>
  <!---->
  <br>
  <!---->
  <a href="https://github.com/grimme-lab/dxtb/actions/workflows/ubuntu.yaml">
    <img src="https://github.com/grimme-lab/dxtb/actions/workflows/ubuntu.yaml/badge.svg" alt="Test Status Ubuntu"/>
  </a>
  <!-- <a href="https://github.com/grimme-lab/dxtb/actions/workflows/macos.yaml">
    <img src="https://github.com/grimme-lab/dxtb/actions/workflows/macos.yaml/badge.svg" alt="Test Status macOS"/>
  </a>
  <a href="https://github.com/grimme-lab/dxtb/actions/workflows/windows.yaml">
    <img src="https://github.com/grimme-lab/dxtb/actions/workflows/windows.yaml/badge.svg" alt="Test Status Windows"/>
  </a> -->
  <a href="https://github.com/grimme-lab/dxtb/actions/workflows/release.yaml">
    <img src="https://github.com/grimme-lab/dxtb/actions/workflows/release.yaml/badge.svg" alt="Build Status"/>
  </a>
  <a href="https://dxtb.readthedocs.io">
    <img src="https://readthedocs.org/projects/dxtb/badge/?version=latest" alt="Documentation Status"/>
  </a>
  <a href="https://results.pre-commit.ci/latest/github/grimme-lab/dxtb/main">
    <img src="https://results.pre-commit.ci/badge/github/grimme-lab/dxtb/main.svg" alt="pre-commit.ci Status"/>
  </a>
  <a href="https://codecov.io/gh/grimme-lab/dxtb">
    <img src="https://codecov.io/gh/grimme-lab/dxtb/branch/main/graph/badge.svg?token=O18EZ1CNE3" alt="Coverage"/>
  </a>
  <!---->
  <br>
  <!---->
  <a href="https://img.shields.io/badge/Python-3.8%20|%203.9%20|%203.10%20|%203.11-blue.svg">
    <img src="https://img.shields.io/badge/Python-3.8%20|%203.9%20|%203.10%20|%203.11-blue.svg" alt="Python Versions"/>
  </a>
  <a href="https://img.shields.io/badge/PyTorch-%3E=1.11.0-blue.svg">
    <img src="https://img.shields.io/badge/PyTorch-%3E=1.11.0-blue.svg" alt="PyTorch Versions"/>
  </a>
</p>

<br>

The xTB methods (GFNn-xTB) are a series of semi-empirical quantum chemical methods that provide a good balance between accuracy and computational cost.

With *dxtb*, we provide a re-implementation of the xTB methods in PyTorch, which allows for automatic differentiation and seamless integration into machine learning frameworks.
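The "fully differentiable" part means the energy is an ordinary PyTorch expression of the nuclear positions, so derivatives such as forces come from autograd rather than hand-coded gradient routines. A minimal sketch of that pattern in plain PyTorch, using a toy harmonic pair potential as a stand-in for the actual (much more involved) xTB energy:

```python
import torch

# Toy stand-in for an xTB energy: a harmonic pair potential
# E = 0.5 * k * (r - r0)^2. The real dxtb energy is far more complex,
# but the autograd pattern below is identical.
def energy(positions: torch.Tensor, k: float = 1.0, r0: float = 1.5) -> torch.Tensor:
    r = torch.linalg.norm(positions[1] - positions[0])
    return 0.5 * k * (r - r0) ** 2

# Two atoms on the z-axis, stretched past the equilibrium distance r0.
pos = torch.tensor(
    [[0.0, 0.0, 0.0], [0.0, 0.0, 2.0]],
    dtype=torch.double,
    requires_grad=True,
)

e = energy(pos)
(grad,) = torch.autograd.grad(e, pos)
forces = -grad  # forces are the negative gradient of the energy w.r.t. positions
```

Because the bond is stretched (r = 2.0 > r0 = 1.5), the resulting forces pull the two atoms back toward each other. dxtb applies this same mechanism to the full xTB energy expression.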


## Installation

### pip <a href="https://pypi.org/project/dxtb/"><img src="https://img.shields.io/pypi/v/dxtb" alt="PyPI Version"></a>

*dxtb* can easily be installed with ``pip``.

```sh
pip install dxtb
```

### conda <a href="https://anaconda.org/conda-forge/dxtb"><img src="https://img.shields.io/conda/vn/conda-forge/dxtb.svg" alt="Conda Version"></a>


*dxtb* is also available via [conda](https://conda.io/) from the conda-forge channel.

```sh
conda install -c conda-forge dxtb
```

### Other

For more options, see the [installation guide](https://dxtb.readthedocs.io/en/latest/installation.html) in the documentation.


## Example

The following example demonstrates how to compute the energy and forces using GFN1-xTB.

```python
import torch
import dxtb

dd = {"dtype": torch.double, "device": torch.device("cpu")}

# LiH
numbers = torch.tensor([3, 1], device=dd["device"])
positions = torch.tensor([[0.0, 0.0, 0.0], [0.0, 0.0, 1.5]], **dd)

# instantiate a calculator
calc = dxtb.calculators.GFN1Calculator(numbers, **dd)

# compute the energy
pos = positions.clone().requires_grad_(True)
energy = calc.get_energy(pos)

# obtain gradient (dE/dR) via autograd
(g,) = torch.autograd.grad(energy, pos)

# Alternatively, forces can be requested directly from the calculator.
# (The calculator caches results, so reset it manually before re-running
# with identical inputs.)
calc.reset()
pos = positions.clone().requires_grad_(True)
forces = calc.get_forces(pos)

assert torch.equal(forces, -g)
```
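Higher-order derivatives follow the same pattern: since the whole energy expression is differentiable, quantities such as the Hessian (needed, e.g., for vibrational frequencies) can be obtained by differentiating twice. A minimal sketch using a toy harmonic pair potential in place of the actual xTB energy (the `energy` function here is illustrative, not part of the dxtb API):

```python
import torch

# Toy harmonic pair potential over a flattened (n_atoms * 3,) coordinate
# vector, standing in for the actual xTB energy.
def energy(flat_pos: torch.Tensor) -> torch.Tensor:
    pos = flat_pos.reshape(2, 3)
    r = torch.linalg.norm(pos[1] - pos[0])
    return 0.5 * (r - 1.5) ** 2

flat = torch.tensor([0.0, 0.0, 0.0, 0.0, 0.0, 2.0], dtype=torch.double)

# Second derivatives via double differentiation: a (6, 6) symmetric matrix.
hess = torch.autograd.functional.hessian(energy, flat)
```

The same double-differentiation machinery is what makes end-to-end training through a quantum chemical energy possible in the first place.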

For more examples and details, check out [the documentation](https://dxtb.readthedocs.io).


## Citation

If you use *dxtb* in your research, please cite the following paper:

- M. Friede, C. Hölzer, S. Ehlert, S. Grimme, *dxtb -- An Efficient and Fully Differentiable Framework for Extended Tight-Binding*, *J. Chem. Phys.*, **2024**

The Supporting Information can be found [here](https://github.com/grimme-lab/dxtb-data).


For details on the xTB methods, see

- C. Bannwarth, E. Caldeweyher, S. Ehlert, A. Hansen, P. Pracht, J. Seibert, S. Spicher, S. Grimme,
  *WIREs Comput. Mol. Sci.*, **2020**, 11, e01493.
  ([DOI](https://doi.org/10.1002/wcms.1493))
- C. Bannwarth, S. Ehlert, S. Grimme,
  *J. Chem. Theory Comput.*, **2019**, 15, 1652-1671.
  ([DOI](https://dx.doi.org/10.1021/acs.jctc.8b01176))
- S. Grimme, C. Bannwarth, P. Shushkov,
  *J. Chem. Theory Comput.*, **2017**, 13, 1989-2009.
  ([DOI](https://dx.doi.org/10.1021/acs.jctc.7b00118))


## Contributing

This is a volunteer open-source project, and contributions are always welcome.
Please take a moment to read the [contributing guidelines](CONTRIBUTING.md).

## License

This project is licensed under the Apache License, Version 2.0 (the "License"); you may not use this project's files except in compliance with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

            
