tensornetwork-ng

- Name: tensornetwork-ng
- Version: 0.5.1
- Home page: https://github.com/refraction-ray/TensorNetwork-NG
- Summary: A high-level tensor network API for accelerated tensor network calculations.
- Author: The TensorNetwork Developers
- Requires Python: >=3.7.0
- License: Apache 2.0
- Requirements: numpy (>=1.17), graphviz (>=0.11.1), opt_einsum (>=2.3.0), h5py (>=2.9.0), scipy (>=1.1)
- Upload time: 2024-12-02 02:32:58

            <img src="https://user-images.githubusercontent.com/8702042/67589472-5a1d0e80-f70d-11e9-8812-64647814ae96.png" width="60%" height="60%">

To make the upstream of [TensorCircuit-NG](https://github.com/tensorcircuit/tensorcircuit-ng) more robust and able to survive breaking API changes in its dependencies, from Python itself to JAX, this fork of TensorNetwork will continue to be maintained (it is still at a very early stage of development).

A tensor network wrapper for TensorFlow, JAX, PyTorch, and NumPy.

For an overview of tensor networks, please see the following:

- [Matrices as Tensor Network Diagrams](https://www.math3ma.com/blog/matrices-as-tensor-network-diagrams)

- [Crash Course in Tensor Networks (video)](https://www.youtube.com/watch?v=YN2YBB0viKo)

- [Hand-waving and interpretive dance: an introductory course on tensor networks](https://iopscience.iop.org/article/10.1088/1751-8121/aa6dc3)

- [Tensor Networks in a Nutshell](https://arxiv.org/abs/1708.00006)

- [A Practical Introduction to Tensor Networks](https://arxiv.org/abs/1306.2164)

More information can be found in our TensorNetwork papers:

- [TensorNetwork: A Library for Physics and Machine Learning](https://arxiv.org/abs/1905.01330)

- [TensorNetwork on TensorFlow: A Spin Chain Application Using Tree Tensor Networks](https://arxiv.org/abs/1905.01331)

- [TensorNetwork on TensorFlow: Entanglement Renormalization for quantum critical lattice models](https://arxiv.org/abs/1906.12030)

- [TensorNetwork for Machine Learning](https://arxiv.org/abs/1906.06329)

## Installation

```
pip3 install tensornetwork-ng
```

## Documentation

For details about the TensorNetwork API, see the [reference documentation](https://tensornetwork.readthedocs.io).

## Tutorials

[Basic API tutorial](https://colab.research.google.com/drive/1Fp9DolkPT-P_Dkg_s9PLbTOKSq64EVSu)

[Tensor Networks inside Neural Networks using Keras](https://colab.research.google.com/github/google/TensorNetwork/blob/master/colabs/Tensor_Networks_in_Neural_Networks.ipynb)

## Basic Example

Here, we build a simple two-node contraction.

```python
import numpy as np
import tensornetwork as tn

# Create the nodes
a = tn.Node(np.ones((10,)))
b = tn.Node(np.ones((10,)))
edge = a[0] ^ b[0] # Equivalent to tn.connect(a[0], b[0])
final_node = tn.contract(edge)
print(final_node.tensor) # Should print 10.0
```

## Optimized Contractions.

It is usually more computationally efficient to flatten parallel edges into a single edge before contracting, since this avoids creating trace edges.
`contract_between` and `contract_parallel` do this automatically for your convenience.

```python
# Contract all of the edges between a and b
# and create a new node `c`.
c = tn.contract_between(a, b)
# This is the same as above, but much shorter.
c = a @ b

# Contract all of the edges that are parallel to `edge`
# (parallel means connected to the same nodes).
c = tn.contract_parallel(edge)
```
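
As a hedged, self-contained sketch of why flattening helps, the example below contracts two nodes that share two parallel edges; `contract_between` (or the `@` operator) merges them before contracting. The tensor shapes are illustrative only.

```python
import numpy as np
import tensornetwork as tn

# Two nodes sharing two parallel edges.
a = tn.Node(np.ones((3, 4)))
b = tn.Node(np.ones((3, 4)))
a[0] ^ b[0]
a[1] ^ b[1]
# contract_between (and the @ operator) flattens both parallel edges into one
# before contracting, avoiding an intermediate trace edge.
c = a @ b
print(c.tensor)  # Should print 12.0
```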

## Split Node

You can split a node by doing a singular value decomposition.

```python
# This will return two nodes and a tensor of the truncation error.
# The two nodes are the unitary matrices multiplied by the square root of the
# singular values.
# The `left_edges` are the edges that will end up on the `u_s` node, and `right_edges`
# will be on the `vh_s` node.
u_s, vh_s, trun_error = tn.split_node(node, left_edges, right_edges)
# If you want the singular values in their own node, you can use `split_node_full_svd`.
u, s, vh, trun_error = tn.split_node_full_svd(node, left_edges, right_edges)
```
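
The snippet above assumes `node`, `left_edges`, and `right_edges` already exist. Below is a minimal, self-contained sketch of the same call; the tensor and the choice of edge groups are illustrative only.

```python
import numpy as np
import tensornetwork as tn

# A rank-3 node split into two nodes along left/right edge groups.
node = tn.Node(np.random.rand(4, 5, 6))
left_edges = [node[0], node[1]]   # end up on the u_s node
right_edges = [node[2]]           # end up on the vh_s node
u_s, vh_s, trun_error = tn.split_node(node, left_edges, right_edges)
print(u_s.tensor.shape, vh_s.tensor.shape)  # e.g. (4, 5, 6) and (6, 6)
```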

## Node and Edge names.

You can optionally name your nodes/edges. This can be useful for debugging,
as all error messages will print the name of the broken edge/node.

```python
node = tn.Node(np.eye(2), name="Identity Matrix")
print("Name of node: {}".format(node.name))
edge = tn.connect(node[0], node[1], name="Trace Edge")
print("Name of the edge: {}".format(edge.name))
# Passing a name to a contraction will give that name to the new node created.
final_result = tn.contract(edge, name="Trace Of Identity")
print("Name of new node after contraction: {}".format(final_result.name))
```

## Named axes.

To make it easier to remember what each axis represents, you can optionally name a node's axes.

```python
a = tn.Node(np.zeros((2, 2)), axis_names=["alpha", "beta"])
edge = a["beta"] ^ a["alpha"]
```

## Edge reordering.

To ensure your result's axes are in the correct order, you can reorder a node's edges at any time during the computation.

```python
a = tn.Node(np.zeros((1, 2, 3)))
e1 = a[0]
e2 = a[1]
e3 = a[2]
a.reorder_edges([e3, e1, e2])
# If you already know the axis values, you can equivalently do
# a.reorder_axes([2, 0, 1])
print(a.tensor.shape) # Should print (3, 1, 2)
```

## NCON interface.

For a more compact specification of a tensor network and its contraction, there is `ncon()`. Positive index labels are contracted against matching positive labels, while negative labels become the open indices of the result. For example:

```python
from tensornetwork import ncon
a = np.ones((2, 2))
b = np.ones((2, 2))
c = ncon([a, b], [(-1, 1), (1, -2)])
print(c)
```
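
As a further hedged sketch using only the features shown above, a chain of three matrix products can be written by threading shared positive labels between neighbouring tensors:

```python
import numpy as np
from tensornetwork import ncon

# (a @ b @ c): labels 1 and 2 are contracted; -1 and -2 are the open indices.
a = np.ones((2, 2))
b = np.ones((2, 2))
c = np.ones((2, 2))
result = ncon([a, b, c], [(-1, 1), (1, 2), (2, -2)])
print(result)  # 2x2 matrix with every entry equal to 4.0
```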

## Different backend support.

Currently, we support JAX, TensorFlow, PyTorch, and NumPy as TensorNetwork backends.
We also support tensors with Abelian symmetries via a `symmetric` backend; see the [reference
documentation](https://tensornetwork.readthedocs.io/en/latest/block_sparse_tutorial.html) for more details.

To change the default global backend, you can do:

```python
tn.set_default_backend("jax") # tensorflow, pytorch, numpy, symmetric
```

Or, if you only want to change the backend for a single `Node`, you can do:

```python
tn.Node(tensor, backend="jax")
```
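
Putting this together, here is a hedged end-to-end sketch that repeats the basic example with an explicit per-node backend; swap "numpy" for "jax", "tensorflow", or "pytorch" if that backend is installed:

```python
import numpy as np
import tensornetwork as tn

# Same contraction as the basic example, with the backend made explicit.
a = tn.Node(np.ones((10,)), backend="numpy")
b = tn.Node(np.ones((10,)), backend="numpy")
edge = a[0] ^ b[0]
print(tn.contract(edge).tensor)  # Should print 10.0
```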

If you want to run your contractions on a GPU, we highly recommend using JAX, as it has the closest API to NumPy.

## Disclaimer

This library is in _alpha_ and will go through many breaking changes. While releases will be stable enough for research, we do not recommend using it in any production environment yet.

TensorNetwork is not an official Google product. Copyright 2019 The TensorNetwork Developers.

## Citation

If you are using TensorNetwork for your research, please cite this work using the following BibTeX entry:

```
@misc{roberts2019tensornetwork,
      title={TensorNetwork: A Library for Physics and Machine Learning},
      author={Chase Roberts and Ashley Milsted and Martin Ganahl and Adam Zalcman and Bruce Fontaine and Yijian Zou and Jack Hidary and Guifre Vidal and Stefan Leichenauer},
      year={2019},
      eprint={1905.01330},
      archivePrefix={arXiv},
      primaryClass={physics.comp-ph}
}
```

            
