torch-treecrf


Name: torch-treecrf
Version: 0.2.0
Home page: https://github.com/althonos/torch-treecrf
Summary: A PyTorch implementation of Tree-structured Conditional Random Fields.
Upload time: 2024-02-16 12:37:02
Author: Martin Larralde
Requires Python: >=3.6
License: MIT
Keywords: torch
            # 🌲 `torch-treecrf`

*A [PyTorch](https://pytorch.org/) implementation of Tree-structured Conditional Random Fields.*

[![Actions](https://img.shields.io/github/actions/workflow/status/althonos/torch-treecrf/test.yml?branch=main&logo=github&style=flat-square&maxAge=300)](https://github.com/althonos/torch-treecrf/actions)
[![Coverage](https://img.shields.io/codecov/c/gh/althonos/torch-treecrf?style=flat-square&maxAge=3600)](https://codecov.io/gh/althonos/torch-treecrf/)
[![License](https://img.shields.io/badge/license-MIT-blue.svg?style=flat-square&maxAge=2678400)](https://choosealicense.com/licenses/mit/)
[![PyPI](https://img.shields.io/pypi/v/torch-treecrf.svg?style=flat-square&maxAge=3600)](https://pypi.org/project/torch-treecrf)
[![Wheel](https://img.shields.io/pypi/wheel/torch-treecrf.svg?style=flat-square&maxAge=3600)](https://pypi.org/project/torch-treecrf/#files)
[![Python Versions](https://img.shields.io/pypi/pyversions/torch-treecrf.svg?style=flat-square&maxAge=3600)](https://pypi.org/project/torch-treecrf/#files)
[![Python Implementations](https://img.shields.io/badge/impl-universal-success.svg?style=flat-square&maxAge=3600&label=impl)](https://pypi.org/project/torch-treecrf/#files)
[![Source](https://img.shields.io/badge/source-GitHub-303030.svg?maxAge=2678400&style=flat-square)](https://github.com/althonos/torch-treecrf/)
[![GitHub issues](https://img.shields.io/github/issues/althonos/torch-treecrf.svg?style=flat-square&maxAge=600)](https://github.com/althonos/torch-treecrf/issues)
[![Changelog](https://img.shields.io/badge/keep%20a-changelog-8A0707.svg?maxAge=2678400&style=flat-square)](https://github.com/althonos/torch-treecrf/blob/master/CHANGELOG.md)
[![Downloads](https://img.shields.io/badge/dynamic/json?style=flat-square&color=303f9f&maxAge=86400&label=downloads&query=%24.total_downloads&url=https%3A%2F%2Fapi.pepy.tech%2Fapi%2Fprojects%2Ftorch-treecrf)](https://pepy.tech/project/torch-treecrf)

## 🗺️ Overview

[Conditional Random Fields](https://en.wikipedia.org/wiki/Conditional_random_field)
(CRFs) are a family of discriminative graphical models that can be used
to model dependencies between variables. The most common
form of CRF is the linear-chain CRF, where each prediction depends on
its observed variable as well as on the predictions immediately before and
after it (its *context*). Linear-chain CRFs are widely used in Natural Language Processing.

<p align="center">
  <img height="150" src="https://github.com/althonos/torch-treecrf/raw/main/static/linear-chain-crf.svg?raw=true">
</p>

$$
P(Y | X) = \frac{1}{Z(X)} \prod_{i=1}^n{ \Psi_i(y_i, x_i) } \prod_{i=2}^n{ \Psi_{i-1,i}(y_{i-1}, y_i)}
$$

In 2006, Tang *et al.*[[1]](#ref1) introduced Tree-structured CRFs to model hierarchical
relationships between predicted variables, allowing dependencies between
a prediction variable and its parents and children.

<p align="center">
  <img height="280" src="https://github.com/althonos/torch-treecrf/raw/main/static/tree-structured-crf.svg?raw=true">
</p>

$$
P(Y | X) = \frac{1}{Z(X)} \prod_{i=1}^{n}{ \left( \Psi_i(y_i, x_i) \prod_{j \in \mathcal{N}(i)}{ \Psi_{j,i}(y_j, y_i)} \right) }
$$

This package implements a generic Tree-structured CRF layer in PyTorch. The
layer can be stacked on top of a [linear layer](https://pytorch.org/docs/stable/generated/torch.nn.Linear.html) to implement a proper Tree-structured CRF, or on top of any other model
producing emission scores in log-space for every class of each label. Computation
of marginals is implemented using Belief Propagation[[2]](#ref2), allowing for
exact inference on trees[[3]](#ref3):

$$
P(y_i | X)
  = \frac{1}{Z(X)} \, \Psi_i(y_i, x_i)
    \underbrace{\prod_{j \in \mathcal{C}(i)}{\mu_{j \to i}(y_i)}}_{\alpha_i(y_i)}
    \underbrace{\prod_{j \in \mathcal{P}(i)}{\mu_{j \to i}(y_i)}}_{\beta_i(y_i)}
$$

where for every node $i$, the messages from the parents $\mathcal{P}(i)$ and
the children $\mathcal{C}(i)$ are computed recursively with the sum-product algorithm[[4]](#ref4):

$$
\begin{aligned}
\forall j \in \mathcal{C}(i), \mu_{j \to i}(y_i) = \sum_{y_j}{
  \Psi_{i,j}(y_i, y_j)
  \Psi_j(y_j, x_j)
  \prod_{k \in \mathcal{C}(j)}{\mu_{k \to j}(y_j)}
} \\
\forall j \in \mathcal{P}(i), \mu_{j \to i}(y_i) = \sum_{y_j}{
  \Psi_{i,j}(y_i, y_j)
  \Psi_j(y_j, x_j)
  \prod_{k \in \mathcal{P}(j)}{\mu_{k \to j}(y_j)}
} \\
\end{aligned}
$$


*The implementation should be generic enough that any kind of [Directed acyclic graph](https://en.wikipedia.org/wiki/Directed_acyclic_graph) can be used as a label hierarchy,
not just trees.*
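
To make these quantities concrete, the sketch below computes the marginal of the root variable by brute-force enumeration on a tiny tree (one binary root with two binary children, the same shape as the example further down). The potential values are arbitrary and purely illustrative; `torch-treecrf` obtains the same marginals with belief propagation rather than by enumerating every assignment.

```python
import itertools

import torch

# Toy tree: node 0 is the root, nodes 1 and 2 are its children; all binary.
# Arbitrary positive potentials for one fixed input X (illustrative only).
psi_node = torch.rand(3, 2) + 0.1      # psi_i(y_i, x_i), one row per node
psi_edge = torch.rand(2, 2, 2) + 0.1   # potential tables for edges (0,1) and (0,2)

def unnormalized(y):
    """Product of all node and edge potentials for a full assignment y."""
    score = psi_node[0, y[0]] * psi_node[1, y[1]] * psi_node[2, y[2]]
    return score * psi_edge[0, y[0], y[1]] * psi_edge[1, y[0], y[2]]

assignments = list(itertools.product(range(2), repeat=3))
Z = sum(unnormalized(y) for y in assignments)   # partition function Z(X)
p_root = sum(unnormalized(y) for y in assignments if y[0] == 1) / Z
print(f"P(y_0 = 1 | X) = {float(p_root):.3f}")  # exact marginal by enumeration
```

Enumeration costs $\mathcal{O}(C^L)$ and quickly becomes intractable, whereas belief propagation on a tree only touches each node and edge a constant number of times.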

## 🔧 Installing

Install the `torch-treecrf` package directly from [PyPI](https://pypi.org/project/torch-treecrf),
which hosts universal wheels that can be installed with `pip`:
```console
$ pip install torch-treecrf
```

## 📋 Features

- Encoding of directed graphs in an adjacency matrix, with $\mathcal{O}(1)$ retrieval of children and parents for any node, and $\mathcal{O}(N+E)$ storage.
- Support for any acyclic hierarchy representable as a [Directed Acyclic Graph](https://en.wikipedia.org/wiki/Directed_acyclic_graph) and not just directed trees, allowing prediction of classes such as the [Gene Ontology](https://geneontology.org).
- Multiclass output, provided all the target labels have the same number of classes: $Y \in \left\\{ 0, .., C \right\\}^L$.
- Minibatch support, with vectorized computation of the messages $\alpha_i(y_i)$ and $\beta_i(y_i)$.


## 💡 Example

To create a Tree-structured CRF, you must first define the tree encoding the
relationships between variables. Let's build a simple CRF for a root variable
with two children:

<p align="center">
  <img height="150" src="https://github.com/althonos/torch-treecrf/raw/main/static/example.svg?raw=true">
</p>

First, define an adjacency matrix $M$ representing the hierarchy, such that
$M_{i,j}$ is $1$ if $j$ is a parent of $i$:
```python
import torch
import torch_treecrf

adjacency = torch.tensor([
    [0, 0, 0],
    [1, 0, 0],
    [1, 0, 0]
])
```
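
Since $M_{i,j} = 1$ means that $j$ is a parent of $i$, parents and children can be read directly off this matrix by row and column indexing. The two lines below are only a sanity check on the tensor defined above, not functions of the library:

```python
parents_of_1  = adjacency[1].nonzero().flatten()     # row i lists the parents of i   -> tensor([0])
children_of_0 = adjacency[:, 0].nonzero().flatten()  # column j lists the children of j -> tensor([1, 2])
```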

Then, create a CRF with the right number of input features for your
feature space, as you would for a `torch.nn.Linear` module, to obtain
a complete PyTorch model:
```python
crf = torch_treecrf.TreeCRF(n_features=30, hierarchy=adjacency)
```
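
The resulting model can then be called like any other `torch.nn.Module`. The snippet below is only a hypothetical usage sketch with random inputs; refer to the library documentation for the exact shape and meaning of the returned tensor:

```python
x = torch.randn(8, 30)  # a minibatch of 8 samples with 30 features each
posteriors = crf(x)     # forward pass through the emission layer and the CRF layer
```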

If you wish to use the CRF layer only, use the `TreeCRFLayer` module,
which expects and outputs an emission tensor of shape
$(\star, C, L)$, where $\star$ is the minibatch size, $L$ the number of labels
and $C$ the number of classes per label.
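
As a shape reference only (made-up sizes, no call into the library):

```python
# Hypothetical emission scores from some upstream model, in log-space:
# a minibatch of 8 samples, C = 2 classes per label, L = 3 labels.
emissions = torch.randn(8, 2, 3)
# A TreeCRFLayer expects a (batch, classes, labels) tensor like this one and
# returns a tensor of the same shape; see the API documentation for how the
# layer itself is constructed.
```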


## 💭 Feedback

### ⚠️ Issue Tracker

Found a bug? Have an enhancement request? Head over to the [GitHub issue
tracker](https://github.com/althonos/torch-treecrf/issues) if you need to report
or ask something. If you are filing a bug report, please include as much
information as you can about the issue, and try to recreate it
in a simple, easily reproducible situation.

### 🏗️ Contributing

Contributions are more than welcome! See
[`CONTRIBUTING.md`](https://github.com/althonos/torch-treecrf/blob/main/CONTRIBUTING.md)
for more details.

## ⚖️ License

This library is provided under the [MIT License](https://choosealicense.com/licenses/mit/).

*This library was developed by [Martin Larralde](https://github.com/althonos/)
during his PhD project at the [European Molecular Biology Laboratory](https://www.embl.de/)
in the [Zeller team](https://github.com/zellerlab).*

## 📚 References

- <a id="ref1">[1]</a> Tang, Jie, Mingcai Hong, Juanzi Li, and Bangyong Liang. ‘Tree-Structured Conditional Random Fields for Semantic Annotation’. In The Semantic Web - ISWC 2006, edited by Isabel Cruz, Stefan Decker, Dean Allemang, Chris Preist, Daniel Schwabe, Peter Mika, Mike Uschold, and Lora M. Aroyo, 640–53. Lecture Notes in Computer Science. Berlin, Heidelberg: Springer, 2006. [doi:10.1007/11926078_46](https://doi.org/10.1007/11926078_46).
- <a id="ref2">[2]</a> Pearl, Judea. ‘Reverend Bayes on Inference Engines: A Distributed Hierarchical   Approach’. In Proceedings of the Second AAAI Conference on Artificial Intelligence, 133–136. AAAI’82. Pittsburgh, Pennsylvania: AAAI Press, 1982.
- <a id="ref3">[3]</a> Bach, Francis, and Guillaume Obozinski. ‘Sum Product Algorithm and Hidden Markov Model’, ENS Course Material, 2016. http://imagine.enpc.fr/%7Eobozinsg/teaching/mva_gm/lecture_notes/lecture7.pdf.
- <a id="ref4>">[4]</a> Kschischang, Frank R., Brendan J. Frey, and Hans-Andrea Loeliger. ‘Factor Graphs and the Sum-Product Algorithm’. IEEE Transactions on Information Theory 47, no. 2 (February 2001): 498–519. [doi:10.1109/18.910572](https://doi.org/10.1109/18.910572).



            
