================================================================
hierarchicalsoftmax
================================================================

.. start-badges

|testing badge| |coverage badge| |docs badge| |black badge|

.. |testing badge| image:: https://github.com/rbturnbull/hierarchicalsoftmax/actions/workflows/testing.yml/badge.svg
    :target: https://github.com/rbturnbull/hierarchicalsoftmax/actions

.. |docs badge| image:: https://github.com/rbturnbull/hierarchicalsoftmax/actions/workflows/docs.yml/badge.svg
    :target: https://rbturnbull.github.io/hierarchicalsoftmax

.. |black badge| image:: https://img.shields.io/badge/code%20style-black-000000.svg
    :target: https://github.com/psf/black

.. |coverage badge| image:: https://img.shields.io/endpoint?url=https://gist.githubusercontent.com/rbturnbull/f99aea7ea203d16edd063a8dd5ed395f/raw/coverage-badge.json
    :target: https://rbturnbull.github.io/hierarchicalsoftmax/coverage/

.. end-badges

A Hierarchical Softmax Framework for PyTorch.

.. start-quickstart

Installation
==================================

hierarchicalsoftmax can be installed from PyPI:

.. code-block:: bash

    pip install hierarchicalsoftmax

Alternatively, hierarchicalsoftmax can be installed using pip from the git repository:

.. code-block:: bash

    pip install git+https://github.com/rbturnbull/hierarchicalsoftmax.git

Usage
==================================

Build up a hierarchy tree for your categories using `SoftmaxNode` instances:

.. code-block:: python

    from hierarchicalsoftmax import SoftmaxNode

    root = SoftmaxNode("root")
    a = SoftmaxNode("a", parent=root)
    aa = SoftmaxNode("aa", parent=a)
    ab = SoftmaxNode("ab", parent=a)
    b = SoftmaxNode("b", parent=root)
    ba = SoftmaxNode("ba", parent=b)
    bb = SoftmaxNode("bb", parent=b)

The `SoftmaxNode` class inherits from the `anytree <https://anytree.readthedocs.io/en/latest/index.html>`_ `Node` class,
which means that you can use methods from that library to build and interact with your hierarchy tree.
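
For example, the standard anytree traversal utilities and node attributes work directly on the `root` tree built above (a minimal sketch using only anytree's public API; nothing here is specific to this package):

.. code-block:: python

    from anytree import PreOrderIter

    # Walk the tree in pre-order and report which nodes are leaves.
    for node in PreOrderIter(root):
        print(node.name, "(leaf)" if node.is_leaf else "(internal)")

    # anytree attributes such as children, leaves and path are also available.
    print([leaf.name for leaf in root.leaves])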

The tree can be rendered as a string with the `render` method:

.. code-block:: python

    root.render(print=True)

This results in a text representation of the tree::

    root
    ├── a
    │   ├── aa
    │   └── ab
    └── b
        ├── ba
        └── bb

The tree can also be rendered to a file using `graphviz` if it is installed:

.. code-block:: python

    root.render(filepath="tree.svg")

.. image:: https://raw.githubusercontent.com/rbturnbull/hierarchicalsoftmax/main/docs/images/example-tree.svg
    :alt: An example tree rendering.

Then add a final layer to your network with the number of outputs required by the hierarchical softmax.
You can do that manually by setting the number of output features to `root.layer_size`.
Alternatively, you can use the `HierarchicalSoftmaxLinear` or `HierarchicalSoftmaxLazyLinear` classes:

.. code-block:: python

    from torch import nn
    from hierarchicalsoftmax import HierarchicalSoftmaxLinear

    model = nn.Sequential(
        nn.Linear(in_features=20, out_features=100),
        nn.ReLU(),
        HierarchicalSoftmaxLinear(in_features=100, root=root)
    )

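For comparison, here is a minimal sketch of the manual approach, using `root.layer_size` to size an ordinary linear layer yourself (the surrounding layer sizes are arbitrary):

.. code-block:: python

    from torch import nn

    # The final layer only needs root.layer_size outputs; the loss and
    # prediction utilities below interpret that flat output hierarchically.
    model = nn.Sequential(
        nn.Linear(in_features=20, out_features=100),
        nn.ReLU(),
        nn.Linear(in_features=100, out_features=root.layer_size),
    )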

Once you have the hierarchy tree, you can use the `HierarchicalSoftmaxLoss` module:

.. code-block:: python

    from hierarchicalsoftmax import HierarchicalSoftmaxLoss

    loss = HierarchicalSoftmaxLoss(root=root)

Metric functions are provided to show accuracy and the F1 score:

.. code-block:: python

    from hierarchicalsoftmax import greedy_accuracy, greedy_f1_score

    accuracy = greedy_accuracy(predictions, targets, root=root)
    f1 = greedy_f1_score(predictions, targets, root=root)

The nodes predicted by the model can be inferred from the final layer outputs using the `greedy_predictions` function, which returns a list of the predicted nodes:

.. code-block:: python

    from hierarchicalsoftmax import greedy_predictions

    outputs = model(inputs)
    inferred_nodes = greedy_predictions(outputs)

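Since `greedy_predictions` returns node objects, a short follow-up sketch (assuming `inferred_nodes` from the block above) is to read off the predicted category names:

.. code-block:: python

    # Each inferred node is a SoftmaxNode, so its anytree attributes are available.
    for node in inferred_nodes:
        print(node.name)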

Relative contributions to the loss
==================================

The loss at each node can be weighted relative to the other nodes by setting the `alpha` value of its parent node.
By default, the `alpha` value of a node is 1.

In the example below, the first level of classification (under the `root` node) contributes twice as much to the loss as the classifications under the `a` or `b` nodes:

.. code-block:: python

    from hierarchicalsoftmax import SoftmaxNode

    root = SoftmaxNode("root", alpha=2.0)
    a = SoftmaxNode("a", parent=root)
    aa = SoftmaxNode("aa", parent=a)
    ab = SoftmaxNode("ab", parent=a)
    b = SoftmaxNode("b", parent=root)
    ba = SoftmaxNode("ba", parent=b)
    bb = SoftmaxNode("bb", parent=b)


Label Smoothing
==================================

You can add label smoothing to the loss by setting the `label_smoothing` parameter on any of the nodes.

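As a minimal sketch, assuming `label_smoothing` is accepted as a keyword argument when constructing a node, in the same way as `alpha` above:

.. code-block:: python

    from hierarchicalsoftmax import SoftmaxNode

    # Assumed usage: smooth the labels for the classification under this node.
    root = SoftmaxNode("root", label_smoothing=0.1)
    a = SoftmaxNode("a", parent=root)
    b = SoftmaxNode("b", parent=root)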

Focal Loss
==================================

You can use focal loss instead of the basic cross-entropy loss for any of the nodes by setting the `gamma` parameter on those nodes.

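Again as a minimal sketch, assuming `gamma` is passed as a keyword argument when constructing a node, in the same way as `alpha` above:

.. code-block:: python

    from hierarchicalsoftmax import SoftmaxNode

    # Assumed usage: apply focal loss (with gamma=2.0) to the classification under root.
    root = SoftmaxNode("root", gamma=2.0)
    a = SoftmaxNode("a", parent=root)
    b = SoftmaxNode("b", parent=root)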

.. end-quickstart

Credits
==================================

* Robert Turnbull <robert.turnbull@unimelb.edu.au>