================================================================
hierarchicalsoftmax
================================================================
.. start-badges
|pypi badge| |testing badge| |coverage badge| |docs badge| |black badge|
.. |pypi badge| image:: https://img.shields.io/pypi/v/hierarchicalsoftmax?color=blue
:target: https://pypi.org/project/hierarchicalsoftmax/
.. |testing badge| image:: https://github.com/rbturnbull/hierarchicalsoftmax/actions/workflows/testing.yml/badge.svg
:target: https://github.com/rbturnbull/hierarchicalsoftmax/actions
.. |docs badge| image:: https://github.com/rbturnbull/hierarchicalsoftmax/actions/workflows/docs.yml/badge.svg
:target: https://rbturnbull.github.io/hierarchicalsoftmax
.. |black badge| image:: https://img.shields.io/badge/code%20style-black-000000.svg
:target: https://github.com/psf/black
.. |coverage badge| image:: https://img.shields.io/endpoint?url=https://gist.githubusercontent.com/rbturnbull/f99aea7ea203d16edd063a8dd5ed395f/raw/coverage-badge.json
:target: https://rbturnbull.github.io/hierarchicalsoftmax/coverage/
.. end-badges
A Hierarchical Softmax Framework for PyTorch.
Documentation available here: `https://rbturnbull.github.io/hierarchicalsoftmax/ <https://rbturnbull.github.io/hierarchicalsoftmax/>`_.
.. start-quickstart
Installation
==================================
hierarchicalsoftmax can be installed from PyPI:
.. code-block:: bash
pip install hierarchicalsoftmax
Alternatively, hierarchicalsoftmax can be installed using pip from the git repository:
.. code-block:: bash
pip install git+https://github.com/rbturnbull/hierarchicalsoftmax.git
Usage
==================================
Build up a hierarchy tree for your categories using the `SoftmaxNode` instances:
.. code-block:: python
from hierarchicalsoftmax import SoftmaxNode
root = SoftmaxNode("root")
a = SoftmaxNode("a", parent=root)
aa = SoftmaxNode("aa", parent=a)
ab = SoftmaxNode("ab", parent=a)
b = SoftmaxNode("b", parent=root)
ba = SoftmaxNode("ba", parent=b)
bb = SoftmaxNode("bb", parent=b)
The `SoftmaxNode` class inherits from the `anytree <https://anytree.readthedocs.io/en/latest/index.html>`_ `Node` class
which means that you can use methods from that library to build and interact with your hierarchy tree.
The tree can be rendered as a string with the `render` method:
.. code-block:: python
root.render(print=True)
This results in a text representation of the tree::
root
├── a
│ ├── aa
│ └── ab
└── b
├── ba
└── bb
The tree can also be rendered to a file using `graphviz` if it is installed:
.. code-block:: python
root.render(filepath="tree.svg")
.. image:: https://raw.githubusercontent.com/rbturnbull/hierarchicalsoftmax/main/docs/images/example-tree.svg
:alt: An example tree rendering.
Next, add a final layer to your network with the right number of outputs for the softmax layers.
You can do this manually by setting the number of output features to `root.layer_size`.
Alternatively, you can use the `HierarchicalSoftmaxLinear` or `HierarchicalSoftmaxLazyLinear` classes:
.. code-block:: python
from torch import nn
from hierarchicalsoftmax import HierarchicalSoftmaxLinear
model = nn.Sequential(
nn.Linear(in_features=20, out_features=100),
nn.ReLU(),
HierarchicalSoftmaxLinear(in_features=100, root=root)
)
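For intuition about `root.layer_size`: the width of the hierarchical head is the number of children summed over every internal node. A minimal sketch with plain dicts (an illustration of the idea, not the library's implementation):

```python
# Illustrative sketch: a hierarchical softmax head needs one logit per
# child of every internal node, so its total width is the sum of the
# children counts. For the example tree this matches root.layer_size.
tree = {
    "root": ["a", "b"],
    "a": ["aa", "ab"],
    "b": ["ba", "bb"],
}

def head_width(tree, node="root"):
    """Sum the number of children over every internal node."""
    children = tree.get(node, [])
    return len(children) + sum(head_width(tree, child) for child in children)

print(head_width(tree))  # 2 (root) + 2 (a) + 2 (b) = 6
```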
Once you have the hierarchy tree, you can use the `HierarchicalSoftmaxLoss` module:
.. code-block:: python
from hierarchicalsoftmax import HierarchicalSoftmaxLoss
loss = HierarchicalSoftmaxLoss(root=root)
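Conceptually, the hierarchical loss sums a cross-entropy term at each internal node along the path from the root to the target, each weighted by that node's `alpha`. A sketch of that idea in plain Python (not the module's actual code):

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of floats."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def hierarchical_nll(path):
    """Sum of alpha-weighted negative log-likelihoods along a path.

    `path` is a list of (logits, true_child_index, alpha) tuples, one per
    internal node from the root down to the target's parent.
    """
    return sum(alpha * -math.log(softmax(logits)[index])
               for logits, index, alpha in path)

# Target "aa": pick child 0 at the root, then child 0 under `a`, alpha = 1.
loss = hierarchical_nll([([0.0, 0.0], 0, 1.0), ([0.0, 0.0], 0, 1.0)])
print(round(loss, 4))  # 2 * log(2) ≈ 1.3863
```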
Metric functions are provided to measure accuracy and the F1 score:
.. code-block:: python
from hierarchicalsoftmax import greedy_accuracy, greedy_f1_score
accuracy = greedy_accuracy(predictions, targets, root=root)
f1 = greedy_f1_score(predictions, targets, root=root)
The nodes predicted by the final layer of the model can be inferred using the `greedy_predictions` function, which returns a list of the predicted nodes:
.. code-block:: python
from hierarchicalsoftmax import greedy_predictions
outputs = model(inputs)
inferred_nodes = greedy_predictions(outputs)
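The greedy decode itself is simple: at each internal node, take the child with the highest logit in that node's slice of the output vector, and repeat until a leaf is reached. A sketch with plain lists (the `slices` layout here is hypothetical, not the library's internal indexing):

```python
tree = {"root": ["a", "b"], "a": ["aa", "ab"], "b": ["ba", "bb"]}
# Hypothetical layout: which slice of the flat output each internal node owns.
slices = {"root": (0, 2), "a": (2, 4), "b": (4, 6)}

def greedy_decode(output, node="root"):
    """Walk down the tree, picking the argmax child at each internal node."""
    while node in tree:
        start, end = slices[node]
        logits = output[start:end]
        node = tree[node][logits.index(max(logits))]
    return node

out = [0.1, 0.9, 0.0, 0.0, 0.3, 0.7]  # favours b at the root, then bb
print(greedy_decode(out))  # "bb"
```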
Relative contributions to the loss
==================================
The loss at each node can be weighted relative to the others by setting the `alpha` value of the parent node.
By default, a node's `alpha` value is 1.
In the example below, the loss for the first level of classification (under the `root` node) contributes twice as much to the total loss as the classifications under the `a` or `b` nodes:
.. code-block:: python
from hierarchicalsoftmax import SoftmaxNode
root = SoftmaxNode("root", alpha=2.0)
a = SoftmaxNode("a", parent=root)
aa = SoftmaxNode("aa", parent=a)
ab = SoftmaxNode("ab", parent=a)
b = SoftmaxNode("b", parent=root)
ba = SoftmaxNode("ba", parent=b)
bb = SoftmaxNode("bb", parent=b)
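As a worked example of the weighting (assuming uniform two-way splits and the weighted-sum semantics described above), a target under `a` incurs 2.0 × log 2 at the root level plus 1.0 × log 2 under `a`:

```python
import math

# Root decision weighted by alpha = 2.0; the decision under `a` keeps alpha = 1.0.
root_level = 2.0 * math.log(2)   # loss contribution of the root's softmax
child_level = 1.0 * math.log(2)  # loss contribution of `a`'s softmax
total = root_level + child_level
print(total / math.log(2))  # 3.0, i.e. the root level counts twice
```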
Example Usage
==================================
In the documentation, we provide `an example of how to use this package with the CIFAR-10 and CIFAR-100 datasets <https://rbturnbull.github.io/hierarchicalsoftmax/cifar.html>`_.
TreeDict
==================================
We provide a helper class to create a dictionary whose items point to nodes in the tree.
This is useful for creating a dataloader for a machine learning model.
See the `TreeDict documentation <https://rbturnbull.github.io/hierarchicalsoftmax/cifar.html>`_ for more information.
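The idea can be pictured with an ordinary dict mapping category keys to node objects, which a dataset's `__getitem__` can use to look up targets (a hypothetical sketch; see the linked documentation for the real `TreeDict` API):

```python
class Node:
    """Minimal stand-in for a tree node (not the library's SoftmaxNode)."""
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent

root = Node("root")
a = Node("a", parent=root)
aa = Node("aa", parent=a)

# A dict from dataset labels to their nodes, as a TreeDict-like lookup.
node_for_label = {"a": a, "aa": aa}
print(node_for_label["aa"].parent.name)  # "a"
```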
Label Smoothing
==================================
You can add label smoothing to the loss by setting the `label_smoothing` parameter on any of the nodes.
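For reference, label smoothing replaces the one-hot target with a mixture of the one-hot and a uniform distribution. A sketch of the standard formulation (not the module's code):

```python
import math

def smoothed_cross_entropy(probs, target, smoothing):
    """Cross-entropy against (1 - s) * one_hot + (s / K) * uniform."""
    k = len(probs)
    weights = [((1 - smoothing) if i == target else 0.0) + smoothing / k
               for i in range(k)]
    return sum(-w * math.log(p) for w, p in zip(weights, probs))

# With smoothing = 0 this reduces to ordinary cross-entropy: -log(0.8)
print(round(smoothed_cross_entropy([0.8, 0.2], 0, 0.0), 4))  # 0.2231
```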
Focal Loss
==================================
You can use focal loss instead of basic cross-entropy loss for any node by setting its `gamma` parameter.
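Focal loss scales the cross-entropy term by (1 - p)^gamma, so confidently correct predictions contribute less. A one-line sketch of the standard formulation:

```python
import math

def focal_loss(p_true, gamma):
    """(1 - p)^gamma * -log(p): gamma = 0 recovers plain cross-entropy."""
    return (1 - p_true) ** gamma * -math.log(p_true)

print(round(focal_loss(0.5, 0.0), 4))  # 0.6931, plain -log(0.5)
print(focal_loss(0.9, 2.0) < focal_loss(0.9, 0.0))  # True: easy example down-weighted
```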
.. end-quickstart
Credits
==================================
* Robert Turnbull <robert.turnbull@unimelb.edu.au>