[CI](https://github.com/kaelzhang/handy-nn/actions/workflows/python.yml)
[Codecov](https://codecov.io/gh/kaelzhang/handy-nn)
[PyPI](https://pypi.org/project/handy-nn/)
[GitHub](https://github.com/kaelzhang/handy-nn)
# handy-nn
Delightful and useful neural network models and loss functions, including `OrdinalRegressionLoss` and more.
## Install
```sh
$ pip install handy-nn
```
## Usage
```py
from handy_nn import OrdinalRegressionLoss
# Initialize the loss function
num_classes = 5
criterion = OrdinalRegressionLoss(num_classes)
# For training
logits = model(inputs)  # Shape: (batch_size, 1)
loss = criterion(logits, targets)  # Shape: torch.Size([]) (scalar)
loss.backward()
# To get class probabilities
probas = criterion.predict_probas(logits) # Shape: (batch_size, num_classes)
```
### Shapes
Variable | Shape
-------- | ----
`logits` | `(batch_size, 1)`
`targets` | `(batch_size,)` or one-hot `(batch_size, num_classes)`
`loss` | `torch.Size([])`
`probas` | `(batch_size, num_classes)`
# APIs
## OrdinalRegressionLoss(num_classes, learn_thresholds=True, init_scale=2.0)
- **num_classes** `int`: Number of ordinal classes (ranks)
- **learn_thresholds** `bool=True`: Whether to learn threshold parameters or use fixed ones, defaults to `True`.
- **init_scale** `float=2.0`: Scale for initializing thresholds, defaults to `2.0`
Creates the loss function for ordinal regression.
The goal of [ordinal regression](https://en.wikipedia.org/wiki/Ordinal_regression) is to model the relationship between one or more independent variables and an ordinal dependent variable. It predicts the probability that an observation falls into a specific ordinal category or a category higher than a certain threshold. This is particularly useful in fields like social sciences, medicine, and customer surveys where outcomes are often ordinal.
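The cumulative-link idea behind this kind of loss can be sketched in plain NumPy. This is an illustration of the general technique, not the package's internals, and the threshold values below are made up: a scalar logit is compared against ordered thresholds to produce cumulative probabilities, whose adjacent differences give per-class probabilities.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ordinal_probas(logits, thresholds):
    # logits: (batch, 1); thresholds: (num_classes - 1,), ascending.
    # P(y <= k) = sigmoid(threshold_k - logit), shape (batch, num_classes - 1)
    cum = sigmoid(thresholds - logits)
    # P(y <= last class) is 1 by definition
    cum = np.concatenate([cum, np.ones((len(logits), 1))], axis=1)
    # P(y == k) = P(y <= k) - P(y <= k - 1), via adjacent differences
    return np.diff(cum, prepend=0.0, axis=1)

# Illustrative values: 2 samples, 5 classes (4 thresholds)
logits = np.array([[0.0], [2.5]])
thresholds = np.array([-2.0, -1.0, 1.0, 2.0])
probas = ordinal_probas(logits, thresholds)
assert np.allclose(probas.sum(axis=1), 1.0)
```

Because the thresholds are ordered, the resulting per-class probabilities are non-negative and sum to one for each sample.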
## TrendAwareLoss()
```py
from handy_nn import TrendAwareLoss

criterion = TrendAwareLoss()
loss = criterion(logits, targets)
loss.backward()
```
`TrendAwareLoss` weights each sample's cross-entropy by the remaining length of its label segment, so "too-early / too-late" misclassifications within a segment are penalized more heavily.

This loss function is useful in situations where misclassification incurs an indirect cost, such as financial trading.
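The weighting scheme can be sketched as follows. This is my reading of the description above, not the package's exact implementation: for each position in a label sequence, count the steps remaining in its segment, then scale the per-sample cross-entropy by that count.

```python
import numpy as np

def remaining_length_weights(labels):
    # For each position, the number of steps remaining in its label
    # segment (inclusive), so a wrong prediction early in a segment
    # carries a larger weight than one near the segment's end.
    n = len(labels)
    w = np.empty(n)
    run_end = n
    for i in range(n - 1, -1, -1):
        if i < n - 1 and labels[i] != labels[i + 1]:
            run_end = i + 1
        w[i] = run_end - i
    return w

def trend_aware_loss(logits, targets):
    # Per-sample cross-entropy via log-softmax, scaled by segment weights
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    nll = -log_probs[np.arange(len(targets)), targets]
    return np.mean(remaining_length_weights(targets) * nll)

labels = np.array([0, 0, 0, 1, 1])
print(remaining_length_weights(labels))  # -> [3. 2. 1. 2. 1.]
```

Here a mistake at the start of the three-step `0` segment costs three times as much as one at its final step.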
## License
[MIT](LICENSE)