| Field | Value |
| --- | --- |
| Name | nrtk |
| Version | 0.24.0 |
| Summary | Natural Robustness Toolkit (NRTK) is a platform for generating validated, sensor-specific perturbations and transformations used to evaluate the robustness of computer vision models. |
| Author | Kitware, Inc. |
| License | Apache-2.0 |
| Requires Python | <4.0,>=3.9 |
| Upload time | 2025-07-24 17:31:00 |

<hr/>
<!-- :auto badges: -->
[](https://pypi.org/project/nrtk/)

[](https://nrtk.readthedocs.io/en/latest/?badge=latest)
<!-- :auto badges: -->
# Natural Robustness Toolkit (NRTK)
> The Natural Robustness Toolkit (NRTK) is an open source toolkit for generating
> operationally realistic perturbations to evaluate the natural robustness of
> computer vision algorithms.
The `nrtk` package evaluates the natural robustness of computer vision
algorithms to various perturbations, including sensor-specific changes to camera
focal length, aperture diameter, etc.
We have also created the `nrtk.interop.maite` module to support AI T&E use
cases and workflows through interoperability with
[MAITE](https://github.com/mit-ll-ai-technology/maite) and integration with
other [JATIC](https://cdao.pages.jatic.net/public/) tools. Users seeking to use
NRTK to perturb MAITE-wrapped datasets or evaluate MAITE-wrapped models should
utilize this module. Explore our
[T&E guides](https://nrtk.readthedocs.io/en/latest/testing_and_evaluation_notebooks.html)
which demonstrate how `nrtk` perturbations and `maite` can be applied to assess
operational risks.
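As a rough illustration of the workflow, perturbing a dataset amounts to mapping a perturber over its image/label pairs. The sketch below is schematic only: it uses plain Python and numpy, and the helper name `apply_perturbation` is hypothetical, not part of the `nrtk.interop.maite` API.

```python
from typing import Callable, Sequence

import numpy as np


def apply_perturbation(
    dataset: Sequence[tuple[np.ndarray, int]],
    perturb: Callable[[np.ndarray], np.ndarray],
) -> list[tuple[np.ndarray, int]]:
    """Return a new dataset with the perturbation applied to every image."""
    return [(perturb(image), label) for image, label in dataset]


# Toy usage: a "perturbation" that just brightens each image
toy_dataset = [(np.zeros((4, 4), dtype=np.uint8), 0)]
perturbed = apply_perturbation(toy_dataset, lambda im: im + 10)
```

The interop module plays this role for MAITE-wrapped datasets and models, so perturbed data stays compatible with downstream JATIC evaluation tooling.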
## Why NRTK?
NRTK addresses the critical gap in evaluating computer vision model resilience
to real-world operational conditions beyond what traditional image augmentation
libraries cover. T&E engineers need precise methods to assess how models respond
to sensor-specific variables (focal length, aperture diameter, pixel pitch) and
environmental factors without the prohibitive costs of exhaustive data
collection. NRTK leverages pyBSM's physics-based models to rigorously simulate
how imaging sensors capture and process light, enabling systematic robustness
testing across parameter sweeps, identification of performance boundaries, and
visualization of model degradation. This capability is particularly valuable for
satellite and aerial imaging applications, where engineers can simulate
hypothetical sensor configurations to support cost-performance trade-off
analysis during system design—ensuring AI models maintain reliability when
deployed on actual hardware facing natural perturbations in the field.
## Target Audience
This toolkit is intended to help data scientists, developers, and T&E engineers
who want to rigorously evaluate and enhance the robustness of their computer
vision models. For users of the JATIC product suite, this toolkit is used to
assess model robustness against natural perturbations.
<!-- :auto installation: -->
## Installation
`nrtk` installation has been tested on Unix and Linux systems.
To install the current version via `pip`:
```bash
pip install nrtk
```
To install the current version via `conda-forge`:
```bash
conda install -c conda-forge nrtk
```
This installs core functionality, but many specific perturbers require
additional dependencies.
### Installation with Optional Features (Extras)
NRTK uses optional "extras" to avoid installing unnecessary dependencies. You
can install extras with square brackets:
```bash
# Install with extras (note: no spaces after commas)
pip install nrtk[<extra1>,<extra2>]
```
#### Common Installation Patterns
```bash
# For basic OpenCV image perturbations
pip install nrtk[graphics]
# For basic Pillow image perturbations
pip install nrtk[Pillow]
# For pybsm's sensor-based perturbations
pip install nrtk[pybsm,graphics]
```
**Note**: Choose either `graphics` or `headless` for OpenCV, not both.
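After installing, you can check which optional backends actually resolved in your environment. This is a generic standard-library probe, not an NRTK utility; the module names listed are the ones the extras above are expected to pull in.

```python
import importlib.util


def backend_status(modules=("cv2", "PIL", "pybsm")):
    """Report which optional-backend modules are importable.

    cv2 comes from the graphics/headless extras, PIL from Pillow,
    and pybsm from the pybsm extra; a False value means that extra
    (or its dependency) is not installed.
    """
    return {name: importlib.util.find_spec(name) is not None for name in modules}


print(backend_status())
```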
More information on extras and related perturbers, including a complete list of
extras, can be found
[here](https://nrtk.readthedocs.io/en/latest/installation.html#extras).
Details on the perturbers and their dependencies can be found
[here](https://nrtk.readthedocs.io/en/latest/implementations.html).
For more detailed installation instructions, visit the
[installation documentation](https://nrtk.readthedocs.io/en/latest/installation.html).
<!-- :auto installation: -->
<!-- :auto getting-started: -->
## Getting Started
Explore usage examples of the `nrtk` package in various contexts using the
Jupyter notebooks provided in the `./docs/examples/` directory.
<!-- :auto getting-started: -->
## Example: A First Look at NRTK Perturbations
Via the pyBSM package, NRTK exposes a large set of Optical Transfer Functions
(OTFs). These OTFs can simulate different environmental and sensor-based
effects. For example, the `JitterOTFPerturber`
simulates different levels of sensor jitter. By modifying its input parameters,
you can observe how sensor jitter affects image quality.
#### Input Image
Below is an example of an input image that will undergo a Jitter OTF
perturbation. This image represents the initial state before any transformation.

#### Code Sample
Below is some example code that applies a Jitter OTF transformation:
```python
import numpy as np
from PIL import Image

from nrtk.impls.perturb_image.pybsm.jitter_otf_perturber import JitterOTFPerturber

# Load the sample input image as a numpy array
INPUT_IMG_FILE = "docs/images/input.jpg"
image = np.array(Image.open(INPUT_IMG_FILE))

# sx/sy are the RMS jitter amplitudes (radians) in the x and y directions
otf = JitterOTFPerturber(sx=8e-6, sy=8e-6, name="test_name")
out_image = otf.perturb(image)
```
This code uses default values and provides a sample input image. However, you
can adjust the parameters and use your own image to visualize the perturbation.
The `sx` and `sy` parameters (the root-mean-squared jitter amplitudes in
radians, in the x and y directions) are the primary way to customize a jitter
perturber. Larger jitter amplitudes generate a larger Gaussian blur kernel.
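As a rough intuition for that relationship (this is not NRTK's actual implementation), a Gaussian kernel widens and flattens as its sigma grows, so a larger jitter amplitude smears each pixel's energy over more of its neighbors:

```python
import numpy as np


def gaussian_kernel_1d(sigma: float, radius: int = 10) -> np.ndarray:
    """Normalized 1-D Gaussian kernel; its spread grows with sigma."""
    x = np.arange(-radius, radius + 1, dtype=float)
    kernel = np.exp(-0.5 * (x / sigma) ** 2)
    return kernel / kernel.sum()


small_jitter = gaussian_kernel_1d(1.0)
large_jitter = gaussian_kernel_1d(3.0)
# Both kernels sum to 1, but the larger sigma spreads that weight over
# more pixels (a lower, wider peak), which is what produces a stronger blur.
```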
#### Resulting Image
The output image below shows the effects of the Jitter OTF on the original
input. This result illustrates the Gaussian blur introduced due to simulated
sensor jitter.

<!-- :auto documentation: -->
## Documentation
Documentation for both release snapshots and the latest main branch is available
on [ReadTheDocs](https://nrtk.readthedocs.io).
To build the Sphinx-based documentation locally for the latest reference:
```bash
# Install dependencies
poetry install --sync --with main,linting,tests,docs
# Navigate to the documentation root
cd docs
# Build the documentation
poetry run make html
# Open the generated documentation in your browser
firefox _build/html/index.html
```
<!-- :auto documentation: -->
<!-- :auto contributing: -->
## Contributing
Contributions are encouraged!
The following points help ensure contributions follow development practices.
- Follow the
[JATIC Design Principles](https://cdao.pages.jatic.net/public/program/design-principles/).
- Adopt the Git Flow branching strategy.
- See the
[release process documentation](https://nrtk.readthedocs.io/en/latest/release_process.html)
for detailed release information.
- Additional contribution guidelines and issue reporting steps can be found in
[CONTRIBUTING.md](./CONTRIBUTING.md).
<!-- :auto contributing: -->
<!-- :auto developer-tools: -->
### Developer Tools
Ensure the source tree is acquired locally before proceeding.
#### Poetry Install
You can install using [Poetry](https://python-poetry.org/):
> [!IMPORTANT]
> NRTK currently requires `poetry<2.0`.

> [!WARNING]
> Users unfamiliar with Poetry should use caution. See the
> [installation documentation](https://nrtk.readthedocs.io/en/latest/installation.html#from-source)
> for more information.
```bash
poetry install --with main,linting,tests,docs --extras "<extra1> <extra2> ..."
```
#### Pre-commit Hooks
Pre-commit hooks ensure that code complies with required linting and formatting
guidelines. These hooks run automatically before commits but can also be
executed manually. To bypass checks during a commit, use the `--no-verify` flag.
To install and use pre-commit hooks:
```bash
# Install required dependencies
poetry install --sync --with main,linting,tests,docs
# Initialize pre-commit hooks for the repository
poetry run pre-commit install
# Run pre-commit checks on all files
poetry run pre-commit run --all-files
```
<!-- :auto developer-tools: -->
## NRTK Demonstration Tool
This [associated project](https://github.com/Kitware/nrtk-explorer) provides a
local web application that provides a demonstration of visual saliency
generation in a user interface. This provides an example of how image
perturbation, as generated by this package, can be utilized in a user interface
to facilitate dataset exploration. This tool uses the
[trame framework](https://kitware.github.io/trame/).

<!-- :auto license: -->
## License
[Apache 2.0](./LICENSE)
<!-- :auto license: -->
<!-- :auto contacts: -->
## Contacts
**Principal Investigator / Product Owner**: Brian Hu (Kitware) @brian.hu
**Scrum Master / Maintainer**: Brandon RichardWebster (Kitware)
@b.richardwebster
**Deputy Scrum Master / Maintainer**: Emily Veenhuis (Kitware) @emily.veenhuis
**Project Manager**: Keith Fieldhouse (Kitware) @keith.fieldhouse
**Program Representative**: Austin Whitesell (MITRE) @awhitesell
<!-- :auto contacts: -->
<!-- :auto acknowledgment: -->
## Acknowledgment
This material is based upon work supported by the Chief Digital and Artificial
Intelligence Office under Contract No. 519TC-23-9-2032. The views and
conclusions contained herein are those of the author(s) and should not be
interpreted as necessarily representing the official policies or endorsements,
either expressed or implied, of the U.S. Government.
<!-- :auto acknowledgment: -->