Metadata-Version: 2.1
Name: nrtk
Version: 0.25.0
Summary: Natural Robustness Toolkit (NRTK) is a platform for generating validated, sensor-specific perturbations and transformations used to evaluate the robustness of computer vision models.
License: Apache-2.0
Author: Kitware, Inc.
Author-email: nrtk@kitware.com
Requires-Python: >=3.10,<3.13
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Operating System :: MacOS :: MacOS X
Classifier: Operating System :: Unix
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Provides-Extra: albumentations
Provides-Extra: diffusion
Provides-Extra: graphics
Provides-Extra: headless
Provides-Extra: maite
Provides-Extra: notebook-testing
Provides-Extra: pillow
Provides-Extra: pybsm
Provides-Extra: scikit-image
Provides-Extra: tools
Provides-Extra: waterdroplet
Requires-Dist: Pillow (>=10.3.0) ; extra == "tools" or extra == "pillow" or extra == "diffusion"
Requires-Dist: accelerate (>=0.20.0) ; extra == "diffusion"
Requires-Dist: albumentations (>=2.0.0) ; extra == "albumentations" or extra == "notebook-testing"
Requires-Dist: datasets (>=3.3.2) ; extra == "notebook-testing"
Requires-Dist: diffusers (>=0.21.0) ; extra == "diffusion"
Requires-Dist: fastapi (>=0.110.0) ; extra == "maite"
Requires-Dist: geopandas (>=0.14,<0.15) ; extra == "waterdroplet"
Requires-Dist: jupytext (>=1.16.7) ; extra == "notebook-testing"
Requires-Dist: kwcoco (>=0.2.18) ; extra == "tools"
Requires-Dist: maite (>=0.8.2,<0.9.0) ; extra == "maite"
Requires-Dist: matplotlib (>=3.7.1) ; extra == "notebook-testing"
Requires-Dist: numba (>=0.56.4) ; extra == "notebook-testing"
Requires-Dist: numpy (>=1.22,<2.0) ; python_version < "3.12"
Requires-Dist: numpy (>=1.26,<2.0) ; python_version >= "3.12" and python_version < "3.13"
Requires-Dist: opencv-python (>=4.6) ; extra == "graphics"
Requires-Dist: opencv-python-headless (>=4.6) ; extra == "headless"
Requires-Dist: pybsm (>=0.12.0) ; extra == "pybsm"
Requires-Dist: pycocotools (>=2.0.6)
Requires-Dist: pydantic (>=2.6.4) ; extra == "maite"
Requires-Dist: pydantic_settings (>=2.2.1) ; extra == "maite"
Requires-Dist: scikit-image (>=0.20) ; (python_version < "3.12") and (extra == "scikit-image")
Requires-Dist: scikit-image (>=0.22) ; (python_version >= "3.12") and (extra == "scikit-image")
Requires-Dist: scipy (>=1.10.0) ; extra == "waterdroplet"
Requires-Dist: setuptools (>=78.1.1)
Requires-Dist: shapely (>=2.0.7) ; extra == "waterdroplet"
Requires-Dist: smqtk-classifier (>=0.19.0)
Requires-Dist: smqtk-core (>=0.19)
Requires-Dist: smqtk-detection (>=0.22.0)
Requires-Dist: smqtk-image-io (>=0.17.1)
Requires-Dist: tabulate (>=0.9.0) ; extra == "notebook-testing"
Requires-Dist: torch (>=2.2.0) ; extra == "diffusion" or extra == "notebook-testing"
Requires-Dist: torchmetrics (>=1.0.0) ; extra == "notebook-testing"
Requires-Dist: torchvision (>=0.21.0) ; extra == "notebook-testing"
Requires-Dist: tqdm (>=4.64)
Requires-Dist: transformers (>=4.52.1) ; extra == "notebook-testing"
Requires-Dist: typing-extensions (>=4.5.0)
Requires-Dist: ultralytics (>=8.3.85) ; extra == "notebook-testing"
Requires-Dist: uvicorn (>=0.29.0) ; extra == "maite"
Requires-Dist: xaitk-jatic (>=0.7.0) ; extra == "notebook-testing"
Project-URL: Documentation, https://nrtk.readthedocs.io/
Description-Content-Type: text/markdown

![nrtk-logo](./docs/figures/nrtk-wordmark.png)

<hr/>

<!-- :auto badges: -->

[![PyPI Version](https://img.shields.io/pypi/v/nrtk)](https://pypi.org/project/nrtk/)
![PyPI - Python Version](https://img.shields.io/pypi/pyversions/nrtk)
[![Documentation Status](https://readthedocs.org/projects/nrtk/badge/?version=latest)](https://nrtk.readthedocs.io/en/latest/?badge=latest)

<!-- :auto badges: -->

# Natural Robustness Toolkit (NRTK)

> The Natural Robustness Toolkit (NRTK) is an open source toolkit for generating
> operationally realistic perturbations to evaluate the natural robustness of
> computer vision algorithms.

The `nrtk` package evaluates the natural robustness of computer vision
algorithms to various perturbations, including sensor-specific changes to camera
focal length, aperture diameter, etc.

We have also created the `nrtk.interop.maite` module to support AI T&E use
cases and workflows through interoperability with
[MAITE](https://github.com/mit-ll-ai-technology/maite) and integration with
other [JATIC](https://cdao.pages.jatic.net/public/) tools. Users who want to
apply NRTK perturbations to MAITE-wrapped datasets or evaluate MAITE-wrapped
models should use this module. Explore our
[T&E guides](https://nrtk.readthedocs.io/en/latest/testing_and_evaluation_notebooks.html),
which demonstrate how `nrtk` perturbations and `maite` can be applied to assess
operational risks.

## Why NRTK?

NRTK addresses the critical gap in evaluating computer vision model resilience
to real-world operational conditions beyond what traditional image augmentation
libraries cover. T&E engineers need precise methods to assess how models respond
to sensor-specific variables (focal length, aperture diameter, pixel pitch) and
environmental factors without the prohibitive costs of exhaustive data
collection. NRTK leverages pyBSM's physics-based models to rigorously simulate
how imaging sensors capture and process light, enabling systematic robustness
testing across parameter sweeps, identification of performance boundaries, and
visualization of model degradation. This capability is particularly valuable for
satellite and aerial imaging applications, where engineers can simulate
hypothetical sensor configurations to support cost-performance trade-off
analysis during system design—ensuring AI models maintain reliability when
deployed on actual hardware facing natural perturbations in the field.

## Target Audience

This toolkit is intended to help data scientists, developers, and T&E engineers
who want to rigorously evaluate and enhance the robustness of their computer
vision models. For users of the JATIC product suite, this toolkit is used to
assess model robustness against natural perturbations.

<!-- :auto installation: -->

## Installation

`nrtk` installation has been tested on Unix-based systems (Linux and macOS).

To install the current version via `pip`:

```bash
pip install nrtk
```

To install the current version via `conda-forge`:

```bash
conda install -c conda-forge nrtk
```

This installs core functionality, but many specific perturbers require
additional dependencies.

### Installation with Optional Features (Extras)

NRTK uses optional "extras" to avoid installing unnecessary dependencies. You
can install extras with square brackets:

```bash
# Install with extras (note: no spaces after commas)
pip install nrtk[<extra1>,<extra2>]
```

#### Common Installation Patterns

```bash
# For basic OpenCV image perturbations
pip install nrtk[graphics]
# For basic Pillow image perturbations
pip install nrtk[pillow]
# For pybsm's sensor-based perturbations
pip install nrtk[pybsm,graphics]
```

**Note**: Choose either `graphics` or `headless` for OpenCV, not both.
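
If you are unsure which OpenCV variant is already present in an environment,
a quick check like the following can help (a minimal sketch using the standard
library's `importlib.metadata`; the helper name `opencv_variants` is our own,
not part of NRTK):

```python
from importlib import metadata


def opencv_variants() -> list[str]:
    """Return the OpenCV distributions installed in this environment."""
    found = []
    for name in ("opencv-python", "opencv-python-headless"):
        try:
            metadata.version(name)  # raises if the distribution is absent
            found.append(name)
        except metadata.PackageNotFoundError:
            pass
    return found


# If this returns both names, the environment mixes graphics and headless
# builds, which can lead to conflicts -- keep only one.
print(opencv_variants())
```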

More information on extras and related perturbers, including a complete list of
extras, can be found
[here](https://nrtk.readthedocs.io/en/latest/installation.html#extras).

Details on the perturbers and their dependencies can be found
[here](https://nrtk.readthedocs.io/en/latest/implementations.html).

For more detailed installation instructions, visit the
[installation documentation](https://nrtk.readthedocs.io/en/latest/installation.html).

<!-- :auto installation: -->

<!-- :auto getting-started: -->

## Getting Started

Explore usage examples of the `nrtk` package in various contexts using the
Jupyter notebooks provided in the `./docs/examples/` directory.

<!-- :auto getting-started: -->

## Example: A First Look at NRTK Perturbations

Via the pyBSM package, NRTK exposes a large set of Optical Transfer Functions
(OTFs). These OTFs can simulate different environmental and sensor-based
effects. For example, the `JitterOTFPerturber` simulates different levels of
sensor jitter. By modifying its input parameters, you can observe how sensor
jitter affects image quality.

#### Input Image

Below is an example of an input image that will undergo a Jitter OTF
perturbation. This image represents the initial state before any transformation.

![input_image](./docs/images/input.jpg)

#### Code Sample

Below is some example code that applies a Jitter OTF transformation:

```python
from nrtk.impls.perturb_image.pybsm.jitter_otf_perturber import JitterOTFPerturber
import numpy as np
from PIL import Image

INPUT_IMG_FILE = 'docs/images/input.jpg'
image = np.array(Image.open(INPUT_IMG_FILE))

otf = JitterOTFPerturber(sx=8e-6, sy=8e-6, name="test_name")
out_image = otf.perturb(image)
```

This code uses default values and provides a sample input image. However, you
can adjust the parameters and use your own image to visualize the perturbation.
The `sx` and `sy` parameters (the root-mean-squared jitter amplitudes in
radians, in the x and y directions) are the primary way to customize a jitter
perturber. Larger jitter amplitudes generate a larger Gaussian blur kernel.
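
To see why larger amplitudes blur more, the jitter OTF is commonly modeled as
a Gaussian in spatial frequency. The following is a standalone numpy sketch of
that model (our own illustration of the standard formula, not NRTK's or
pyBSM's implementation):

```python
import numpy as np


def jitter_otf(u: np.ndarray, v: np.ndarray, sigma: float) -> np.ndarray:
    """Gaussian jitter OTF: H(u, v) = exp(-2 * pi^2 * sigma^2 * (u^2 + v^2)).

    u, v are angular spatial frequencies (cycles/rad) and sigma is the RMS
    jitter amplitude in radians. H(0, 0) is always 1 (no loss of mean
    brightness); larger sigma attenuates high frequencies more strongly,
    which appears in the image as a wider Gaussian blur.
    """
    return np.exp(-2.0 * np.pi**2 * sigma**2 * (np.asarray(u) ** 2 + np.asarray(v) ** 2))


u = np.array([1e4])  # one sample spatial frequency, cycles/rad
small = jitter_otf(u, 0.0, sigma=2e-6)
large = jitter_otf(u, 0.0, sigma=8e-6)
assert large[0] < small[0]  # bigger jitter -> stronger high-frequency attenuation
```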

#### Resulting Image

The output image below shows the effects of the Jitter OTF on the original
input. This result illustrates the Gaussian blur introduced due to simulated
sensor jitter.

![augmented_image](./docs/images/output-jitter.jpg)

<!-- :auto documentation: -->

## Documentation

Documentation for both release snapshots and the latest main branch is available
on [ReadTheDocs](https://nrtk.readthedocs.io).

To build the Sphinx-based documentation locally for the latest reference:

```bash
# Install dependencies
poetry install --sync --with main,linting,tests,docs
# Navigate to the documentation root
cd docs
# Build the documentation
poetry run make html
# Open the generated documentation in your browser
firefox _build/html/index.html
```

<!-- :auto documentation: -->

<!-- :auto contributing: -->

## Contributing

Contributions are encouraged!

The following points help ensure contributions follow development practices.

- Follow the
  [JATIC Design Principles](https://cdao.pages.jatic.net/public/program/design-principles/).
- Adopt the Git Flow branching strategy.
- See the
  [release process documentation](https://nrtk.readthedocs.io/en/latest/release_process.html)
  for detailed release information.
- Additional contribution guidelines and issue reporting steps can be found in
  [CONTRIBUTING.md](./CONTRIBUTING.md).

<!-- :auto contributing: -->

<!-- :auto developer-tools: -->

### Developer Tools

Ensure the source tree is acquired locally before proceeding.

#### Poetry Install

You can install using [Poetry](https://python-poetry.org/):

> [!IMPORTANT]
> NRTK currently requires `poetry<2.0`

> [!WARNING]
> Users unfamiliar with Poetry should use caution. See the
> [installation documentation](https://nrtk.readthedocs.io/en/latest/installation.html#from-source)
> for more information.

```bash
poetry install --with main,linting,tests,docs --extras "<extra1> <extra2> ..."
```

#### Pre-commit Hooks

Pre-commit hooks ensure that code complies with required linting and formatting
guidelines. These hooks run automatically before commits but can also be
executed manually. To bypass checks during a commit, use the `--no-verify` flag.

To install and use pre-commit hooks:

```bash
# Install required dependencies
poetry install --sync --with main,linting,tests,docs
# Initialize pre-commit hooks for the repository
poetry run pre-commit install
# Run pre-commit checks on all files
poetry run pre-commit run --all-files
```

<!-- :auto developer-tools: -->

## NRTK Demonstration Tool

This [associated project](https://github.com/Kitware/nrtk-explorer) is a local
web application that demonstrates visual saliency generation in a user
interface. It shows how image perturbations generated by this package can be
used in a user interface to facilitate dataset exploration. The tool is built
on the [trame framework](https://kitware.github.io/trame/).

![image1](./docs/figures/nrtk-explorer-example.png)

<!-- :auto license: -->

## License

[Apache 2.0](./LICENSE)

<!-- :auto license: -->

<!-- :auto contacts: -->

## Contacts

**Principal Investigator**: Brian Hu (Kitware) @brian.hu

**Project Manager / Product Owner**: Keith Fieldhouse (Kitware)
@keith.fieldhouse

**Scrum Master / Maintainer**: Brandon RichardWebster (Kitware)
@b.richardwebster

**Deputy Scrum Master / Deputy Maintainer**: Emily Veenhuis (Kitware)
@emily.veenhuis

**Program Representative**: Austin Whitesell (MITRE) @awhitesell

<!-- :auto contacts: -->

<!-- :auto acknowledgment: -->

## Acknowledgment

This material is based upon work supported by the Chief Digital and Artificial
Intelligence Office under Contract No. 519TC-23-9-2032. The views and
conclusions contained herein are those of the author(s) and should not be
interpreted as necessarily representing the official policies or endorsements,
either expressed or implied, of the U.S. Government.

<!-- :auto acknowledgment: -->

