vhr-cloudmask

- Name: vhr-cloudmask
- Version: 1.3.2
- Home page: https://github.com/nasa-nccs-hpda/vhr-cloudmask
- Summary: Deep learning pipeline to cloud mask VHR imagery
- Upload time: 2024-10-03 18:14:01
- Author: jordancaraballo
- Requires Python: >=3.7
- License: BSD 3-Clause License
- Keywords: cloudmask, rioxarray, rasterio
# vhr-cloudmask

[![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.7613207.svg)](https://doi.org/10.5281/zenodo.7613207)
![CI Workflow](https://github.com/nasa-nccs-hpda/vhr-cloudmask/actions/workflows/ci.yml/badge.svg)
![CI to DockerHub ](https://github.com/nasa-nccs-hpda/vhr-cloudmask/actions/workflows/dockerhub.yml/badge.svg)
![Code style: PEP8](https://github.com/nasa-nccs-hpda/vhr-cloudmask/actions/workflows/lint.yml/badge.svg)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
[![Coverage Status](https://coveralls.io/repos/github/nasa-nccs-hpda/vhr-cloudmask/badge.svg?branch=main)](https://coveralls.io/github/nasa-nccs-hpda/vhr-cloudmask?branch=main)

Python library to perform semantic segmentation of clouds and cloud shadows in
very high-resolution remote sensing imagery, using GPUs and CPU parallelization
for both high-performance and commodity environments.

- GitHub repo: https://github.com/nasa-nccs-hpda/vhr-cloudmask
- Documentation: https://nasa-nccs-hpda.github.io/vhr-cloudmask

## Objectives

- Library to process remote sensing imagery using GPU and CPU parallelization.
- Machine learning and deep learning cloud segmentation of VHR imagery.
- Large-scale image inference of very high-resolution imagery.

## Background

The detection of clouds is one of the first steps in the pre-processing of remotely sensed data. At coarse spatial resolution (> 100 m), clouds are bright and generally distinguishable from other landscape surfaces. At very high resolution (< 3 m), detecting clouds becomes a significant challenge due to the presence of smaller features with spectral characteristics similar to other land cover types, and of thin (partially transparent) cloud forms. Furthermore, at this resolution clouds can cover many thousands of pixels, making both the centers and boundaries of clouds prone to pixel contamination and variations in spectral intensity. Techniques that rely solely on the spectral information of clouds underperform in these situations.

In this study, we propose a multi-regional and multi-sensor deep learning approach for the detection of clouds in very high-resolution WorldView satellite imagery. A modified UNet-like convolutional neural network (CNN) was used for semantic segmentation over regions of Vietnam, Senegal, and Ethiopia, using only the RGB + NIR spectral bands. We demonstrate the superiority of CNNs, which achieve cloud mapping accuracies of 81–91%, over traditional methods such as Random Forest, which reach 57–88%. The best-performing UNet model has an overall accuracy of 95% across all regions, while the Random Forest has an overall accuracy of 89%. We conclude with promising future research directions toward a global cloud cover implementation of the proposed methods.
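
For readers who want a concrete picture of the kind of model described above, the following is a minimal UNet-style sketch in Keras for 4-band (RGB + NIR) tiles. It is a hypothetical illustration only: the actual vhr-cloudmask architecture is defined in the library and its tensorflow-caney dependency, and the layer widths and depth here are placeholders.

```python
# Minimal UNet-style sketch for 4-band (RGB + NIR) semantic segmentation.
# Hypothetical illustration only; the production vhr-cloudmask model is
# defined in the library / tensorflow-caney, not here.
import tensorflow as tf
from tensorflow.keras import layers, Model


def conv_block(x, filters):
    # Two 3x3 convolutions with ReLU, the basic UNet building block.
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    return x


def build_unet(input_shape=(256, 256, 4), n_classes=2):
    inputs = layers.Input(shape=input_shape)

    # Encoder: downsample while increasing feature depth.
    c1 = conv_block(inputs, 32)
    p1 = layers.MaxPooling2D()(c1)
    c2 = conv_block(p1, 64)
    p2 = layers.MaxPooling2D()(c2)

    # Bottleneck.
    b = conv_block(p2, 128)

    # Decoder: upsample and concatenate skip connections.
    u2 = layers.Conv2DTranspose(64, 2, strides=2, padding="same")(b)
    c3 = conv_block(layers.Concatenate()([u2, c2]), 64)
    u1 = layers.Conv2DTranspose(32, 2, strides=2, padding="same")(c3)
    c4 = conv_block(layers.Concatenate()([u1, c1]), 32)

    # Per-pixel class probabilities (e.g. cloud vs. not cloud).
    outputs = layers.Conv2D(n_classes, 1, activation="softmax")(c4)
    return Model(inputs, outputs)


model = build_unet()
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```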

## Getting Started

The recommended way to use vhr-cloudmask is through the publicly available containers
provided via this repository. If containers are not an option for your setup, follow the
pip installation instructions below.

### Downloading the Container

All Python and GPU dependencies are installed in an OCI-compliant Docker image. You can
pull this image into a Singularity image for use on HPC systems.

```bash
singularity pull docker://nasanccs/vhr-cloudmask:latest
```

In some cases, HPC systems require Singularity containers to be built as sandbox environments because
of uid issues (this is the case on NCCS Explore). In that case you can build a sandbox using the following
command. Depending on the filesystem, this can take anywhere from 5 minutes to an hour.

```bash
singularity build --sandbox vhr-cloudmask docker://nasanccs/vhr-cloudmask:latest
```

If you have done this step, you can skip the Installation step since the containers already
come with all dependencies installed.

### Installation

vhr-cloudmask can be installed on its own, but instructions for installing the full environments
are listed under the requirements directory so that the projects, examples, and notebooks can be run.

Note: pip installations do not include CUDA libraries for GPU support. Make sure
the NVIDIA libraries are installed locally on the system if you are not using conda.

vhr-cloudmask is available on [PyPI](https://pypi.org/project/vhr-cloudmask/).
To install vhr-cloudmask, run this command in your terminal or from inside a container:

```bash
pip install vhr-cloudmask
```

If you have installed vhr-cloudmask before and want to upgrade to the latest version,
you can run the following command in your terminal:

```bash
pip install -U vhr-cloudmask
```
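
After installation, a quick way to confirm that a GPU is actually visible (assuming a TensorFlow-based environment, since pip installs do not bundle CUDA) is a check like the one below.

```python
# Sanity check that TensorFlow can see a GPU after installation.
# If this prints an empty list, inference will fall back to CPU.
import tensorflow as tf

gpus = tf.config.list_physical_devices("GPU")
print(f"GPUs visible to TensorFlow: {gpus}")
```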

### Running Inference of Clouds

Use the following command to perform inference, using a regex that points to the
input files and leveraging the default global model. It is a singularity exec
command with options from both Singularity and the cloud-masking application.

Singularity options:
- '-B': mounts a filesystem from the host into the container
- '--nv': enables NVIDIA GPU support by binding the host GPU drivers and devices into the container

vhr-cloudmask-cli options:
- '-r': list of regex strings used to find GeoTIFF files to predict from
- '-o': output directory to store cloud masks
- '-s': pipeline step; to generate masks only, run the predict step

```bash
singularity exec --nv -B $NOBACKUP,/explore/nobackup/people,/explore/nobackup/projects \
  /explore/nobackup/projects/ilab/containers/vhr-cloudmask.sif vhr-cloudmask-cli \
  -o '/explore/nobackup/projects/ilab/test/vhr-cloudmask' \
  -r '/explore/nobackup/projects/3sl/data/Tappan/Tappan16*_data.tif' '/explore/nobackup/projects/3sl/data/Tappan/Tappan15*_data.tif' \
  -s predict
```

To run prediction through Slurm for a large set of files, use the following loop, which submits
multiple jobs (up to your processing limit); each job picks up the remaining unprocessed files.

```bash
for i in {0..64}; do
  sbatch --mem-per-cpu=10240 -G1 -c10 -t05-00:00:00 -J clouds \
    --wrap="singularity exec --nv -B $NOBACKUP,/explore/nobackup/people,/explore/nobackup/projects /explore/nobackup/projects/ilab/containers/vhr-cloudmask.sif vhr-cloudmask-cli -o '/explore/nobackup/projects/ilab/test/vhr-cloudmask' -r '/explore/nobackup/projects/3sl/data/Tappan/Tappan16*_data.tif' '/explore/nobackup/projects/3sl/data/Tappan/Tappan15*_data.tif' -s predict"
done
```

## Infrastructure

The vhr-cloudmask package is a set of CLI tools and Jupyter-based notebooks to manage and
structure the validation of remote sensing data. The CLI tools can be run from inside a container
or from any system where the vhr-cloudmask package is installed.

The main system requirement for this package is a machine with GPUs to accelerate the training and
inference of imagery. If no GPU is available, the process will still run, but with a large
slowdown. There are no minimum system memory requirements, given the sliding-window procedure
implemented in the inference process.
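
The sliding-window idea mentioned above can be sketched as follows: read the scene in fixed-size windows with rasterio, predict each window, and write the mask back window by window so memory stays bounded. This is a simplified, hypothetical illustration (predict_window is a stand-in for the model call), not the library's actual inference code.

```python
# Simplified sketch of memory-bounded, sliding-window inference with rasterio.
# predict_window is a hypothetical stand-in for the model call; the real
# pipeline in vhr-cloudmask may differ (overlap handling, batching, etc.).
import numpy as np
import rasterio
from rasterio.windows import Window


def predict_window(chip: np.ndarray) -> np.ndarray:
    # Placeholder: return a dummy mask with the chip's spatial shape.
    return np.zeros(chip.shape[1:], dtype="uint8")


def sliding_window_inference(src_path, dst_path, tile=512):
    with rasterio.open(src_path) as src:
        profile = src.profile.copy()
        profile.update(count=1, dtype="uint8", nodata=255)
        with rasterio.open(dst_path, "w", **profile) as dst:
            for row in range(0, src.height, tile):
                for col in range(0, src.width, tile):
                    # Clip the window at the image edges.
                    win = Window(col, row,
                                 min(tile, src.width - col),
                                 min(tile, src.height - row))
                    chip = src.read(window=win)   # shape: (bands, h, w)
                    mask = predict_window(chip)   # shape: (h, w)
                    dst.write(mask, 1, window=win)


# Example usage (paths are placeholders):
# sliding_window_inference("WV03_scene.tif", "WV03_scene_cloudmask.tif")
```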

## Package Structure

``` bash
├── archives              <- Legacy code stored for historical reference
├── docs                  <- Default documentation for working with this project
├── images                <- Store project images
├── notebooks             <- Jupyter notebooks
├── examples              <- Examples for utilizing the library
├── requirements          <- Requirements for installing the dependencies
├── scripts               <- Utility scripts for analysis
├── vhr_cloudmask         <- Library source code
├── README.md             <- The top-level README for developers using this project
├── CHANGELOG.md          <- Releases documentation
├── LICENSE               <- License documentation
└── setup.py              <- Script to install library
```

## Data Locations where this Workflow has been Validated

The vhr-cloudmask workflow has been validated in the following study areas
using WorldView imagery. Additional areas will be included in our validation
suite as part of upcoming efforts to improve the scalability of our models.

- Senegal
- Vietnam
- Ethiopia
- Oregon
- Alaska
- Whitesands
- Siberia

## Development Pipeline Details

When performing development (training a model, preprocessing, etc.), run from the
dev container so the Python source files can be added to the PYTHONPATH. The following is an example
command to run inference given a configuration file.

```bash
singularity exec --env PYTHONPATH="$NOBACKUP/development/tensorflow-caney:$NOBACKUP/development/vhr-cloudmask" \
  --nv -B $NOBACKUP,/explore/nobackup/people,/explore/nobackup/projects \
  /explore/nobackup/projects/ilab/containers/vhr-cloudmask.sif \
  python $NOBACKUP/development/vhr-cloudmask/vhr_cloudmask/view/cloudmask_cnn_pipeline_cli.py \
  -c $NOBACKUP/development/vhr-cloudmask/projects/cloud_cnn/configs/production/cloud_mask_alaska_senegal_3sl_cas.yaml \
  -s predict
```

If you do not have permission to modify the configuration file, or only need to make small changes to the model selection,
the regex for the files to predict, or the output directory, specify the arguments manually on the CLI:

```bash
singularity exec --env PYTHONPATH="$NOBACKUP/development/tensorflow-caney:$NOBACKUP/development/vhr-cloudmask" \
  --nv -B $NOBACKUP,/explore/nobackup/people,/explore/nobackup/projects \
  /explore/nobackup/projects/ilab/containers/vhr-cloudmask.sif \
  python $NOBACKUP/development/vhr-cloudmask/vhr_cloudmask/view/cloudmask_cnn_pipeline_cli.py \
  -c $NOBACKUP/development/vhr-cloudmask/projects/cloud_cnn/configs/production/cloud_mask_alaska_senegal_3sl_cas.yaml \
  -o '/explore/nobackup/projects/ilab/test/vhr-cloudmask' \
  -r '/explore/nobackup/projects/3sl/data/Tappan/Tappan16*_data.tif' '/explore/nobackup/projects/3sl/data/Tappan/Tappan15*_data.tif' \
  -ib B G R N G1 G2 \
  -ob B G R N \
  -ps sieve smooth fill dilate \
  -s predict
```
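
The '-ps' flag in the example above lists post-processing steps applied to the predicted mask (sieve, smooth, fill, dilate). The sketch below is a rough, scipy-based illustration of what such steps typically do; it is not the pipeline's actual implementation, and the parameters are placeholders.

```python
# Conceptual sketch of typical cloud-mask post-processing steps
# (sieve, smooth, fill, dilate) using scipy; the actual -ps implementation
# inside vhr-cloudmask may differ in order and parameters.
import numpy as np
from scipy import ndimage


def postprocess(mask: np.ndarray,
                min_size: int = 64,
                smooth_size: int = 5,
                dilate_iters: int = 2) -> np.ndarray:
    mask = mask.astype(bool)

    # Sieve: drop connected components smaller than min_size pixels.
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    keep = np.isin(labels, np.flatnonzero(sizes >= min_size) + 1)

    # Smooth: median filter to remove speckle along object edges.
    smoothed = ndimage.median_filter(
        keep.astype(np.uint8), size=smooth_size).astype(bool)

    # Fill: close interior holes in cloud objects.
    filled = ndimage.binary_fill_holes(smoothed)

    # Dilate: grow the mask slightly to buffer thin cloud edges.
    dilated = ndimage.binary_dilation(filled, iterations=dilate_iters)

    return dilated.astype(np.uint8)


# Example on random placeholder data.
rng = np.random.default_rng(0)
print(postprocess(rng.integers(0, 2, size=(256, 256))).sum())
```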

## Manual Testing

For manual testing, you can always run the pytest suite from the dev container. The following is
an example of manually testing the package components.

```bash
singularity exec --env PYTHONPATH="$NOBACKUP/development/tensorflow-caney:$NOBACKUP/development/vhr-cloudmask" --nv -B $NOBACKUP,/explore/nobackup/people,/explore/nobackup/projects,/css/nga /explore/nobackup/projects/ilab/containers/vhr-cloudmask.sif pytest $NOBACKUP/development/vhr-cloudmask/tests
```

## Authors

- Jordan Alexis Caraballo-Vega, jordan.a.caraballo-vega@nasa.gov
- Caleb S. Spradlin, caleb.s.spradlin@nasa.gov
- Margaret Wooten, margaret.wooten@nasa.gov

## Contributors

- Andrew Weis, aweis1998@icloud.com
- Brian Lee, brianlee52@bren.ucsb.edu

## Contributing

Please see our [guide for contributing to vhr-cloudmask](CONTRIBUTING.md). Contributions
are welcome, and they are greatly appreciated! Every little bit helps, and credit will
always be given.

You can contribute in many ways:

### Report Bugs

Report bugs at https://github.com/nasa-nccs-hpda/vhr-cloudmask/issues.

If you are reporting a bug, please include:
- Your operating system name and version.
- Any details about your local setup that might be helpful in troubleshooting.
- Detailed steps to reproduce the bug.

### Fix Bugs

Look through the GitHub issues for bugs. Anything tagged with "bug" and
"help wanted" is open to whoever wants to implement it.

### Implement Features

Look through the GitHub issues for features. Anything tagged with "enhancement" and "help wanted" is
open to whoever wants to implement it.

### Write Documentation

vhr-cloudmask could always use more documentation, whether as part of the official vhr-cloudmask docs,
in docstrings, or even on the web in blog posts, articles, and such.

### Submit Feedback

The best way to send feedback is to file an issue at https://github.com/nasa-nccs-hpda/vhr-cloudmask/issues.

If you are proposing a feature:
- Explain in detail how it would work.
- Keep the scope as narrow as possible, to make it easier to implement.
- Remember that this is a volunteer-driven project, and that contributions are welcome :)

## References

Tutorials will be published on [Medium](https://medium.com/@jordan.caraballo/) for additional support
and development, including how to use the library and any upcoming releases.

Please consider citing vhr-cloudmask when using it in a project. You can use the BibTeX entries below to cite
both the software and the article:

### Paper

```bibtex
@article{caraballo2023optimizing,
  title={Optimizing WorldView-2,-3 cloud masking using machine learning approaches},
  author={Caraballo-Vega, JA and Carroll, ML and Neigh, CSR and Wooten, M and Lee, B and Weis, A and Aronne, M and Alemu, WG and Williams, Z},
  journal={Remote Sensing of Environment},
  volume={284},
  pages={113332},
  year={2023},
  publisher={Elsevier}
}
```

### Software

```bibtex
@software{jordan_alexis_caraballo_vega_2021_7613207,
  author       = {Jordan Alexis Caraballo-Vega},
  title        = {vhr-cloudmask},
  month        = dec,
  year         = 2021,
  publisher    = {Zenodo},
  version      = {1.0.0},
  doi          = {10.5281/zenodo.7613207},
  url          = {https://doi.org/10.5281/zenodo.7613207}
}
```

### Additional References

[1] Raschka, S., Patterson, J., & Nolet, C. (2020). Machine learning in Python: Main developments and technology trends in data science, machine learning, and artificial intelligence. Information, 11(4), 193.

[2] Paszke, A., Gross, S., Chintala, S., Chanan, G., et al. (2016). PyTorch. GitHub repository, <https://github.com/pytorch/pytorch>. Accessed 13 February 2020.

[3] Caraballo-Vega, J., Carroll, M., Li, J., & Duffy, D. (2021, December). Towards Scalable & GPU Accelerated Earth Science Imagery Processing: An AI/ML Case Study. In AGU Fall Meeting 2021. AGU.

            
