<h1 align="center"> Pandora plugin mccnn </h1>
<h4 align="center">MC-CNN neural network plugin for <a href="https://github.com/CNES/Pandora"><img align="center" src="https://raw.githubusercontent.com/CNES/Pandora/master/doc/sources/Images/logo/logo_typo.svg?inline=false" width="64" height="64"/></a>.</h4>
<p align="center">
<a href="https://github.com/CNES/Pandora_plugin_mccnn/actions"><img src="https://github.com/CNES/Pandora_plugin_mccnn/actions/workflows/pandora_plugin_mccnn.yml/badge.svg?branch=master"></a>
<a href="https://opensource.org/licenses/Apache-2.0/"><img src="https://img.shields.io/badge/License-Apache%202.0-blue.svg"></a>
</p>
<p align="center">
<a href="#overview">Overview</a> •
<a href="#install">Install</a> •
<a href="#usage">Usage</a> •
<a href="#related">Related</a> •
<a href="#references">References</a>
</p>
## Overview
[Pandora](https://github.com/CNES/Pandora) is a stereo matching framework designed to provide state-of-the-art stereo algorithms and to let new ones be added as plugins.
This [Pandora plugin](https://pandora.readthedocs.io/en/stable/userguide/plugin.html) computes the cost volume using the similarity measure produced by the MC-CNN neural network [[MCCNN]](#MCCNN), through the [MCCNN](https://github.com/CNES/Pandora_MCCNN) library.
## Install
**pandora_plugin_mccnn** is available on PyPI and can be installed with:
```bash
pip install pandora_plugin_mccnn
```
This command also installs the required dependencies, such as [Pandora](https://github.com/CNES/Pandora) and [MCCNN](https://github.com/CNES/Pandora_MCCNN).
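As a quick sanity check, you can verify that the plugin and its dependencies import correctly. This is a minimal sketch, assuming the importable module names match the pip distribution names:

```python
# Post-install check: the plugin, Pandora and MCCNN should all be importable.
# The module names below are assumed to match the pip distribution names.
import pandora
import pandora_plugin_mccnn
import mc_cnn

print("pandora, pandora_plugin_mccnn and mc_cnn imported successfully")
```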
## Usage
Please refer to [Pandora's readme](https://github.com/CNES/Pandora/blob/master/README.md) or the [online documentation](https://pandora.readthedocs.io/?badge=latest) for further information about Pandora's general functionalities.
More specifically, you can find:
- [MCCNN configuration file example](https://raw.githubusercontent.com/CNES/Pandora/master/data_samples/json_conf_files/a_semi_global_matching_with_mccnn_similarity_measure.json)
- [documentation about MCCNN theory and parameters](https://pandora.readthedocs.io/en/stable/userguide/plugins/plugin_mccnn.html)
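As an illustration of how the plugin is selected inside a Pandora pipeline, the fragment below mirrors the matching cost step of the linked example configuration, written here as a Python dictionary. It is only a sketch: the key names (notably `mc_cnn_arch` and `model_path`) are assumptions to be checked against the example file and the plugin documentation above.

```python
# Illustrative fragment of a Pandora pipeline configuration using the MC-CNN similarity measure.
# The key names below are assumptions; refer to the linked example JSON configuration file
# for an authoritative version.
matching_cost_step = {
    "matching_cost_method": "mc_cnn",           # selects this plugin instead of e.g. census
    "mc_cnn_arch": "fast",                      # "fast" or "accurate" (assumed parameter name)
    "model_path": "mc_cnn_fast_mb_weights.pt",  # pretrained weights file (assumed parameter name)
}
```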
## Pretrained Weights for MCCNN networks
### Download weights files
Pretrained weights for the mc-cnn fast and mc-cnn accurate neural networks are available in the `mc_cnn/weights` directory of the [MCCNN](https://github.com/CNES/Pandora_MCCNN) repository:
- `mc_cnn_fast_mb_weights.pt` and `mc_cnn_accurate_mb_weights.pt` are the weights of the networks pretrained on the Middlebury dataset [[Middlebury]](#Middlebury)
- `mc_cnn_fast_data_fusion_contest.pt` and `mc_cnn_accurate_data_fusion_contest.pt` are the weights of the networks pretrained on the Data Fusion Contest dataset [[DFC]](#DFC)
To download the pretrained weights:
```bash
wget https://raw.githubusercontent.com/CNES/Pandora_MCCNN/master/mc_cnn/weights/mc_cnn_fast_mb_weights.pt
wget https://raw.githubusercontent.com/CNES/Pandora_MCCNN/master/mc_cnn/weights/mc_cnn_fast_data_fusion_contest.pt
wget https://raw.githubusercontent.com/CNES/Pandora_MCCNN/master/mc_cnn/weights/mc_cnn_accurate_mb_weights.pt
wget https://raw.githubusercontent.com/CNES/Pandora_MCCNN/master/mc_cnn/weights/mc_cnn_accurate_data_fusion_contest.pt
```
### Access weights from pip package
Pretrained weights are bundled in the pip package and are retrieved with every installation of the `mc_cnn` pip package.
To access them, use the `weights` submodule:
```python
from mc_cnn.weights import get_weights
mc_cnn_fast_mb_weights_path = get_weights(arch="fast", training_dataset="middlebury")
mc_cnn_fast_data_fusion_contest_path = get_weights(arch="fast", training_dataset="dfc")
mc_cnn_accurate_mb_weights_path = get_weights(arch="accurate", training_dataset="middlebury")
mc_cnn_accurate_data_fusion_contest_path = get_weights(arch="accurate", training_dataset="dfc")
```
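The returned paths point to PyTorch checkpoint files, so they can be passed to the plugin configuration or inspected directly. The sketch below assumes the checkpoints open with a plain `torch.load`; their exact internal layout should be checked against the MCCNN library, which normally loads them itself.

```python
import torch
from mc_cnn.weights import get_weights

# Locate the packaged weights for the fast architecture trained on Middlebury
weights_path = get_weights(arch="fast", training_dataset="middlebury")

# Load the checkpoint on CPU and look at its top-level structure.
# Whether it is a plain state dict or a wrapped dictionary is an assumption to verify.
checkpoint = torch.load(weights_path, map_location="cpu")
if isinstance(checkpoint, dict):
    print(list(checkpoint)[:5])
else:
    print(type(checkpoint))
```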
## Output example
The figures below show disparity maps produced on mountain and desert areas, generated with the Census and MC-CNN similarity measures:
| Left image | Left disparity map using Census measure | Left disparity map using mc-cnn fast pretrained on Middlebury | Left disparity map using mc-cnn fast pretrained on DFC |
| ------------------------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------ | ---------------------------------------------------------------------- | ------------------------------------------------------------------------ |
| ![mountain_img](data_samples/mountain_img.png) | ![mountain_census](data_samples/mountain_census.png) | ![mountain_mid](data_samples/mountain_mccnn_fast_middlebury.png) | ![mountain_dfc](data_samples/mountain_mccnn_fast_data_fusion_contest.png)|
| ![desert_img](data_samples/desert_img.png) | ![desert_census](data_samples/desert_census.png) | ![desert_mid](data_samples/desert_mccnn_fast_middlebury.png) | ![desert_dfc](data_samples/desert_mccnn_fast_data_fusion_contest.png) |
## Related
[Pandora](https://github.com/CNES/Pandora) - A stereo matching framework
[MCCNN](https://github.com/CNES/Pandora_MCCNN) - PyTorch/Python implementation of the mc-cnn neural network
## References
Please cite the following papers when using Pandora and pandora_plugin_mccnn:
*Defonte, V., Dumas, L., Cournet, M., & Sarrazin, E. (2021, July). Evaluation of MC-CNN Based Stereo Matching Pipeline for the CO3D Earth Observation Program. In 2021 IEEE International Geoscience and Remote Sensing Symposium IGARSS (pp. 7670-7673). IEEE.*
*Cournet, M., Sarrazin, E., Dumas, L., Michel, J., Guinet, J., Youssefi, D., Defonte, V., Fardet, Q., 2020. Ground-truth generation and disparity estimation for optical satellite imagery. ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences.*
<a id="MCCNN">[MCCNN]</a>
*Zbontar, J., & LeCun, Y. (2016). Stereo matching by training a convolutional neural network to compare image patches. J. Mach. Learn. Res., 17(1), 2287-2318.*
<a id="Middlebury">[Middlebury]</a>
*Scharstein, D., Hirschmüller, H., Kitajima, Y., Krathwohl, G., Nešić, N., Wang, X., & Westling, P. (2014, September). High-resolution stereo datasets with subpixel-accurate ground truth. In German conference on pattern recognition (pp. 31-42). Springer, Cham.*
<a id="DFC">[DFC]</a>
*Bosch, M., Foster, K., Christie, G., Wang, S., Hager, G. D., & Brown, M. (2019, January). Semantic stereo for incidental satellite images. In 2019 IEEE Winter Conference on Applications of Computer Vision (WACV) (pp. 1524-1532). IEEE.*