MCCNN

Name: MCCNN
Version: 1.2.4
Home page: https://github.com/CNES/Pandora_MCCNN
Summary: MCCNN is a neural network for learning a similarity measure on image patches
Upload time: 2024-09-20 13:12:58
Author: CNES
Requires Python: >=3.8
License: Apache License 2.0
Keywords: 3D, IA, DEM, pandora, correlation, cars, photogrammetry
            <h1 align="center"> MCCNN </h1>

<h4 align="center">MCCNN neural network for stereo matching cost.</h4>

<p align="center">
  <a href="https://github.com/CNES/Pandora_MCCNN/actions"> <img src="https://github.com/CNES/Pandora_MCCNN/actions/workflows/mccnn_ci.yml/badge.svg?branch=master"></a>
  <a href="https://opensource.org/licenses/Apache-2.0/"><img src="https://img.shields.io/badge/License-Apache%202.0-blue.svg"></a>
</p>

<p align="center">
  <a href="#overview">Overview</a> •
  <a href="#install">Install</a> •
  <a href="#usage">Usage</a> •
  <a href="#usage">Pretrained Weights for MCCNN networks</a> •
  <a href="#related">Related</a> •
  <a href="#references">References</a>
</p>

## Overview

PyTorch implementation of the [[MCCNN]](#1.) neural network, which computes a similarity measure on pairs of small image patches.

## Install from PyPI

**MCCNN** is available on PyPI and can be installed with:

```bash
pip install MCCNN
```
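
If you prefer an isolated environment, here is a minimal sketch using the standard library `venv` module (the environment name is arbitrary):

```bash
# Create and activate a fresh virtual environment (requires Python >= 3.8)
python3 -m venv mccnn-venv
source mccnn-venv/bin/activate

# Install the package from PyPI and check that the module imports
pip install MCCNN
python -c "import mc_cnn; print('mc_cnn imported successfully')"
```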

## Developer install

After cloning the source code from the repository, do a local pip install in a virtualenv through the MCCNN Makefile:

```bash
make install
```
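
Putting both steps together, a minimal sketch (the location of the virtualenv created by the Makefile is an assumption; check the Makefile in the repository for the actual target and path):

```bash
# Clone the repository and run the developer install target
git clone https://github.com/CNES/Pandora_MCCNN.git
cd Pandora_MCCNN
make install

# Activate the virtualenv created by the Makefile
# (the "venv" directory name is an assumption; see the Makefile for the actual path)
source venv/bin/activate
```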

## Usage

The documentation explains how to train and use the MCCNN convolutional neural network.
To generate it, run the following command:

```bash
make docs
```

See [pandora_plugin_mccnn](https://github.com/CNES/Pandora_plugin_mccnn) for a real-life example.

## Pretrained Weights for MCCNN networks

### Download weights files

Pretrained weights for the mc-cnn fast and mc-cnn accurate neural networks are available in the weights directory:
-  mc_cnn_fast_mb_weights.pt and mc_cnn_accurate_mb_weights.pt are the weights of the networks pretrained on the Middlebury dataset [[Middlebury]](#Middlebury)
-  mc_cnn_fast_data_fusion_contest.pt and mc_cnn_accurate_data_fusion_contest.pt are the weights of the networks pretrained on the Data Fusion Contest dataset [[DFC]](#DFC)

To download the pretrained weights:

```bash
wget https://raw.githubusercontent.com/CNES/Pandora_MCCNN/master/mc_cnn/weights/mc_cnn_fast_mb_weights.pt
wget https://raw.githubusercontent.com/CNES/Pandora_MCCNN/master/mc_cnn/weights/mc_cnn_fast_data_fusion_contest.pt
wget https://raw.githubusercontent.com/CNES/Pandora_MCCNN/master/mc_cnn/weights/mc_cnn_accurate_mb_weights.pt
wget https://raw.githubusercontent.com/CNES/Pandora_MCCNN/master/mc_cnn/weights/mc_cnn_accurate_data_fusion_contest.pt
```
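
The same files can also be fetched programmatically; here is a minimal sketch using only the Python standard library and the URLs listed above:

```python
import urllib.request

# Base URL of the weights directory on the master branch (as listed above)
BASE_URL = "https://raw.githubusercontent.com/CNES/Pandora_MCCNN/master/mc_cnn/weights/"

WEIGHT_FILES = [
    "mc_cnn_fast_mb_weights.pt",
    "mc_cnn_fast_data_fusion_contest.pt",
    "mc_cnn_accurate_mb_weights.pt",
    "mc_cnn_accurate_data_fusion_contest.pt",
]

for filename in WEIGHT_FILES:
    # Download each checkpoint into the current working directory
    urllib.request.urlretrieve(BASE_URL + filename, filename)
    print(f"Downloaded {filename}")
```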

### Access weights from pip package

Pretrained weights are bundled in the pip package, so they come with any installation of the mc_cnn pip package.
To access them, use the `weights` submodule:

```python
from mc_cnn.weights import get_weights

# Paths to the bundled checkpoints, selected by architecture
# ("fast" or "accurate") and training dataset ("middlebury" or "dfc")
mc_cnn_fast_mb_weights_path = get_weights(arch="fast", training_dataset="middlebury")
mc_cnn_fast_data_fusion_contest_path = get_weights(arch="fast", training_dataset="dfc")
mc_cnn_accurate_mb_weights_path = get_weights(arch="accurate", training_dataset="middlebury")
mc_cnn_accurate_data_fusion_contest_path = get_weights(arch="accurate", training_dataset="dfc")
```
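
`get_weights` returns a filesystem path to a `.pt` checkpoint, which can then be loaded with PyTorch; here is a minimal sketch, assuming the file is a standard PyTorch checkpoint (whether it is a bare state dict or a wrapper dictionary is an assumption to verify):

```python
import torch

from mc_cnn.weights import get_weights

# Resolve the path of a bundled checkpoint and load it on CPU
weights_path = get_weights(arch="fast", training_dataset="middlebury")
checkpoint = torch.load(weights_path, map_location="cpu")

# Inspect the top-level keys to see how the checkpoint is organised
if isinstance(checkpoint, dict):
    print(list(checkpoint.keys()))
```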

## References

Please cite the following papers when using MCCNN:

*Defonte, V., Dumas, L., Cournet, M., & Sarrazin, E. (2021, July). Evaluation of MC-CNN Based Stereo Matching Pipeline for the CO3D Earth Observation Program. In 2021 IEEE International Geoscience and Remote Sensing Symposium IGARSS (pp. 7670-7673). IEEE.*
   
*Cournet, M., Sarrazin, E., Dumas, L., Michel, J., Guinet, J., Youssefi, D., Defonte, V., Fardet, Q., 2020. Ground-truth generation and disparity estimation for optical satellite imagery. ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences.*

<a id="1.">[MCCNN]</a> 
*Zbontar, J., & LeCun, Y. (2016). Stereo matching by training a convolutional neural network to compare image patches. J. Mach. Learn. Res., 17(1), 2287-2318.*

<a id="Middlebury">[Middlebury]</a> 
*Scharstein, D., Hirschmüller, H., Kitajima, Y., Krathwohl, G., Nešić, N., Wang, X., & Westling, P. (2014, September). High-resolution stereo datasets with subpixel-accurate ground truth. In German conference on pattern recognition (pp. 31-42). Springer, Cham.*

<a id="DFC">[DFC]</a> 
*Bosch, M., Foster, K., Christie, G., Wang, S., Hager, G. D., & Brown, M. (2019, January). Semantic stereo for incidental satellite images. In 2019 IEEE Winter Conference on Applications of Computer Vision (WACV) (pp. 1524-1532). IEEE.*

            
