# Sat_MVSF (Production Version)
## Introduction
**This repository is an adaptation of the official Sat-MVSF framework ([GPCV/Sat-MVSF](http://gpcv.whu.edu.cn))**.
It is modified and optimized for practical multi-view satellite 3D reconstruction and production scenarios, with improvements in data organization, batch processing, and usability.
Sat-MVSF is a general deep learning MVS framework for three-dimensional (3D) reconstruction from multi-view optical satellite images.
## Differences from Official Sat-MVSF
- The data pipeline is optimized for large-scale satellite datasets and real production environments.
- Scripts and configurations support flexible multi-view grouping and practical project workflows.
- A CPU version of the MVS pipeline is implemented, in which the projection of depth maps to point clouds is processed in blocks, significantly improving efficiency on large datasets.
- Fully compatible with the original Sat-MVSF code and evaluation, while being easier to integrate into automated or industrial workflows.
## Environment
The environment used for this project can be created with **conda** from the provided `environment.yml` file:
```bash
conda env create -f environment.yml
conda activate MVS_env
```
## How to run
#### 1. Create info files for your data
The info files include:

| File | Contents |
| ---------------- | ----------------------------------- |
| projection.prj | the projection information |
| border.txt | the extent and cell size of the DSM |
| cameras_info.txt | the paths of the RPC files |
| images_info.txt | the paths of the image files |
| pair.txt | the pair information |
| range.txt | the height search range |
**(1) projection.prj**
The *.prj* file can be exported from GIS software such as ArcGIS. An example:
```
PROJCS["WGS_1984_UTM_Zone_8N",GEOGCS["GCS_WGS_1984",DATUM["D_WGS_1984",SPHEROID["WGS_1984",6378137.0,298.257223563]],PRIMEM["Greenwich",0.0],UNIT["Degree",0.0174532925199433]],PROJECTION["Transverse_Mercator"],PARAMETER["False_Easting",500000.0],PARAMETER["False_Northing",0.0],PARAMETER["Central_Meridian",-135.0],PARAMETER["Scale_Factor",0.9996],PARAMETER["Latitude_Of_Origin",0.0],UNIT["Meter",1.0],AUTHORITY["EPSG",32608]]
```
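If you prefer not to export the file from a GIS, it can also be generated programmatically. Below is a minimal sketch assuming the GDAL Python bindings (`osgeo`) are available; adjust the EPSG code to your area of interest.
```python
# Minimal sketch: write projection.prj from an EPSG code (here WGS 84 / UTM zone 8N).
# Assumes the GDAL Python bindings are installed in the conda environment.
from osgeo import osr

srs = osr.SpatialReference()
srs.ImportFromEPSG(32608)      # WGS 84 / UTM zone 8N, matching the example above
srs.MorphToESRI()              # optional: ESRI-style WKT, as exported by ArcGIS
with open("projection.prj", "w") as f:
    f.write(srs.ExportToWkt())
```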
**(2) border.txt**
```
x coordinate of the top-left grid cell    # e.g. 493795.02546076314
y coordinate of the top-left grid cell    # e.g. 3323843.8488957686
number of grid cells in x-direction       # e.g. 2485
number of grid cells in y-direction       # e.g. 2022
cell size in x-direction                  # e.g. 5.0
cell size in y-direction                  # e.g. 5.0
```
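The file is just these six numbers, one per line. An illustrative sketch that writes such a file with the example values above:
```python
# Illustrative sketch: write border.txt from the DSM grid definition.
# The values are the examples from above; replace them with your own extent.
top_left_x, top_left_y = 493795.02546076314, 3323843.8488957686
cols, rows = 2485, 2022            # number of grid cells in x / y
cell_x, cell_y = 5.0, 5.0          # cell size in x / y

with open("border.txt", "w") as f:
    for value in (top_left_x, top_left_y, cols, rows, cell_x, cell_y):
        f.write(f"{value}\n")
```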
**(3) cameras_info.txt**
```
number_of_views
id_of_view0 the_path_to_the_rpc_file_of_view0
id_of_view1 the_path_to_the_rpc_file_of_view1
id_of_view2 the_path_to_the_rpc_file_of_view2
...
```
**(4) images_info.txt**
```
number_of_views
id_of_view0 the_path_to_the_img_file_of_view0
id_of_view1 the_path_to_the_img_file_of_view1
id_of_view2 the_path_to_the_img_file_of_view2
...
```
\* Note: For the same satellite image, the id must be the same in file (3) and file (4).
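Both files can be generated with a few lines of Python. The sketch below assumes that the RPC files and image files sort into the same per-view order; the directory names and file extensions are placeholders.
```python
# Illustrative sketch: write cameras_info.txt and images_info.txt with matching view ids.
# "rpcs/", "images/" and the file extensions are placeholders for your own data layout.
from pathlib import Path

rpc_paths = sorted(Path("rpcs").glob("*.rpc"))
img_paths = sorted(Path("images").glob("*.tif"))
assert len(rpc_paths) == len(img_paths), "one RPC file per image is expected"

with open("cameras_info.txt", "w") as f_cam, open("images_info.txt", "w") as f_img:
    f_cam.write(f"{len(rpc_paths)}\n")
    f_img.write(f"{len(img_paths)}\n")
    for view_id, (rpc, img) in enumerate(zip(rpc_paths, img_paths)):
        f_cam.write(f"{view_id} {rpc}\n")   # the id must match in both files
        f_img.write(f"{view_id} {img}\n")
```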
**(5) pair.txt**
```
number_of_pairs
the_reference_view_id0
number_of_source_view_for_the_reference0 the_source_view_id01 the_source_view_score01 the_source_view_id02 the_source_view_score02 ...
the_reference_view_id1
number_of_source_view_for_the_reference1 the_source_view_id11 the_source_view_score11 the_source_view_id12 the_source_view_score12 ...
...
```
\* Note: the_source_view_score is a constant value here; the field is an interface reserved for future work.
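As an illustration, the sketch below writes a pair.txt in which every view serves as a reference and all remaining views are its sources with a constant score:
```python
# Illustrative sketch: write pair.txt with every view as a reference and all other
# views as its sources; the score is a constant placeholder (see the note above).
num_views = 3
const_score = 1.0

with open("pair.txt", "w") as f:
    f.write(f"{num_views}\n")                       # number of pairs (every view is a reference here)
    for ref_id in range(num_views):
        src_ids = [v for v in range(num_views) if v != ref_id]
        f.write(f"{ref_id}\n")
        f.write(f"{len(src_ids)} "
                + " ".join(f"{s} {const_score}" for s in src_ids) + "\n")
```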
**(6) range.txt**
```
height_min
height_max
height_interval
```
When height_min = height_max = 0, the script automatically determines the range from the *.rpc* file.
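For example, a range.txt that searches heights from -50 m to 250 m in 5 m steps would look like this (values are illustrative):
```
-50
250
5
```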
#### 2. Modify the config file
The config options are stored in a *.json* file:
```
{
"run_crop_img":true, # run image cropping or not
"run_mvs": true, # run mvs or not
"run_generate_points":true, # run points generation or not
"run_generate_dsm":true, # run dsm generation or not
"block_size_x": 768, # the block size in x-direction
"block_size_y": 384, # the block size in y-direction
"overlap_x": 0.0, # the overlap in x-direction
"overlap_y": 0.0, # the overlap in y-direction
"para": 64, # base size of the block
"invalid_value": -999, # invalid value in dsm
"position_threshold": 1, # the geometric consistency check threshold
"depth_threshold": 500, # the geometric consistency check threshold
"relative_depth_threshold": 100, # the geometric consistency check threshold
"geometric_num": 2 # the geometric consistency check threshold
}
```
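Because standard JSON does not allow comments, the actual file should contain only the key-value pairs. A minimal sketch that writes such a file with the values shown above:
```python
# Minimal sketch: write a valid config.json (no comments) with the values shown above.
import json

config = {
    "run_crop_img": True,
    "run_mvs": True,
    "run_generate_points": True,
    "run_generate_dsm": True,
    "block_size_x": 768,
    "block_size_y": 384,
    "overlap_x": 0.0,
    "overlap_y": 0.0,
    "para": 64,
    "invalid_value": -999,
    "position_threshold": 1,
    "depth_threshold": 500,
    "relative_depth_threshold": 100,
    "geometric_num": 2,
}

with open("config.json", "w") as f:
    json.dump(config, f, indent=4)
```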
#### 3. Run the script
```bash
python run_whu_tlc.py
```
If you want to run the pipeline on your own data, please refer to *run_whu_tlc.py* and write a new script for your data. The core code is:
```python
pipeline = Pipeline(image_paths, camera_paths, config, prj_str,
border_info, depth_range, output, logger, args)
pipeline.run()
```
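For reference, an illustrative end-to-end sketch is given below. The import path of `Pipeline`, the exact container types it expects for each argument, and the trailing `args` object are assumptions that should be checked against *run_whu_tlc.py*.
```python
# Illustrative sketch of a custom run script. The import path, the container types
# passed to Pipeline, and the trailing `args` are assumptions; check run_whu_tlc.py
# for the exact interface.
import json
import logging
from pathlib import Path

from pipeline import Pipeline   # assumed import path


def read_info(path):
    """Read cameras_info.txt / images_info.txt into an id -> path dict."""
    rows = [line.split() for line in Path(path).read_text().splitlines() if line.strip()]
    return {int(view_id): file_path for view_id, file_path in rows[1:]}  # skip the view count


image_paths = read_info("images_info.txt")
camera_paths = read_info("cameras_info.txt")

config = json.loads(Path("config.json").read_text())
prj_str = Path("projection.prj").read_text().strip()
border_info = [float(v) for v in Path("border.txt").read_text().split()]
depth_range = [float(v) for v in Path("range.txt").read_text().split()]

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("sat_mvsf")

pipeline = Pipeline(image_paths, camera_paths, config, prj_str,
                    border_info, depth_range, "output/", logger, args=None)
pipeline.run()
```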
## Citation
If you find this code helpful, please cite the original work:
```bibtex
@article{GAO2023446,
  title   = {A general deep learning based framework for 3D reconstruction from multi-view stereo satellite images},
  journal = {ISPRS Journal of Photogrammetry and Remote Sensing},
  volume  = {195},
  pages   = {446-461},
  year    = {2023},
  issn    = {0924-2716},
  doi     = {10.1016/j.isprsjprs.2022.12.012},
  url     = {https://www.sciencedirect.com/science/article/pii/S0924271622003276},
  author  = {Jian Gao and Jin Liu and Shunping Ji}
}
```
## Acknowledgement
Thanks to the authors for open-sourcing their outstanding work:
- VisSat Satellite Stereo: https://github.com/Kai-46/VisSatToolSet
- Cascade MVSNet: https://github.com/alibaba/cascade-stereo
- UCSNet: https://github.com/touristCheng/UCSNet
- SP-MVS: https://github.com/Tian8du/SP-MVS