| Name | ramp-fair |
| Version | 0.1.2 |
| Summary | Replicable AI for Microplanning, fork maintained for fAIr |
| Author | Carolyn Johnston |
| Requires Python | >=3 |
| Upload time | 2022-12-14 08:30:52 |
| Docs URL | None |
| Requirements | No requirements were recorded. |
# [Replicable AI for Microplanning (ramp)](https://rampml.global/)
Our team aspires to turn over control of the data value chain to humanitarians. The Replicable AI for Microplanning (ramp) project is producing an open-source deep learning model to accurately digitize buildings in low- and middle-income countries from satellite imagery, and to enable in-country users to build their own deep learning models for their regions of interest.
This codebase provides Python-based machine learning and data processing tools, built on TensorFlow and the Python geospatial tool set, for using deep learning to predict building footprints from high-resolution satellite imagery.
The [ramp online documentation website](https://rampml.global/project-introduction/) contains complete documentation of our mission, the ramp project, the codebase, and the associated (very large) open source dataset of image chips and label geojson files.
The following screenshots display several examples of predicted building polygons, together with truth polygons, produced by the ramp project group using this codebase (red polygons are predictions; green polygons are training labels provided by human labelers).
### St Vincent AOI
![St Vincent AOI](docs/images/st_vincent.png)
### Myanmar AOI
![Myanmar AOI](docs/images/myanmar.png)
### Ghana AOI
![Ghana AOI](docs/images/ghana.png)
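Comparisons like the ones above are typically scored with intersection-over-union (IoU) between predicted and truth geometries; the codebase ships `calculate_accuracy_iou.py` and IoU metric shell scripts for this purpose. As a simplified illustration only (not the project's own implementation, which operates on polygons via the geospatial tool set), here is IoU for axis-aligned bounding boxes in pure Python:

```python
def box_iou(a, b):
    """IoU of two axis-aligned boxes given as (xmin, ymin, xmax, ymax) tuples."""
    # Width and height of the intersection rectangle (zero if the boxes are disjoint)
    iw = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    ih = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = iw * ih
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union if union > 0 else 0.0

# A predicted unit box shifted half a unit from a unit-square truth box:
print(round(box_iou((0, 0, 1, 1), (0.5, 0, 1.5, 1)), 3))  # 0.5 / 1.5 -> 0.333
```

The same ratio generalizes directly to arbitrary polygons by substituting polygon intersection and union areas.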
And ramp is just getting started.
---
## About the codebase
Things you may want to do with this codebase include:
1. Running the scripts, including production scripts and data preparation tools.
2. Working with the Jupyter notebooks.
3. Working with the extensive ramp open source labeled dataset.
4. Training the models with your own data.
5. Modifying the underlying code for your own purposes.
The ramp codebase uses many Python libraries that are in standard use, some specialized libraries for image and geospatial processing, and the TensorFlow library for training and running deep neural networks. It can be very difficult to create a computational environment in which all of these libraries are installed and play nicely with each other.
For this reason, we also provide instructions to build a Docker image (based on a GPU-enabled TensorFlow 2.8 Docker image with Jupyter notebook) that includes all of ramp's libraries and dependencies. All five of the above tasks can be performed from a Docker container based on this image.
For the last three tasks, we recommend using VS Code, Microsoft's open-source code editor. It attaches easily to a running ramp Docker container and can run Jupyter notebooks, including the ones used to train the ramp models.
---
## Project structure
Note that the ramp project currently contains a fork of the [Solaris project](https://github.com/CosmiQ/solaris), which is no longer under active development. Some bug fixes and modifications live in this fork, and some more extensive modifications of Solaris code have been moved into the ramp library.
```
ramp-staging
├── colab
│   ├── README.md
│   ├── jupyter_lab_on_colab.ipynb
│   └── train_ramp_model_on_colab.ipynb
├── data
├── docker
│ └── pipped-requirements.txt
├── Dockerfile
├── Dockerfile.dev
├── docs
│ ├── How_I_set_up_my_training_data.md
│ ├── how_to_debug_ramp_in_vscode.md
│ ├── How_to_run_production_and_evaluation.md
│ ├── list-of-ramp-scripts.md
│   ├── using_the_ramp_training_configuration_file.md
│   └── images
├── experiments
│ ├── dhaka_nw
│ ├── ghana
│ ├── gimmosss
│   └── himmosss
├── notebooks
│ ├── augmentation_demo.ipynb
│ ├── Data_generator_demo.ipynb
│ ├── Duplicate_image_check.ipynb
│ ├── Independent_labelers_comparison_test.ipynb
│ ├── Train_ramp_model.ipynb
│ ├── Truncated_signed_distance_transform_example.ipynb
│   ├── View_predictions.ipynb
│   ├── images
│   └── sample-data
├── ramp
│ ├── __init__.py
│ ├── data_mgmt
│ │ ├── chip_label_pairs.py
│ │ ├── clr_callback.py
│ │ ├── data_generator.py
│ │ ├── display_data.py
│   │   └── __init__.py
│ ├── models
│ │ ├── effunet_1.py
│ │ ├── __init__.py
│   │   └── model_1_chollet_unet.py
│ ├── training
│ │ ├── augmentation_constructors.py
│ │ ├── callback_constructors.py
│ │ ├── __init__.py
│ │ ├── loss_constructors.py
│ │ ├── metric_constructors.py
│ │ ├── model_constructors.py
│   │   └── optimizer_constructors.py
│ └── utils
│ ├── chip_utils.py
│ ├── eval_utils.py
│ ├── file_utils.py
│ ├── geo_utils.py
│ ├── imgproc_utils.py
│ ├── img_utils.py
│ ├── __init__.py
│ ├── label_utils.py
│ ├── log_fields.py
│ ├── lrfinder.py
│ ├── mask_to_vec_utils.py
│ ├── misc_ramp_utils.py
│ ├── model_utils.py
│ ├── multimask_utils.py
│ ├── ramp_exceptions.py
│ └── sdt_mask_utils.py
├── README.md
├── scripts
│ ├── add_area_to_labels.py
│ ├── binary_masks_from_polygons.py
│ ├── calculate_accuracy_iou.py
│ ├── find_learningrate.py
│ ├── get_chip_statistics.py
│ ├── get_dataset_loss_statistics.py
│ ├── get_labels_from_masks.py
│ ├── get_model_predictions.py
│ ├── make_train_val_split_lists.py
│ ├── move_chips_from_csv.py
│ ├── multi_masks_from_polygons.py
│ ├── polygonize_masks.py
│ ├── polygonize_multimasks.py
│ ├── remove_slivers.py
│ ├── sdt_masks_from_polygons.py
│ ├── tile_datasets.py
│ └── train_ramp.py
├── setup.py
├── shell-scripts
│ ├── create_aggregate_trainingset.bash
│ ├── create_masks_for_datasets.bash
│ ├── create_test_split_for_datasets.bash
│ ├── create_trainval_split_for_datasets.bash
│ ├── get_iou_metrics_for_datasets.bash
│ ├── get_iou_metrics_for_models.bash
│ ├── nvidia-check.sh
│ ├── run_production_on_datasets.bash
│ ├── run_production_on_single_dataset.bash
│ ├── write_predicted_masks_for_datasets.bash
│ └── write_truth_labels_for_datasets.bash
└── solaris
```
---
## How to get the ramp environment running on Google Colab
Instructions for getting started with ramp on Colab are in the colab/README.md file in this codebase.
Note that things will run very slowly in the free tier of Google Colab. If you will be running on Colab often, I recommend upgrading to Colab Pro; if you will be using Colab as your compute platform for large ramp training jobs, consider Colab Pro+.
## How to get the ramp environment running on a local server running Ubuntu 20.04 with GPU support
### High-level overview
1. You will need to run Ubuntu 20.04 Linux on a machine with at least one CUDA-enabled NVIDIA GPU. You will absolutely need to have sudo (root user) powers on it.
2. Install the currently recommended NVIDIA driver: [instructions here](https://linuxize.com/post/how-to-nvidia-drivers-on-ubuntu-20-04/). (Happily, you do not need to install the CUDA libraries yourself, as you would if you weren't using Docker.)
3. Install Docker CE and the NVIDIA Container Toolkit ([instructions here](https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/install-guide.html#docker)).
4. Create the 'docker' group and add yourself to it, so you can run docker without using sudo ([instructions here](https://docs.docker.com/engine/install/linux-postinstall/)).
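Step 4 boils down to the standard Docker post-install commands (from the linked Docker documentation; the group change takes effect after you log out and back in, or run `newgrp docker`):
```shell
# Create the docker group (it may already exist) and add the current user to it
sudo groupadd docker
sudo usermod -aG docker "$USER"
```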
5. Using Docker, build the ramp base image, rampbase, as shown below:
```text
# run from the ramp-code directory
docker build --tag rampbase .
```
**Important note**: You must rebuild rampbase after any change you make to the ramp module code (under ramp-code/ramp) so that the change is installed in the container. Changes to scripts or notebooks do not require a rebuild.
6. Start a docker container based on rampbase, and run a bash shell in it, as follows:
```
docker run -it --rm --gpus=all -v /home/carolyn/ramp-staging:/tf/ramp-staging -v /home/carolyn/ramp-data:/tf/ramp-data -p 8888:8888 rampbase bash
```
If you wish to run a script, do so in the bash shell using the default Python interpreter, which has all of the ramp components installed.
Note that a Jupyter notebook server is installed in the ramp container; the *-p 8888:8888* portion of the 'docker run' command enables port forwarding so that you can run Jupyter notebooks in a browser on your host machine.
If you wish to run a Jupyter notebook in your browser or in JupyterLab, start your Docker container using the same command without 'bash' at the end, as shown below. A link to the running Jupyter notebook server will appear in the command output.
```
docker run -it --rm --gpus=all -v /home/carolyn/ramp-staging:/tf/ramp-staging -v /home/carolyn/ramp-data:/tf/ramp-data -p 8888:8888 rampbase
```
If you wish to run scripts as well as the Jupyter notebook, you can connect a bash shell to the container that is already running the notebook server, using the following commands.
First, run:
```
docker ps
```
This will list all the Docker containers running on your machine, similar to the output of the Unix ps command:
```
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
209755699cea rampdev "bash" 3 hours ago Up 3 hours 0.0.0.0:8888->8888/tcp, :::8888->8888/tcp condescending_cerf
```
You can use either the container ID or the container name to connect to it with a bash shell:
```
docker exec -it condescending_cerf bash
```
This will give you a bash shell in the same container that is running your Jupyter notebook.
Instructions for debugging ramp code and running Jupyter notebooks in VS Code on your desktop are given in [How to debug ramp in vscode](docs/how_to_debug_ramp_in_vscode.md).
---
## A note on running ramp as yourself vs. as the root user
Note that by default, Docker runs containers as the root user. If you want to use VS Code to attach to the container, you will need to run the container as root, because VS Code needs root permission to install its server in the container.
This means that any files you create during the Docker session will be owned by root. This is undesirable from a security standpoint, and it is a hassle when you later need to change or delete those files on the local machine. (To fix the ownership afterward, run: *find . -user root | xargs sudo chown your-username*.)
If you are just going to interact with the bash shell (say, to run production code or a script), I recommend running the container as yourself rather than as the root user. To do that, add the *--user 1000:1000* switch as shown below (1000:1000 is typically the UID:GID of the first user created on an Ubuntu system).
```text
# run from anywhere as yourself (as the non-root user)
docker run -it --rm --gpus=all --user 1000:1000 -v /home/carolyn/ramp-staging:/tf/ramp-staging -v /home/carolyn/ramp-data:/tf/ramp-data -p 8888:8888 rampbase
```
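If your account's UID and GID are not 1000:1000, you can substitute them dynamically with `id` (same command as above, with the values filled in at run time):

```shell
# Run the container as the invoking user so files it creates are owned by you
docker run -it --rm --gpus=all --user "$(id -u):$(id -g)" \
    -v /home/carolyn/ramp-staging:/tf/ramp-staging \
    -v /home/carolyn/ramp-data:/tf/ramp-data \
    -p 8888:8888 rampbase
```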
---
LICENSING:
This software is licensed under the [Apache 2.0 software license](https://www.apache.org/licenses/LICENSE-2.0.txt).
Raw data
{
"_id": null,
"home_page": "",
"name": "ramp-fair",
"maintainer": "",
"docs_url": null,
"requires_python": ">=3",
"maintainer_email": "",
"keywords": "",
"author": "Carolyn Johnston",
"author_email": "carolyn.johnston@dev.global",
"download_url": "https://files.pythonhosted.org/packages/4c/19/340b6311cc488750153265eeb7fc48f992942a689843dd53421ec6594b45/ramp-fair-0.1.2.tar.gz",
"platform": null,
"bugtrack_url": null,
"license": "",
"summary": "Replicable AI for Microplanning , Fork maintained for fAIr",
"version": "0.1.2",
"split_keywords": [],
"urls": [
{
"comment_text": "",
"digests": {
"md5": "42368054e551c92d99e9301afddd1776",
"sha256": "e7673dd7f253dc65c26d6927bd3ac3735e73852b32ba8a837a2c34453041d976"
},
"downloads": -1,
"filename": "ramp-fair-0.1.2.tar.gz",
"has_sig": false,
"md5_digest": "42368054e551c92d99e9301afddd1776",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3",
"size": 54983,
"upload_time": "2022-12-14T08:30:52",
"upload_time_iso_8601": "2022-12-14T08:30:52.687546Z",
"url": "https://files.pythonhosted.org/packages/4c/19/340b6311cc488750153265eeb7fc48f992942a689843dd53421ec6594b45/ramp-fair-0.1.2.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2022-12-14 08:30:52",
"github": false,
"gitlab": false,
"bitbucket": false,
"lcname": "ramp-fair"
}