<!--
Copyright (c) MONAI Consortium
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
# MONAI Label
[![License](https://img.shields.io/badge/license-Apache%202.0-green.svg)](https://opensource.org/licenses/Apache-2.0)
[![CI Build](https://github.com/Project-MONAI/MONAILabel/workflows/build/badge.svg?branch=main)](https://github.com/Project-MONAI/MONAILabel/commits/main)
[![Documentation Status](https://readthedocs.org/projects/monailabel/badge/?version=latest)](https://docs.monai.io/projects/label/en/latest/?badge=latest)
[![PyPI version](https://badge.fury.io/py/monailabel.svg)](https://badge.fury.io/py/monailabel)
[![Azure DevOps tests (compact)](https://img.shields.io/azure-devops/tests/projectmonai/monai-label/10?compact_message)](https://dev.azure.com/projectmonai/monai-label/_test/analytics?definitionId=10&contextType=build)
[![Azure DevOps coverage](https://img.shields.io/azure-devops/coverage/projectmonai/monai-label/10)](https://dev.azure.com/projectmonai/monai-label/_build?definitionId=10)
[![codecov](https://codecov.io/gh/Project-MONAI/MONAILabel/branch/main/graph/badge.svg)](https://codecov.io/gh/Project-MONAI/MONAILabel)
MONAI Label is an intelligent open source image labeling and learning tool that enables users to create annotated datasets and build AI annotation models for clinical evaluation. MONAI Label enables application developers to build labeling apps in a serverless way, where custom labeling apps are exposed as a service through the MONAI Label Server.
MONAI Label is a server-client system that facilitates interactive medical image annotation using AI. It is an
open-source and easy-to-install ecosystem that can run locally on a machine with one or multiple GPUs. The server
and client can run on the same machine or on different machines. It shares the same principles
with [MONAI](https://github.com/Project-MONAI).
Refer to the full [MONAI Label documentation](https://docs.monai.io/projects/label/en/latest/index.html) for more details, or check out our [MONAI Label Deep Dive video series](https://www.youtube.com/playlist?list=PLtoSVSQ2XzyD4lc-lAacFBzOdv5Ou-9IA).
Refer to the [MONAI Label Tutorials](https://github.com/Project-MONAI/tutorials/tree/main/monailabel) series for application and viewer workflows on different medical image tasks. Notebook-style tutorials provide detailed instructions.
### Table of Contents
- [Overview](#overview)
- [Highlights and Features](#highlights-and-features)
- [Supported Matrix](#supported-matrix)
- [Getting Started with MONAI Label](#getting-started-with-monai-label)
- [Step 1. Installation](#step-1-installation)
- [Step 2. MONAI Label Sample Applications](#step-2-monai-label-sample-applications)
- [Step 3. MONAI Label Supported Viewers](#step-3-monai-label-supported-viewers)
- [Step 4. Data Preparation](#step-4-data-preparation)
- [Step 5. Start MONAI Label Server and Start Annotating!](#step-5-start-monai-label-server-and-start-annotating)
- [MONAI Label Tutorials](#monai-label-tutorials)
- [Cite MONAI Label](#cite)
- [Contributing](#contributing)
- [Community](#community)
- [Additional Resources](#additional-resources)
### Overview
MONAI Label reduces the time and effort of annotating new datasets and enables the adaptation of AI to the task at hand by continuously learning from user interactions and data. MONAI Label allows researchers and developers to make continuous improvements to their apps by letting them interact with their apps as the user would. End-users (clinicians, technologists, and annotators in general) benefit from AI that continuously learns and becomes better at understanding what the end-user is trying to annotate.
MONAI Label aims to fill the gap between developers creating new annotation applications and the end users who want to benefit from these innovations.
#### Highlights and Features
- Framework for developing and deploying MONAI Label Apps to train and infer AI models
- Compositional & portable APIs for ease of integration in existing workflows
- Customizable labeling app design for varying user expertise
- Annotation support via [3DSlicer](https://github.com/Project-MONAI/MONAILabel/tree/main/plugins/slicer)
& [OHIF](https://github.com/Project-MONAI/MONAILabel/tree/main/plugins/ohif) for radiology
- Annotation support via [QuPath](https://github.com/Project-MONAI/MONAILabel/tree/main/plugins/qupath), [Digital Slide Archive](https://github.com/Project-MONAI/MONAILabel/tree/main/plugins/dsa), and [CVAT](https://github.com/Project-MONAI/MONAILabel/tree/main/plugins/cvat) for
pathology
- Annotation support via [CVAT](https://github.com/Project-MONAI/MONAILabel/tree/main/plugins/cvat) for Endoscopy
- PACS connectivity via [DICOMWeb](https://www.dicomstandard.org/using/dicomweb)
- Automated Active Learning workflow for endoscopy using [CVAT](https://github.com/Project-MONAI/MONAILabel/tree/main/plugins/cvat)
#### Supported Matrix
MONAI Label supports many state-of-the-art (SOTA) models from the Model Zoo and their integration with viewers through the monaibundle app. Please refer to the [monaibundle](https://github.com/Project-MONAI/MONAILabel/tree/main/sample-apps/monaibundle) app page for supported models, including whole body segmentation, whole brain segmentation, lung nodule detection, tumor segmentation, and many more.
In addition, the table below lists the basic supported fields, models, viewers, and general data types. However, these are only the combinations we have explicitly tested; that doesn't mean your dataset or file type won't work with MONAI Label. Try MONAI Label for your given task, and if you're having issues, reach out through GitHub Issues.
<table>
<tr>
<th>Field</th>
<th>Models</th>
<th>Viewers</th>
<th>Data Types</th>
<th>Image Modalities/Target</th>
</tr>
<tr>
  <td>Radiology</td>
<td>
<ul>
<li>Segmentation</li>
<li>DeepGrow</li>
<li>DeepEdit</li>
<li>SAM2 (2D/3D)</li>
</ul>
</td>
<td>
<ul>
<li>3DSlicer</li>
<li>MITK</li>
<li>OHIF</li>
</ul>
</td>
<td>
<ul>
<li>NIfTI</li>
<li>NRRD</li>
<li>DICOM</li>
</ul>
</td>
<td>
<ul>
<li>CT</li>
<li>MRI</li>
</ul>
</td>
</tr>
<tr>
  <td>Pathology</td>
<td>
<ul>
<li>DeepEdit</li>
<li>NuClick</li>
<li>Segmentation</li>
<li>Classification</li>
<li>SAM2 (2D)</li>
</ul>
</td>
<td>
<ul>
<li>Digital Slide Archive</li>
<li>QuPath</li>
<li>CVAT</li>
</ul>
</td>
<td>
<ul>
<li>TIFF</li>
<li>SVS</li>
</ul>
</td>
<td>
<ul>
<li>Nuclei Segmentation</li>
<li>Nuclei Classification</li>
</ul>
</td>
</tr>
<tr>
  <td>Video</td>
<td>
<ul>
<li>DeepEdit</li>
<li>Tooltracking</li>
<li>InBody/OutBody</li>
<li>SAM2 (2D)</li>
</ul>
</td>
<td>
<ul>
<li>CVAT</li>
</ul>
</td>
<td>
<ul>
<li>JPG</li>
<li>3-channel Video Frames</li>
</ul>
</td>
<td>
<ul>
<li>Endoscopy</li>
</ul>
</td>
</tr>
</table>
# Getting Started with MONAI Label
### MONAI Label requires a few steps to get started:
- Step 1: [Install MONAI Label](#step-1-installation)
- Step 2: [Download a MONAI Label sample app or write your own custom app](#step-2-monai-label-sample-applications)
- Step 3: [Install a compatible viewer and supported MONAI Label Plugin](#step-3-monai-label-supported-viewers)
- Step 4: [Prepare your Data](#step-4-data-preparation)
- Step 5: [Launch MONAI Label Server and start Annotating!](#step-5-start-monai-label-server-and-start-annotating)
## Step 1 Installation
### Current Stable Version
<a href="https://pypi.org/project/monailabel/#history"><img alt="GitHub release (latest SemVer)" src="https://img.shields.io/github/v/release/project-monai/monailabel"></a>
<pre>pip install -U monailabel</pre>
MONAI Label supports the following operating systems with **GPU/CUDA** enabled. For more detailed instructions, please see the installation guides.
- [Ubuntu](https://docs.monai.io/projects/label/en/latest/installation.html)
- [Windows](https://docs.monai.io/projects/label/en/latest/installation.html#windows)
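As a quick sanity check after installing, you can list the CLI commands; this is a minimal sketch, and the exact subcommands shown may vary by version:
```bash
# Show the monailabel CLI and its subcommands (apps, datasets, start_server, ...)
monailabel --help

# Confirm the installed version
pip show monailabel
```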
### GPU Acceleration (Optional Dependencies)
The following optional dependencies can help accelerate some GPU-based transforms from MONAI. They are enabled by default if you are using the `projectmonai/monailabel` Docker image.
- [CUCIM](https://pypi.org/project/cucim/)
- [CUPY](https://docs.cupy.dev/en/stable/install.html#installing-cupy)
- [CUDA Toolkit](https://developer.nvidia.com/cuda-downloads)
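If you are not using the Docker image, these dependencies can be installed into the same Python environment. The sketch below is illustrative only; the exact wheel names (e.g., `cupy-cuda12x`) depend on your CUDA toolkit version:
```bash
# Optional GPU-accelerated dependencies; choose the wheels that match your CUDA version
pip install cucim
pip install cupy-cuda12x   # assumed CUDA 12.x wheel; see the CuPy install guide for other versions
```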
### Development version
To install the _**latest features**_, use one of the following options:
<details>
<summary><strong>Git Checkout (developer mode)</strong></summary>
<a href="https://github.com/Project-MONAI/MONAILabel"><img alt="GitHub tag (latest SemVer)" src="https://img.shields.io/github/v/tag/Project-MONAI/monailabel"></a>
<br>
<pre>
git clone https://github.com/Project-MONAI/MONAILabel
pip install -r MONAILabel/requirements.txt
export PATH=$PATH:`pwd`/MONAILabel/monailabel/scripts</pre>
  <p>If you are using DICOM-Web + OHIF, you have to build the OHIF package separately. Please refer to the <a href="https://github.com/Project-MONAI/MONAILabel/tree/main/plugins/ohif#development-setup">OHIF development setup</a>.</p>
</details>
<details>
<summary><strong>Docker</strong></summary>
<img alt="Docker Image Version (latest semver)" src="https://img.shields.io/docker/v/projectmonai/monailabel">
<br>
<pre>docker run --gpus all --rm -ti --ipc=host --net=host projectmonai/monailabel:latest bash</pre>
</details>
### SAM-2
> By default, the [**SAM2**](https://github.com/facebookresearch/sam2/) model is included for all Apps when **_python >= 3.10_**
> - **sam_2d**: for any organ or tissue and others over a given slice/2D image.
> - **sam_3d**: to support SAM2 propagation over multiple slices (Radiology/MONAI-Bundle).
If you install MONAI Label via `pip install monailabel`, it uses the [SAM-2](https://huggingface.co/facebook/sam2-hiera-large) models by default.
<br/>
To use [SAM-2.1](https://huggingface.co/facebook/sam2.1-hiera-large) instead, use one of the following options:
 - Use the monailabel [Docker](https://hub.docker.com/r/projectmonai/monailabel) image instead of the pip package
 - Run monailabel in dev mode (git checkout)
 - If you installed monailabel via pip, uninstall the **_sam2_** package (`pip uninstall sam2`) and then run `pip install -r requirements.txt`, or install the latest **SAM-2** from its [GitHub repository](https://github.com/facebookresearch/sam2/tree/main?tab=readme-ov-file#installation), as sketched below.
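For the last option, a minimal sketch of switching a pip installation over to the latest SAM-2 (this assumes you run the commands inside a MONAILabel checkout; the upstream install URL comes from the SAM-2 repository):
```bash
# Remove the SAM-2 package pulled in by the pip-installed monailabel
pip uninstall -y sam2

# Reinstall dependencies from this repository's requirements ...
pip install -r requirements.txt

# ... or install the latest SAM-2 directly from upstream
pip install "git+https://github.com/facebookresearch/sam2.git"
```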
## Step 2 MONAI Label Sample Applications
<h3>Radiology</h3>
<p>This app provides example models for both interactive and automated segmentation of radiology (3D) images, including automatic segmentation of multiple abdominal organs with recent deep learning models (e.g., UNet, UNETR). Interactive tools include DeepEdit and DeepGrow for actively improving trained models and for deployment.</p>
<ul>
  <li>DeepEdit</li>
  <li>DeepGrow</li>
<li>Segmentation</li>
<li>Spleen Segmentation</li>
<li>Multi-Stage Vertebra Segmentation</li>
</ul>
<h3>Pathology</h3>
<p>This app provides example models for both interactive and automated segmentation of pathology (WSI) images, including nuclei multi-label segmentation for Neoplastic, Inflammatory, Connective/Soft tissue, Dead, and Epithelial cells. The app provides interactive tools, including DeepEdit and NuClick, for interactive nuclei segmentation.</p>
<ul>
  <li>DeepEdit</li>
  <li>NuClick</li>
  <li>Segmentation</li>
  <li>Classification</li>
</ul>
<h3>Video</h3>
<p>The Endoscopy app enables users to run interactive and automated segmentation and classification models over 2D images for the endoscopy use case. Combined with CVAT, it demonstrates a fully automated Active Learning workflow to train and fine-tune a model.</p>
<ul>
  <li>DeepEdit</li>
<li>ToolTracking</li>
<li>InBody/OutBody</li>
</ul>
<h3>Bundles</h3>
<p>The Bundle app lets users bring customized models for inference, training, or pre- and post-processing of any target anatomy. Its MONAI Label integration links bundles archived in the Model Zoo for customized labeling (e.g., a third-party transformer model for labeling the renal cortex, medulla, and pelvicalyceal system), along with interactive tools such as DeepEdit.</p>
For a full list of supported bundles, see the <a href="https://github.com/Project-MONAI/MONAILabel/tree/main/sample-apps/monaibundle">MONAI Label Bundles README</a>.
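As an illustration of the pattern, a bundle can be selected by name through `--conf models` when starting the server with the monaibundle app. This is a sketch only; the bundle name `wholeBody_ct_segmentation` and the studies path are example values, and the full list of valid names is in the Bundles README linked above:
```bash
# Download the monaibundle sample app and start the server with a Model Zoo bundle (example name)
monailabel apps --download --name monaibundle --output apps
monailabel start_server --app apps/monaibundle --studies datasets/Task09_Spleen/imagesTr --conf models wholeBody_ct_segmentation
```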
## Step 3 MONAI Label Supported Viewers
### Radiology
#### 3D Slicer
3D Slicer is a free and open-source platform for analyzing, visualizing, and understanding medical image data. In MONAI Label, 3D Slicer is the most tested viewer for radiology studies and for algorithm development and integration.
[3D Slicer Setup](https://github.com/Project-MONAI/MONAILabel/tree/main/plugins/slicer)
#### MITK
The Medical Imaging Interaction Toolkit (MITK) is an open-source, standalone medical imaging platform. MONAI Label is partially integrated into the MITK Workbench, a powerful and free application to view, process, and segment medical images. The MONAI Label tool in MITK is mostly tested for inference with the radiology and bundle apps, supporting both automatic and click-based interactive models.
[MITK Setup](https://github.com/Project-MONAI/MONAILabel/tree/main/plugins/mitk)
#### OHIF
The Open Health Imaging Foundation (OHIF) Viewer is an open source, web-based, medical imaging platform. It aims to provide a core framework for building complex imaging applications.
[OHIF Setup](https://github.com/Project-MONAI/MONAILabel/tree/main/plugins/ohif)
### Pathology
#### QuPath
Quantitative Pathology & Bioimage Analysis (QuPath) is an open, powerful, flexible, extensible software platform for bioimage analysis.
[QuPath Setup](https://github.com/Project-MONAI/MONAILabel/tree/main/plugins/qupath)
#### Digital Slide Archive
The Digital Slide Archive (DSA) is a platform that provides the ability to store, manage, visualize and annotate large imaging data sets.
[Digital Slide Archive Setup](https://github.com/Project-MONAI/MONAILabel/tree/main/plugins/dsa)
### Video
#### CVAT
CVAT is an interactive video and image annotation tool for computer vision.
[CVAT Setup](https://github.com/Project-MONAI/MONAILabel/tree/main/plugins/cvat)
## Step 4 Data Preparation
For data preparation, you have two options: you can use a local datastore or any image archive tool that supports DICOMWeb.
#### Local Datastore for the Radiology App on single modality images
For a datastore in a local file archive, MONAI Label uses a set folder structure. Place your image data in a folder, and if you have any segmentation files, place them in a subfolder called `labels/final`. You can see an example below:
```
dataset
│-- spleen_10.nii.gz
│-- spleen_11.nii.gz
│ ...
└───labels
└─── final
│-- spleen_10.nii.gz
│-- spleen_11.nii.gz
│ ...
```
If you don't have labels, just place the images/volumes in the dataset folder.
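To use such a folder as the datastore, point the `--studies` argument at it when starting the server. The sketch below mirrors the Step 5 example, only with the local `dataset` folder shown above:
```bash
# Serve images (and any labels under labels/final) from the local folder
monailabel start_server --app apps/radiology --studies dataset --conf models segmentation
```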
#### DICOMWeb Support
If the viewer you're using supports the DICOMweb standard, you can use it instead of a local datastore to serve images to MONAI Label. When starting the MONAI Label server, specify the URL of the DICOMweb service in the `--studies` argument (and, optionally, the username and password for DICOM servers that require them). You can see an example of starting the MONAI Label server with a DICOMweb URL below:
```
monailabel start_server --app apps/radiology --studies http://127.0.0.1:8042/dicom-web --conf models segmentation
```
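For DICOMweb endpoints that require authentication, credentials can be passed as additional server options. The option names below (`--dicomweb_username`, `--dicomweb_password`) and the Orthanc credentials are assumptions for illustration; check `monailabel start_server --help` for the options available in your version:
```bash
# Hypothetical example for a DICOMweb server with authentication enabled
monailabel start_server --app apps/radiology \
  --studies http://127.0.0.1:8042/dicom-web \
  --dicomweb_username orthanc \
  --dicomweb_password orthanc \
  --conf models segmentation
```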
## Step 5 Start MONAI Label Server and Start Annotating
You're now ready to start using MONAI Label. Once you've configured your viewer, app, and datastore, you can launch the MONAI Label server with the relevant parameters. For simplicity, the example below downloads the Radiology sample app and a dataset, then starts the MONAI Label server:
```
monailabel apps --download --name radiology --output apps
monailabel datasets --download --name Task09_Spleen --output datasets
monailabel start_server --app apps/radiology --studies datasets/Task09_Spleen/imagesTr --conf models segmentation
```
**Note:** If you want to work on different labels than the ones proposed by default, change the configs file following the instructions here: https://youtu.be/KtPE8m0LvcQ?t=622
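Once the server is running, you can confirm it is reachable before connecting a viewer. This sketch assumes the default port of 8000; adjust the URL if you started the server on a different port:
```bash
# Query the running MONAI Label server for its app info (loaded models, labels, config)
curl http://127.0.0.1:8000/info/
```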
## MONAI Label Tutorials
**Content**
- **Radiology App**:
- Viewer: [3D Slicer](https://www.slicer.org/) | Datastore: Local | Task: Segmentation
- [MONAILabel: HelloWorld](https://github.com/Project-MONAI/tutorials/blob/main/monailabel/monailabel_HelloWorld_radiology_3dslicer.ipynb): Spleen segmentation with 3D Slicer setups.
- Viewer: [OHIF](https://ohif.org/) | Datastore: Local | Task: Segmentation
- [MONAILabel: Web-based OHIF Viewer](https://github.com/Project-MONAI/tutorials/blob/main/monailabel/monailabel_radiology_spleen_segmentation_OHIF.ipynb): Spleen segmentation with OHIF setups.
- **MONAIBUNDLE App**:
- Viewer: [3D Slicer](https://www.slicer.org/) | Datastore: Local | Task: Segmentation
- [MONAILabel: Pancreas Tumor Segmentation with 3D Slicer](https://github.com/Project-MONAI/tutorials/blob/main/monailabel/monailabel_bring_your_own_data.ipynb): Pancreas and tumor segmentation with CT scans in 3D Slicer.
- [MONAILabel: Multi-organ Segmentation with 3D Slicer](https://github.com/Project-MONAI/tutorials/blob/main/monailabel/monailabel_monaibundle_3dslicer_multiorgan_seg.ipynb): Multi-organ segmentation with CT scans in 3D Slicer.
- [MONAILabel: Whole Body CT Segmentation with 3D Slicer](https://github.com/Project-MONAI/tutorials/blob/main/monailabel/monailabel_wholebody_totalSegmentator_3dslicer.ipynb): Whole body (104 structures) segmentation with CT scans.
- [MONAILabel: Lung nodule CT Detection with 3D Slicer](https://github.com/Project-MONAI/tutorials/blob/main/monailabel/monailabel_monaibundle_3dslicer_lung_nodule_detection.ipynb): Lung nodule detection task with CT scans.
- **Pathology App**:
- Viewer: [QuPath](https://qupath.github.io/) | Datastore: Local | Task: Segmentation
- [MONAILabel: Nuclei Segmentation with QuPath](https://github.com/Project-MONAI/tutorials/blob/main/monailabel/monailabel_pathology_nuclei_segmentation_QuPath.ipynb) Nuclei segmentation with QuPath setup and Nuclick models.
- **Endoscopy App**:
- Viewer: [CVAT](https://github.com/opencv/cvat) | Datastore: Local | Task: Segmentation
- [MONAILabel: Tooltracking with CVAT](https://github.com/Project-MONAI/tutorials/blob/main/monailabel/monailabel_endoscopy_cvat_tooltracking.ipynb): Surgical tool segmentation with CVAT/Nuclio setup.
## Cite
If you are using MONAI Label in your research, please use the following citation:
```bibtex
@article{DiazPinto2022monailabel,
author = {Diaz-Pinto, Andres and Alle, Sachidanand and Ihsani, Alvin and Asad, Muhammad and
Nath, Vishwesh and P{\'e}rez-Garc{\'\i}a, Fernando and Mehta, Pritesh and
Li, Wenqi and Roth, Holger R. and Vercauteren, Tom and Xu, Daguang and
Dogra, Prerna and Ourselin, Sebastien and Feng, Andrew and Cardoso, M. Jorge},
title = {{MONAI Label: A framework for AI-assisted Interactive Labeling of 3D Medical Images}},
journal = {arXiv e-prints},
year = 2022,
url = {https://arxiv.org/pdf/2203.12362.pdf}
}
@inproceedings{DiazPinto2022DeepEdit,
title={{DeepEdit: Deep Editable Learning for Interactive Segmentation of 3D Medical Images}},
author={Diaz-Pinto, Andres and Mehta, Pritesh and Alle, Sachidanand and Asad, Muhammad and Brown, Richard and Nath, Vishwesh and Ihsani, Alvin and Antonelli, Michela and Palkovics, Daniel and Pinter, Csaba and others},
booktitle={MICCAI Workshop on Data Augmentation, Labelling, and Imperfections},
pages={11--21},
year={2022},
organization={Springer}
}
```
Optional citation: if you are using the active learning functionality of MONAI Label, please also cite:
```bibtex
@article{nath2020diminishing,
title={Diminishing uncertainty within the training pool: Active learning for medical image segmentation},
author={Nath, Vishwesh and Yang, Dong and Landman, Bennett A and Xu, Daguang and Roth, Holger R},
journal={IEEE Transactions on Medical Imaging},
volume={40},
number={10},
pages={2534--2547},
year={2020},
publisher={IEEE}
}
```
## Contributing
For guidance on making a contribution to MONAI Label, see
the [contributing guidelines](https://github.com/Project-MONAI/MONAILabel/blob/main/CONTRIBUTING.md).
## Community
Join the conversation on Twitter [@ProjectMONAI](https://twitter.com/ProjectMONAI) or join
our [Slack channel](https://projectmonai.slack.com/archives/C031QRE0M1C).
Ask and answer questions over
on [MONAI Label's GitHub Discussions tab](https://github.com/Project-MONAI/MONAILabel/discussions).
## Additional Resources
- Website: https://monai.io/
- API documentation: https://docs.monai.io/projects/label
- Code: https://github.com/Project-MONAI/MONAILabel
- Project tracker: https://github.com/Project-MONAI/MONAILabel/projects
- Issue tracker: https://github.com/Project-MONAI/MONAILabel/issues
- Wiki: https://github.com/Project-MONAI/MONAILabel/wiki
- Test status: https://github.com/Project-MONAI/MONAILabel/actions
- PyPI package: https://pypi.org/project/monailabel/
- Docker Hub: https://hub.docker.com/r/projectmonai/monailabel
- Client API: https://www.youtube.com/watch?v=mPMYJyzSmyo
- Demo Videos: https://www.youtube.com/c/ProjectMONAI