# Segment Anything Model (SAM) in Napari
[![License Apache Software License 2.0](https://img.shields.io/pypi/l/napari-sam.svg?color=green)](https://github.com/MIC-DKFZ/napari-sam/raw/main/LICENSE)
[![PyPI](https://img.shields.io/pypi/v/napari-sam.svg?color=green)](https://pypi.org/project/napari-sam)
[![Python Version](https://img.shields.io/pypi/pyversions/napari-sam.svg?color=green)](https://python.org)
[![tests](https://github.com/MIC-DKFZ/napari-sam/workflows/tests/badge.svg)](https://github.com/MIC-DKFZ/napari-sam/actions)
[![codecov](https://codecov.io/gh/MIC-DKFZ/napari-sam/branch/main/graph/badge.svg)](https://codecov.io/gh/MIC-DKFZ/napari-sam)
[![napari hub](https://img.shields.io/endpoint?url=https://api.napari-hub.org/shields/napari-sam)](https://napari-hub.org/plugins/napari-sam)
Segment anything with our **Napari** integration of Meta AI's new **Segment Anything Model (SAM)**!
SAM is the new segmentation system from Meta AI capable of **one-click segmentation of any object**, and our plugin now integrates it neatly into Napari.
We have already **extended** SAM's click-based foreground separation to full **click-based semantic segmentation and instance segmentation**!
On top of that, our SAM integration supports both **2D and 3D images**!
----------------------------------
Everything mode | Click-based semantic segmentation mode | Click-based instance segmentation mode
:-------------------------:|:-------------------------:|:-------------------------:
![](https://github.com/MIC-DKFZ/napari-sam/raw/main/cats_everything.png) | ![](https://github.com/MIC-DKFZ/napari-sam/raw/main/cats_semantic.png) | ![](https://github.com/MIC-DKFZ/napari-sam/raw/main/cats_instance.png)
----------------------------------
<h2 align="center">SAM in Napari demo</h2>
<div align="center">
https://user-images.githubusercontent.com/3471895/236152620-0de983db-954b-4480-97b9-901ee82f8edd.mp4
</div>
----------------------------------
## Installation
The plugin requires `python>=3.8`, as well as `pytorch>=1.7` and `torchvision>=0.8`. Please follow the [PyTorch installation instructions](https://pytorch.org/get-started/locally/) to install both PyTorch and TorchVision. Installing both with CUDA support is strongly recommended.
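If you are unsure whether your environment already satisfies these requirements, a quick check such as the following (a minimal sketch, run from any Python prompt) reports the installed versions and whether a CUDA device is usable:

    import torch
    import torchvision

    # napari-sam requires pytorch>=1.7 and torchvision>=0.8
    print("PyTorch:", torch.__version__)
    print("TorchVision:", torchvision.__version__)

    # SAM runs considerably faster on a CUDA-capable GPU
    print("CUDA available:", torch.cuda.is_available())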
Install Napari via [pip]:

    pip install "napari[all]"

You can install `napari-sam` via [pip]:

    pip install git+https://github.com/facebookresearch/segment-anything.git
    pip install napari-sam

To install the latest development version:

    pip install git+https://github.com/MIC-DKFZ/napari-sam.git
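To verify that everything installed correctly, the following imports should succeed without errors (a minimal sketch; the import name `napari_sam` is assumed from the package's wheel layout):

    import napari             # the viewer itself
    import segment_anything   # Meta AI's SAM package
    import napari_sam         # this plugin (import name assumed)

    print("napari, segment-anything and napari-sam are importable")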
## Usage
Start Napari from the console with:
    napari
Then navigate to `Plugins -> Segment Anything (napari-sam)` and drag & drop an image into Napari. Finally, create a labels layer that will hold the SAM predictions by clicking the third button in the layer list.

You can then auto-download one of the available SAM models (this can take 1-2 minutes), activate one of the annotation & segmentation modes, and you are ready to go!
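The same setup can also be scripted with napari's Python API. The sketch below is an illustrative alternative to the drag & drop and layer-list clicks; it uses a bundled scikit-image sample picture (any 2D or 3D image works) and adds an empty labels layer for the SAM predictions. The plugin widget itself is still opened via the `Plugins` menu:

    import numpy as np
    import napari
    from skimage import data  # only used here for a sample image

    image = data.camera()  # 2D example image bundled with scikit-image
    viewer = napari.Viewer()
    viewer.add_image(image, name="image")

    # Empty labels layer with the same shape as the image; SAM predictions are written here
    viewer.add_labels(np.zeros(image.shape, dtype=np.uint8), name="SAM labels")

    napari.run()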
## Contributing
Contributions are very welcome. Tests can be run with [tox]; please ensure
that the coverage at least stays the same before you submit a pull request.
## License
Distributed under the terms of the [Apache Software License 2.0] license,
"napari-sam" is free and open source software.
## Issues
If you encounter any problems, please [file an issue] along with a detailed description.
[Apache Software License 2.0]: http://www.apache.org/licenses/LICENSE-2.0
[file an issue]: https://github.com/MIC-DKFZ/napari-sam/issues
[tox]: https://tox.readthedocs.io/en/latest/
[pip]: https://pypi.org/project/pip/
## Acknowledgements
<img src="https://github.com/MIC-DKFZ/napari-sam/raw/main/HI_Logo.png" height="100px" />
<img src="https://github.com/MIC-DKFZ/napari-sam/raw/main/dkfz_logo.png" height="100px" />
napari-sam is developed and maintained by the Applied Computer Vision Lab (ACVL) of [Helmholtz Imaging](http://helmholtz-imaging.de)
and the [Division of Medical Image Computing](https://www.dkfz.de/en/mic/index.php) at the
[German Cancer Research Center (DKFZ)](https://www.dkfz.de/en/index.html).