isat-sam


Name: isat-sam
Version: 1.0.0
Home page: https://github.com/yatengLG/ISAT_with_segment_anything
Summary: Interactive semi-automatic annotation tool for image segmentation based on SAM (Segment Anything Model).
Upload time: 2024-05-20 08:51:47
Maintainer: None
Docs URL: None
Author: yatengLG
Requires Python: >=3.8
License: Apache-2.0
Keywords: annotation tool, segment anything, image annotation, semantic segmentation, instance segmentation

            <h1 align='center'>ISAT_with_segment_anything</h1>
<h2 align='center'>An Interactive Semi-Automatic Annotation Tool Based on Segment Anything</h2>
<p align='center'>
    <a href='https://github.com/yatengLG/ISAT_with_segment_anything' target="_blank"><img alt="GitHub stars" src="https://img.shields.io/github/stars/yatengLG/ISAT_with_segment_anything"></a>
    <a href='https://github.com/yatengLG/ISAT_with_segment_anything' target="_blank"><img alt="GitHub forks" src="https://img.shields.io/github/forks/yatengLG/ISAT_with_segment_anything"></a>
    <a href='https://pypi.org/project/isat-sam/' target="_blank"><img alt="PyPI - Version" src="https://img.shields.io/pypi/v/isat-sam"></a>
    <a href='https://pypi.org/project/isat-sam/' target="_blank"><img alt="Pepy Total Downloads" src="https://img.shields.io/pepy/dt/isat-sam"></a>
</p>
<p align='center'>
    <a href='README-cn.md'><b>[中文]</b></a>
    <a href='README.md'><b>[English]</b></a>
</p>
<p align='center'><img src="./display/标注.gif" alt="标注.gif"></p>

Our tool enables interactive use of [segment anything](https://github.com/facebookresearch/segment-anything) for rapid image segmentation with low memory requirements (an optional bf16 mode further reduces GPU memory usage).

Demo Video: [YouTube](https://www.youtube.com/watch?v=yLdZCPmX-Bc)

---

# Features
## Annotation modes
- **Semi-automatic Annotation**: uses SAM with point and bbox prompts.
- **Manual Annotation**: click or drag to draw polygons (dragging adds a point every 0.15 seconds).

## Annotation adjustments
- **Polygon Adjustments**: delete points and adjust object occlusions.
- **Polygon Visualization**: Preview groups and semantic/instance segmentation masks.

## Export annotations
- Supported formats: **MSCOCO**, **YOLO**, **LabelMe**, and **VOC** (PNG masks; XML is also supported for object detection)

For more features, see the [Features Description](./docs/features%20description.md).

---

# Installation
There are three ways to install ISAT-SAM:
1. from source code (recommended)
2. pip install
3. from .exe

## Option 1: from source code
### (1) Create environment
```shell
conda create -n isat_env python=3.8
conda activate isat_env
```

### (2) Install ISAT_with_segment_anything and its dependencies
**To use the GPU on Windows, please install [Pytorch-GPU](https://pytorch.org/) first.**
```shell
git clone https://github.com/yatengLG/ISAT_with_segment_anything.git
cd ISAT_with_segment_anything
pip install -r requirements.txt
```

### (3) Download a Segment Anything pretrained checkpoint

Download a checkpoint and save it under: ISAT_with_segment_anything/ISAT/checkpoints

**Since version 0.0.3, you can manage checkpoints from the GUI: click [menubar]-[SAM]-[Model manage].**

Supported models currently include [SAM](https://github.com/facebookresearch/segment-anything), [sam-hq](https://github.com/SysCV/sam-hq), [MobileSAM](https://github.com/ChaoningZhang/MobileSAM), [EdgeSAM](https://github.com/chongzhou96/EdgeSAM), and others.

| Model | Pretrained checkpoint | GPU memory | File size |
|----|----|----|----|
|    SAM     | [sam_vit_h_4b8939.pth](https://dl.fbaipublicfiles.com/segment_anything/sam_vit_h_4b8939.pth)           | 7305M | 2.6G |
|            | [sam_vit_l_0b3195.pth](https://dl.fbaipublicfiles.com/segment_anything/sam_vit_l_0b3195.pth)           | 5855M | 2.6G |
|            | [sam_vit_b_01ec64.pth](https://dl.fbaipublicfiles.com/segment_anything/sam_vit_b_01ec64.pth)           | 4149M | 375M |
|   sam-hq   | [sam_hq_vit_h.pth](https://huggingface.co/lkeab/hq-sam/blob/main/sam_hq_vit_h.pth)                     | 7393M | 2.6G |
|            | [sam_hq_vit_l.pth](https://huggingface.co/lkeab/hq-sam/blob/main/sam_hq_vit_l.pth)                     | 5939M | 1.3G |
|            | [sam_hq_vit_b.pth](https://huggingface.co/lkeab/hq-sam/blob/main/sam_hq_vit_b.pth)                     | 4207M | 379M |
|            | [sam_hq_vit_tiny.pth](https://huggingface.co/lkeab/hq-sam/blob/main/sam_hq_vit_tiny.pth)               | 1463M |  43M |
| mobile-sam | [mobile_sam.pt](https://github.com/ChaoningZhang/MobileSAM/blob/master/weights/mobile_sam.pt)          | 1375M |  40M |
|  edge-sam  | [edge_sam.pth](https://huggingface.co/spaces/chongzhou/EdgeSAM/resolve/main/weights/edge_sam.pth)      |  960M |  39M |
|            | [edge_sam_3x.pth](https://huggingface.co/spaces/chongzhou/EdgeSAM/resolve/main/weights/edge_sam_3x.pth)|  960M |  39M |
|   sam-med  | [sam-med2d_b.pth](https://drive.google.com/file/d/1ARiB5RkSsWmAB_8mqWnwDF8ZKTtFwsjl/view?usp=drive_link) | 1500M | 2.4G |
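
If you prefer to script the download, here is a minimal Python sketch (assuming you run it from the repository root; the URL is the ViT-B entry from the table above):

```python
import urllib.request
from pathlib import Path

# SAM ViT-B checkpoint URL taken from the table above.
URL = "https://dl.fbaipublicfiles.com/segment_anything/sam_vit_b_01ec64.pth"

# ISAT looks for checkpoints under ISAT/checkpoints inside the repository.
ckpt_dir = Path("ISAT/checkpoints")
ckpt_dir.mkdir(parents=True, exist_ok=True)

target = ckpt_dir / URL.rsplit("/", 1)[-1]
if not target.exists():
    print(f"Downloading {URL} -> {target}")
    urllib.request.urlretrieve(URL, str(target))
```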

### (4) Run
```shell
python main.py
```
<br>

## Option 2: pip install
**Note: the pip release may be older than the source code version.**
### (1) Create environment
```shell
conda create -n isat_env python=3.8
conda activate isat_env
```
### (2) pip install isat-sam
**To use the GPU on Windows, please install [Pytorch-GPU](https://pytorch.org/) first.**
```shell
pip install isat-sam
```
### (3) Run
```shell
isat-sam
```

<br>

## Option 3: install with .exe
### (1) Download the .exe
**The .exe version may be older than the source code.**

Download the three .zip files (2.7 GB in total).

|        | Download link                                                      |
|--------|-----------------------------------------------------------|
| Baidu Netdisk | Link: https://pan.baidu.com/s/1vD19PzvIT1QAJrAkSVFfhg Code: ISAT |

Click main.exe to run the tool.

### (2) Download a Segment Anything pretrained checkpoint

The downloaded .zip files contain sam_hq_vit_tiny.pth, but this model has been reported not to work on CPU.
You can download [mobile_sam.pt](https://github.com/ChaoningZhang/MobileSAM/blob/master/weights/mobile_sam.pt) to test the tool.

If you want to use other models, see [Download Segment Anything pretrained checkpoint](https://github.com/yatengLG/ISAT_with_segment_anything/blob/master/docs/README-en.md#3-download-segment-anything-pretrained-checkpoint).


---

# Usage
## 1. Annotation
```text
1. Choose a category in the left panel of the software.
    Categories can be edited via Toolbar-File-Setting.

2. Start annotating
    2.1 Semi-automatic annotation with SAM.
        Click the [Segment anything] button to start annotating (shortcut Q).
        Left-click areas of interest and right-click areas to exclude; SAM will compute the mask.
    2.2 Draw polygons manually.
        Click the [Draw polygon] button to start annotating (shortcut C).
        Left-click to add a point to the polygon.
        Hold the left button and drag to add points automatically (one point every 0.15 seconds).
    2.3 Undo
        Click the [Backspace] button to return to the previous state (shortcut Z).
3. Finish the annotation with [Annotate finished] or shortcut E.
4. Save the annotation with [Save] or shortcut S.
```
## 2. Polygon Modification
```text
1. Modify polygon vertices
    Drag polygon vertices to modify the shape of the polygon.
    Drag the polygon to adjust its position.
2. Modify polygon category
    Choose the polygon and click [Edit] (or double-click the polygon), then choose the new category in the editing window.
3. Occlusion modification
    Choose the polygon and click [To top] (shortcut T) or [To bottom] (shortcut B).
4. Delete polygon
    Choose the polygon and click [Delete] to delete it.
```
## 3. Visualization
```text
1. Preview annotations
    Click [Bit map] to preview semantic and instance annotation masks (shortcut space).
    The switching order is polygons - semantic - instance.
2. Image window
    Click [Zoom in], [Zoom out], or [Fit window] (shortcut F) to adjust the zoom level.
3. Show / hide polygons
    Click [Visible] to show / hide polygons (shortcut V).
4. Mask alpha (only effective when using SAM)
    Drag the [mask alpha] bar to adjust the mask transparency.
```
## 4. Convert annotations
ISAT stores annotations in its own .json format. You can export them to other formats.
```text
1. ISAT to VOC
    Convert ISAT jsons to PNG images.
2. ISAT to COCO
    Convert ISAT jsons to COCO json.
3. ISAT to LABELME
    Convert ISAT jsons to LABELME jsons.
4. COCO to ISAT
    Convert COCO json to ISAT jsons.
```
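
If you need to post-process annotations outside the GUI, the ISAT .json files can be read with the standard library. The sketch below is illustrative only: the field names (`objects`, `category`) are assumptions about the schema, so check a file written by your ISAT version before relying on them.

```python
import json
from collections import Counter
from pathlib import Path

def count_categories(isat_json: Path) -> Counter:
    """Count annotated polygons per category in one ISAT .json file."""
    data = json.loads(isat_json.read_text(encoding="utf-8"))
    # "objects" / "category" are assumed field names; adjust to your files.
    return Counter(obj.get("category", "unknown") for obj in data.get("objects", []))

if __name__ == "__main__":
    for path in sorted(Path("annotations").glob("*.json")):  # hypothetical folder
        print(path.name, dict(count_categories(path)))
```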

---

# Star History

**Please support us with a star—it's like a virtual coffee!**
[![Star History Chart](https://api.star-history.com/svg?repos=yatengLG/ISAT_with_segment_anything&type=Date)](https://star-history.com/#yatengLG/ISAT_with_segment_anything&Date)


# Contributors

<table border="0">
<tr>
    <td><a href="https://github.com/yatengLG"><img alt="yatengLG" src="https://avatars.githubusercontent.com/u/31759824?v=4" width="60" height="60"></a></td>
    <td><a href="https://github.com/Alias-z"><img alt="Alias-z" src="https://avatars.githubusercontent.com/u/66273343?v=4" width="60" height="60"></a></td>
    <td>...</td>
</tr>
<tr>
    <td><a href="https://github.com/yatengLG">yatengLG</a></td>
    <td><a href="https://github.com/Alias-z">Alias-z</a></td>
    <td><a href="https://github.com/yatengLG/ISAT_with_segment_anything/graphs/contributors">...</a></td>
</tr>
</table>


# Citation
```text
@misc{isat_with_segment_anything,
  title={{ISAT with segment anything}: An Interactive Semi-Automatic Annotation Tool Based on Segment Anything},
  url={https://github.com/yatengLG/ISAT_with_segment_anything},
  note={Open source software available from https://github.com/yatengLG/ISAT_with_segment_anything},
  author={yatengLG and Alias-z and horffmanwang},
  year={2023},
}
```

# Changelog

## [0.0.1]

* Released the test version, version number 0.0.1

##
* Updated the download feature (the download links still need optimization)
* Added multi-selection of polygons, so they can now be deleted in batches

    Hold Ctrl and click polygons; hold Ctrl and click entries in the annotation list on the right; hold Shift and click entries in the annotation list; or click the annotation list and press Ctrl+A to select all.

* Added conversion scripts under the utils directory

    Now supports ISAT <-> COCO, ISAT <-> YOLO, ISAT -> XML (object detection)

* Added support for [segment-anything-fast](https://github.com/pytorch-labs/segment-anything-fast)

    **Currently supports the SAM family of models; sam-hq, mobile-sam, etc. will be added in later updates**

    sam-fast requires torch >= 2.1.1; if that requirement is not met, plain sam is imported by default (see the sketch after the table below)

    For sam_vit_h_4b8939.pth, encoding time is roughly 4x shorter and GPU memory usage drops to 5.6G
    
| | sam | sam-fast |
|----|----|----|
| 0 | 0.698307991027832 | 0.19336390495300293 | 
| 1 | 0.7048919200897217 | 0.21175742149353027 | 
| 2 | 0.766636848449707 | 0.2573261260986328 | 
| 3 | 0.8198366165161133 | 0.22284531593322754 | 
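
The fallback described above could be expressed roughly as below; this is a sketch rather than ISAT's actual import logic, and the `sam_model_fast_registry` name should be checked against the segment-anything-fast README.

```python
# Prefer segment-anything-fast when torch >= 2.1.1, otherwise fall back to
# the original segment-anything, as described in the changelog entry above.
from packaging import version
import torch

fast_ok = version.parse(torch.__version__.split("+")[0]) >= version.parse("2.1.1")

if fast_ok:
    try:
        # Registry name assumed from the segment-anything-fast project.
        from segment_anything_fast import sam_model_fast_registry as sam_model_registry
    except ImportError:
        from segment_anything import sam_model_registry
else:
    from segment_anything import sam_model_registry

sam = sam_model_registry["vit_h"](checkpoint="ISAT/checkpoints/sam_vit_h_4b8939.pth")
```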

* Added saturation adjustment

    Drag the saturation slider in the toolbar to adjust the image's saturation. (Display only; does not affect SAM.)

* Added track mode

    In addition to the "auto" and "manual" group modes, a "track" mode has been added. In this mode, when switching targets with [TAB] or [`], the group id is set to the group id of the current polygon.
    
## [0.0.2]

* Added a model management dialog

    Models can now be managed and used conveniently.

    **The checkpoint links for sam-hq and mobile-sam require bypassing network restrictions (e.g. a VPN) to access from mainland China, so downloads of these two model families often fail.**
    **If you know a good large-file hosting service, please contact me.**

* Integrated the data conversion dialogs and added new features

    - COCO <-> ISAT
    - YOLO <-> ISAT
    - LABELME <-> ISAT
    - ISAT -> VOC (single-channel PNG)
    - ISAT -> VOC for object detection (XML)

* Added Linux support for [**segment-anything-fast**](https://github.com/pytorch-labs/segment-anything-fast)

    This reduces GPU memory usage and speeds up segmentation while preserving SAM's segmentation quality. (Currently effective only for the SAM family of models.)

    Because Windows requires torch 2.2.0+dev and a number of additional dependencies, this feature is temporarily disabled on Windows.

* Fixed legacy bugs

    - Fixed the issue where the first row was always 0 after converting to VOC
    - In contour-saving mode, "keep only the largest contour" now strictly keeps the contour with the largest area (previously it was roughly estimated by vertex count)

##
* Added support for [EdgeSAM](https://github.com/chongzhou96/EdgeSAM).
* Fixed category IDs starting from 0 after converting to COCO. (The first category now has id 1.)
* Fixed a crash when switching image folders during SAM annotation.
* Added China-hosted download links for the models.

* Reduced model GPU memory usage

    With bfloat16 models, GPU memory requirements drop, feature computation time increases slightly, and the final segmentation quality shows no significant change (see the table and the sketch below).

| checkpoint | mem(float) | mem(bfloat16) | cost(float)| cost(bfloat16) |
|----:|----:|----:|----:|----:|
| edge_sam.pth          | 360M | 304M | 0.0212 | 0.0239 |
| edge_sam_3x.pth       | 360M | 304M | 0.0212 | 0.0239 |
| mobile_sam.pt         | 534M | 390M | 0.0200 | 0.0206 |
| sam_hq_vit_tiny.pth   | 598M | 392M | 0.0196 | 0.0210 |
| sam_hq_vit_b.pth      | 3304M | 1762M | 0.1496 | 0.1676 |
| sam_hq_vit_l.pth      | 5016M | 2634M | 0.3766 | 0.4854 |
| sam_hq_vit_h.pth      | 6464M | 3378M | 0.6764 | 0.9282 |
| sam_vit_b_01ec64.pth  | 3302M | 1760M | 0.1539 | 0.1696 |
| sam_vit_l_0b3195.pth  | 5016M | 2634M | 0.3776 | 0.4833 |
| sam_vit_h_4b8939.pth  | 6462M | 3378M | 0.6863 | 0.9288 |
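
For reference, the kind of bfloat16 casting measured above can be reproduced with the upstream segment-anything package; this is a standalone sketch, not ISAT's internal code.

```python
# Load a SAM checkpoint and run its image encoder in bfloat16 to cut GPU
# memory usage, as in the table above. Uses the upstream segment-anything API.
import torch
from segment_anything import sam_model_registry

device = "cuda" if torch.cuda.is_available() else "cpu"
sam = sam_model_registry["vit_b"](checkpoint="ISAT/checkpoints/sam_vit_b_01ec64.pth")
sam = sam.to(device=device, dtype=torch.bfloat16)

# Dummy input: SAM's image encoder takes a 1024x1024 preprocessed image.
image = torch.randn(1, 3, 1024, 1024, device=device, dtype=torch.bfloat16)
with torch.no_grad():
    embedding = sam.image_encoder(image)
print(embedding.shape, embedding.dtype)  # (1, 256, 64, 64), torch.bfloat16
```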

## 0.0.3

* Updated the project structure
* Added the sam_med2d model for better segmentation of medical data

## 0.0.4

* Added automatic segmentation based on bounding boxes

Using already-annotated VOC-format object detection data, the bounding boxes are fed to SAM as box prompts to segment the images automatically, and the results are saved as ISAT-format json (illustrated by the sketch below).
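
To illustrate the idea (not ISAT's exact conversion code), a single VOC-style box can be passed to SAM as a box prompt using the upstream predictor; the image path and box coordinates below are placeholders.

```python
# Use one VOC-style bounding box (xmin, ymin, xmax, ymax) as a SAM box prompt.
# API calls come from the upstream segment-anything package; writing the mask
# back out as an ISAT-format json is left out of this sketch.
import cv2
import numpy as np
from segment_anything import sam_model_registry, SamPredictor

sam = sam_model_registry["vit_b"](checkpoint="ISAT/checkpoints/sam_vit_b_01ec64.pth")
predictor = SamPredictor(sam)

image = cv2.cvtColor(cv2.imread("example.jpg"), cv2.COLOR_BGR2RGB)  # placeholder image
predictor.set_image(image)

box = np.array([50, 60, 320, 400])  # values from a VOC <bndbox> element
masks, scores, _ = predictor.predict(box=box, multimask_output=False)
print(masks.shape, scores)  # masks: (1, H, W) boolean array
```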


            
