labelme-ytu

- Name: labelme-ytu
- Version: 5.5.4
- Home page: https://github.com/wkentaro/labelme
- Summary: Image Polygonal Annotation with Python
- Upload time: 2024-09-05 13:56:17
- Author: Kentaro Wada
- License: GPLv3
- Keywords: image annotation, machine learning
- Requirements: No requirements were recorded.
<h1 align="center">
  <img src="https://github.com/wkentaro/labelme/blob/main/labelme/icons/icon.png?raw=true"><br/>labelme-ytu
</h1>

<h4 align="center">
  Image Polygonal Annotation with Python
</h4>

<div align="center">
  <a href="https://pypi.python.org/pypi/labelme"><img src="https://img.shields.io/pypi/v/labelme.svg"></a>
  <a href="https://pypi.org/project/labelme"><img src="https://img.shields.io/pypi/pyversions/labelme.svg"></a>
  <a href="https://github.com/wkentaro/labelme/actions"><img src="https://github.com/wkentaro/labelme/workflows/ci/badge.svg?branch=main&event=push"></a>
</div>

<div align="center">
  <a href="https://github.com/wkentaro/labelme/blob/main/#starter-guide"><b>Starter Guide</b></a>
  | <a href="https://github.com/wkentaro/labelme/blob/main/#installation?raw=true"><b>Installation</b></a>
  | <a href="https://github.com/wkentaro/labelme/blob/main/#usage"><b>Usage</b></a>
  | <a href="https://github.com/wkentaro/labelme/blob/main/#examples"><b>Examples</b></a>
  | <a href="https://x.com/labelmeai"><b>X/Twitter</b></a>
  <!-- | <a href="https://github.com/wkentaro/labelme/discussions"><b>Community</b></a> -->
  <!-- | <a href="https://www.youtube.com/playlist?list=PLI6LvFw0iflh3o33YYnVIfOpaO0hc5Dzw"><b>Youtube FAQ</b></a> -->
</div>

<br/>

<div align="center">
  <img src="https://github.com/wkentaro/labelme/blob/main/examples/instance_segmentation/.readme/annotation.jpg?raw=true" width="70%">
</div>

## Description

Labelme is a graphical image annotation tool inspired by <http://labelme.csail.mit.edu>.  
It is written in Python and uses Qt for its graphical interface.

<img src="https://github.com/wkentaro/labelme/blob/main/examples/instance_segmentation/data_dataset_voc/JPEGImages/2011_000006.jpg?raw=true" width="19%" /> <img src="examples/instance_segmentation/data_dataset_voc/SegmentationClass/2011_000006.png" width="19%" /> <img src="examples/instance_segmentation/data_dataset_voc/SegmentationClassVisualization/2011_000006.jpg" width="19%" /> <img src="examples/instance_segmentation/data_dataset_voc/SegmentationObject/2011_000006.png" width="19%" /> <img src="examples/instance_segmentation/data_dataset_voc/SegmentationObjectVisualization/2011_000006.jpg" width="19%" />  
<i>VOC dataset example of instance segmentation.</i>

<img src="https://github.com/wkentaro/labelme/blob/main/examples/semantic_segmentation/.readme/annotation.jpg?raw=true" width="30%" /> <img src="examples/bbox_detection/.readme/annotation.jpg" width="30%" /> <img src="examples/classification/.readme/annotation_cat.jpg" width="35%" />  
<i>Other examples (semantic segmentation, bbox detection, and classification).</i>

<img src="https://user-images.githubusercontent.com/4310419/47907116-85667800-de82-11e8-83d0-b9f4eb33268f.gif" width="30%" /> <img src="https://user-images.githubusercontent.com/4310419/47922172-57972880-deae-11e8-84f8-e4324a7c856a.gif" width="30%" /> <img src="https://user-images.githubusercontent.com/14256482/46932075-92145f00-d080-11e8-8d09-2162070ae57c.png" width="32%" />  
<i>Various primitives (polygon, rectangle, circle, line, and point).</i>


## Features

- [x] Image annotation for polygon, rectangle, circle, line and point. ([tutorial](https://github.com/wkentaro/labelme/blob/main/examples/tutorial))
- [x] Image flag annotation for classification and cleaning. ([#166](https://github.com/wkentaro/labelme/pull/166))
- [x] Video annotation. ([video annotation](https://github.com/wkentaro/labelme/blob/main/examples/video_annotation?raw=true))
- [x] GUI customization (predefined labels / flags, auto-saving, label validation, etc). ([#144](https://github.com/wkentaro/labelme/pull/144))
- [x] Exporting VOC-format dataset for semantic/instance segmentation. ([semantic segmentation](https://github.com/wkentaro/labelme/blob/main/examples/semantic_segmentation?raw=true), [instance segmentation](https://github.com/wkentaro/labelme/blob/main/examples/instance_segmentation?raw=true))
- [x] Exporting COCO-format dataset for instance segmentation. ([instance segmentation](https://github.com/wkentaro/labelme/blob/main/examples/instance_segmentation?raw=true))


## Starter Guide

If you're new to Labelme, you can get started with [Labelme Starter Guide](https://labelme.gumroad.com/l/starter-guide) (FREE), which contains:

- **Installation guides** for all platforms: Windows, macOS, and Linux 💻
- **Step-by-step tutorials**: first annotation to editing, exporting, and integrating with other programs 📕
- **A compilation of valuable resources** for further exploration 🔗.


## Installation

There are several options:

- Platform-agnostic installation: [Anaconda](https://github.com/wkentaro/labelme/blob/main/#anaconda)
- Platform-specific installation: [Ubuntu](https://github.com/wkentaro/labelme/blob/main/#ubuntu), [macOS](https://github.com/wkentaro/labelme/blob/main/#macos), [Windows](https://github.com/wkentaro/labelme/blob/main/#windows)
- Pre-built binaries from [the release section](https://github.com/wkentaro/labelme/releases)

### Anaconda

You need to install [Anaconda](https://www.continuum.io/downloads), then run the following:

```bash
# python3
conda create --name=labelme python=3
source activate labelme
# conda install -c conda-forge pyside2
# conda install pyqt
# pip install pyqt5  # pyqt5 can be installed via pip on python3
pip install labelme
# or you can install everything by conda command
# conda install labelme -c conda-forge
```

### Ubuntu

```bash
sudo apt-get install labelme

# or
sudo pip3 install labelme

# or install standalone executable from:
# https://github.com/wkentaro/labelme/releases
```

### macOS

```bash
brew install pyqt  # maybe pyqt5
pip install labelme

# or
brew install wkentaro/labelme/labelme  # command line interface
# brew install --cask wkentaro/labelme/labelme  # app

# or install standalone executable/app from:
# https://github.com/wkentaro/labelme/releases
```

### Windows

Install [Anaconda](https://www.continuum.io/downloads), then in an Anaconda Prompt run:

```bash
conda create --name=labelme python=3
conda activate labelme
pip install labelme

# or install standalone executable/app from:
# https://github.com/wkentaro/labelme/releases
```


## Usage

Run `labelme --help` for details.  
The annotations are saved as a [JSON](http://www.json.org/) file.

```bash
labelme  # just open gui

# tutorial (single image example)
cd examples/tutorial
labelme apc2016_obj3.jpg  # specify image file
labelme apc2016_obj3.jpg -O apc2016_obj3.json  # close the window after saving
labelme apc2016_obj3.jpg --nodata  # do not embed image data; store only the relative image path in the JSON file
labelme apc2016_obj3.jpg \
  --labels highland_6539_self_stick_notes,mead_index_cards,kong_air_dog_squeakair_tennis_ball  # specify label list

# semantic segmentation example
cd examples/semantic_segmentation
labelme data_annotated/  # Open directory to annotate all images in it
labelme data_annotated/ --labels labels.txt  # specify label list with a file
```
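
Once an image is annotated, the resulting JSON is plain and easy to post-process. Below is a minimal sketch of reading one file; the key names (`shapes`, `label`, `points`, `shape_type`, `imagePath`, ...) reflect the usual labelme JSON layout, but verify them against a file produced by your labelme version.

```python
import json

# Read one annotation produced by labelme (file name taken from the tutorial above).
with open("apc2016_obj3.json") as f:
    data = json.load(f)

print(data["imagePath"], data["imageWidth"], data["imageHeight"])
for shape in data["shapes"]:
    # Each shape carries a label, a shape_type (polygon, rectangle, ...) and its points.
    print(shape["label"], shape["shape_type"], len(shape["points"]))
```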

### Command Line Arguments
- `--output` specifies where annotations are written. If the path ends with `.json`, a single annotation is written to that file, and only one image can be annotated. Otherwise the path is treated as a directory, and each annotation is stored there under a name derived from the image it was annotated on (see the sketch after this list).
- The first time you run labelme, it creates a config file at `~/.labelmerc`. Edits to this file take effect the next time you launch labelme. To use a config file from another location, pass it with the `--config` flag.
- Without the `--nosortlabels` flag, labels are listed in alphabetical order; when the flag is given, labels are displayed in the order they are provided.
- Flags are assigned to an entire image. [Example](https://github.com/wkentaro/labelme/blob/main/examples/classification?raw=true)
- Labels are assigned to a single polygon. [Example](https://github.com/wkentaro/labelme/blob/main/examples/bbox_detection?raw=true)
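
As a rough illustration of the `--output` rule above (not labelme's actual implementation; `resolve_output_path` is a hypothetical helper):

```python
import os.path as osp


def resolve_output_path(output: str, image_path: str) -> str:
    """Mimic the documented --output behaviour: single file vs. directory."""
    if output.endswith(".json"):
        return output  # single-image mode: write exactly this file
    # Directory mode: one JSON per image, named after the image file.
    stem = osp.splitext(osp.basename(image_path))[0]
    return osp.join(output, stem + ".json")


# resolve_output_path("out", "data/apc2016_obj3.jpg") -> "out/apc2016_obj3.json"
```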

### FAQ

- **How to convert JSON file to numpy array?** See [examples/tutorial](https://github.com/wkentaro/labelme/blob/main/examples/tutorial#convert-to-dataset); a rough standalone sketch also follows this list.
- **How to load label PNG file?** See [examples/tutorial](https://github.com/wkentaro/labelme/blob/main/examples/tutorial#how-to-load-label-png-file).
- **How to get annotations for semantic segmentation?** See [examples/semantic_segmentation](https://github.com/wkentaro/labelme/blob/main/examples/semantic_segmentation?raw=true).
- **How to get annotations for instance segmentation?** See [examples/instance_segmentation](https://github.com/wkentaro/labelme/blob/main/examples/instance_segmentation?raw=true).
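
For the first FAQ item, the linked tutorial is the reference; what follows is only a rough, self-contained sketch (not labelme's own converter) of rasterizing polygon shapes from an annotation file into a numpy label mask. Background = 0 and class ids assigned in order of appearance are assumptions of this sketch.

```python
import json

import numpy as np
from PIL import Image, ImageDraw


def shapes_to_mask(json_path: str) -> np.ndarray:
    """Rasterize labelme-style polygon shapes into an integer label mask."""
    with open(json_path) as f:
        data = json.load(f)

    mask = Image.new("L", (data["imageWidth"], data["imageHeight"]), 0)
    draw = ImageDraw.Draw(mask)

    label_to_value = {}  # e.g. {"cat": 1, "dog": 2}, ids in order of appearance
    for shape in data["shapes"]:
        if shape.get("shape_type", "polygon") != "polygon":
            continue  # this sketch only handles polygons
        value = label_to_value.setdefault(shape["label"], len(label_to_value) + 1)
        points = [tuple(pt) for pt in shape["points"]]
        draw.polygon(points, outline=value, fill=value)

    return np.asarray(mask)


# Example: np.unique(shapes_to_mask("apc2016_obj3.json"))
```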


## Examples

* [Image Classification](https://github.com/wkentaro/labelme/blob/main/examples/classification?raw=true)
* [Bounding Box Detection](https://github.com/wkentaro/labelme/blob/main/examples/bbox_detection?raw=true)
* [Semantic Segmentation](https://github.com/wkentaro/labelme/blob/main/examples/semantic_segmentation?raw=true)
* [Instance Segmentation](https://github.com/wkentaro/labelme/blob/main/examples/instance_segmentation?raw=true)
* [Video Annotation](https://github.com/wkentaro/labelme/blob/main/examples/video_annotation?raw=true)

## How to develop

```bash
git clone https://github.com/wkentaro/labelme.git
cd labelme

# Install anaconda3 and labelme
curl -L https://github.com/wkentaro/dotfiles/raw/main/local/bin/install_anaconda3.sh | bash -s .
source .anaconda3/bin/activate
pip install -e .
```


### How to build standalone executable

The following shows how to build the standalone executable on macOS, Linux, and Windows.

```bash
# Setup conda
conda create --name labelme python=3.9
conda activate labelme

# Build the standalone executable
pip install .
pip install 'matplotlib<3.3'
pip install pyinstaller
pyinstaller labelme.spec
dist/labelme --version
```


### How to contribute

Make sure the tests below pass in your environment.  
See `.github/workflows/ci.yml` for more details.

```bash
pip install -r requirements-dev.txt

ruff format --check  # `ruff format` to auto-fix
ruff check  # `ruff check --fix` to auto-fix
MPLBACKEND='agg' pytest -vsx tests/
```


## Acknowledgement

This repo is a fork of [mpitid/pylabelme](https://github.com/mpitid/pylabelme).

            
