labelme

Name: labelme
Version: 5.4.1
Home page: https://github.com/wkentaro/labelme
Summary: Image Polygonal Annotation with Python
Author: Kentaro Wada
License: GPLv3
Keywords: image annotation, machine learning
Upload time: 2024-01-06 14:10:02

<h1 align="center">
  <img src="https://github.com/wkentaro/labelme/blob/main/labelme/icons/icon.png?raw=true"><br/>labelme
</h1>

<h4 align="center">
  Image Polygonal Annotation with Python
</h4>

<div align="center">
  <a href="https://pypi.python.org/pypi/labelme"><img src="https://img.shields.io/pypi/v/labelme.svg"></a>
  <a href="https://pypi.org/project/labelme"><img src="https://img.shields.io/pypi/pyversions/labelme.svg"></a>
  <a href="https://github.com/wkentaro/labelme/actions"><img src="https://github.com/wkentaro/labelme/workflows/ci/badge.svg?branch=main&event=push"></a>
</div>

<div align="center">
  <a href="https://github.com/wkentaro/labelme/blob/main/#starter-bundle"><b>Starter Bundle</b></a>
  | <a href="https://github.com/wkentaro/labelme/blob/main/#installation?raw=true"><b>Installation</b></a>
  | <a href="https://github.com/wkentaro/labelme/blob/main/#usage"><b>Usage</b></a>
  | <a href="https://github.com/wkentaro/labelme/blob/main/#examples"><b>Examples</b></a>
  | <a href="https://x.com/labelmeai"><b>X/Twitter</b></a>
  <!-- | <a href="https://github.com/wkentaro/labelme/discussions"><b>Community</b></a> -->
  <!-- | <a href="https://www.youtube.com/playlist?list=PLI6LvFw0iflh3o33YYnVIfOpaO0hc5Dzw"><b>Youtube FAQ</b></a> -->
</div>

<br/>

<div align="center">
  <img src="https://github.com/wkentaro/labelme/blob/main/examples/instance_segmentation/.readme/annotation.jpg?raw=true" width="70%">
</div>

## Description

Labelme is a graphical image annotation tool inspired by <http://labelme.csail.mit.edu>.  
It is written in Python and uses Qt for its graphical interface.

<img src="https://github.com/wkentaro/labelme/blob/main/examples/instance_segmentation/data_dataset_voc/JPEGImages/2011_000006.jpg?raw=true" width="19%" /> <img src="examples/instance_segmentation/data_dataset_voc/SegmentationClass/2011_000006.png" width="19%" /> <img src="examples/instance_segmentation/data_dataset_voc/SegmentationClassVisualization/2011_000006.jpg" width="19%" /> <img src="examples/instance_segmentation/data_dataset_voc/SegmentationObject/2011_000006.png" width="19%" /> <img src="examples/instance_segmentation/data_dataset_voc/SegmentationObjectVisualization/2011_000006.jpg" width="19%" />  
<i>VOC dataset example of instance segmentation.</i>

<img src="https://github.com/wkentaro/labelme/blob/main/examples/semantic_segmentation/.readme/annotation.jpg?raw=true" width="30%" /> <img src="examples/bbox_detection/.readme/annotation.jpg" width="30%" /> <img src="examples/classification/.readme/annotation_cat.jpg" width="35%" />  
<i>Other examples (semantic segmentation, bbox detection, and classification).</i>

<img src="https://user-images.githubusercontent.com/4310419/47907116-85667800-de82-11e8-83d0-b9f4eb33268f.gif" width="30%" /> <img src="https://user-images.githubusercontent.com/4310419/47922172-57972880-deae-11e8-84f8-e4324a7c856a.gif" width="30%" /> <img src="https://user-images.githubusercontent.com/14256482/46932075-92145f00-d080-11e8-8d09-2162070ae57c.png" width="32%" />  
<i>Various primitives (polygon, rectangle, circle, line, and point).</i>


## Features

- [x] Image annotation for polygon, rectangle, circle, line and point. ([tutorial](https://github.com/wkentaro/labelme/blob/main/examples/tutorial))
- [x] Image flag annotation for classification and cleaning. ([#166](https://github.com/wkentaro/labelme/pull/166))
- [x] Video annotation. ([video annotation](https://github.com/wkentaro/labelme/blob/main/examples/video_annotation?raw=true))
- [x] GUI customization (predefined labels / flags, auto-saving, label validation, etc). ([#144](https://github.com/wkentaro/labelme/pull/144))
- [x] Exporting VOC-format dataset for semantic/instance segmentation. ([semantic segmentation](https://github.com/wkentaro/labelme/blob/main/examples/semantic_segmentation?raw=true), [instance segmentation](https://github.com/wkentaro/labelme/blob/main/examples/instance_segmentation?raw=true))
- [x] Exporting COCO-format dataset for instance segmentation. ([instance segmentation](https://github.com/wkentaro/labelme/blob/main/examples/instance_segmentation?raw=true))


## Starter Bundle

If you're new to Labelme, you can get started with [Labelme Starter Bundle](https://labelme.gumroad.com/l/starter-bundle) (FREE), which contains:

- **Installation guides** for all platforms: Windows, macOS, and Linux 💻
- **Step-by-step tutorials**: from your first annotation to editing, exporting, and integrating with other programs 📕
- **A compilation of valuable resources** for further exploration 🔗


## Installation

There are several options:

- Platform-agnostic installation: [Anaconda](https://github.com/wkentaro/labelme/blob/main/#anaconda)
- Platform-specific installation: [Ubuntu](https://github.com/wkentaro/labelme/blob/main/#ubuntu), [macOS](https://github.com/wkentaro/labelme/blob/main/#macos), [Windows](https://github.com/wkentaro/labelme/blob/main/#windows)
- Pre-built binaries from [the release section](https://github.com/wkentaro/labelme/releases)

### Anaconda

You need to install [Anaconda](https://www.continuum.io/downloads) first, then run the commands below:

```bash
# python3
conda create --name=labelme python=3
source activate labelme
# conda install -c conda-forge pyside2
# conda install pyqt
# pip install pyqt5  # pyqt5 can be installed via pip on python3
pip install labelme
# or you can install everything with the conda command below
# conda install labelme -c conda-forge
```

### Ubuntu

```bash
sudo apt-get install labelme

# or
sudo pip3 install labelme

# or install standalone executable from:
# https://github.com/wkentaro/labelme/releases
```

### macOS

```bash
brew install pyqt  # maybe pyqt5
pip install labelme

# or
brew install wkentaro/labelme/labelme  # command line interface
# brew install --cask wkentaro/labelme/labelme  # app

# or install standalone executable/app from:
# https://github.com/wkentaro/labelme/releases
```

### Windows

Install [Anaconda](https://www.continuum.io/downloads), then in an Anaconda Prompt run:

```bash
conda create --name=labelme python=3
conda activate labelme
pip install labelme

# or install standalone executable/app from:
# https://github.com/wkentaro/labelme/releases
```


## Usage

Run `labelme --help` for details.  
The annotations are saved as a [JSON](http://www.json.org/) file (a small sketch of reading one follows the examples below).

```bash
labelme  # just open gui

# tutorial (single image example)
cd examples/tutorial
labelme apc2016_obj3.jpg  # specify image file
labelme apc2016_obj3.jpg -O apc2016_obj3.json  # close the window after saving
labelme apc2016_obj3.jpg --nodata  # store only the relative image path, not the image data, in the JSON file
labelme apc2016_obj3.jpg \
  --labels highland_6539_self_stick_notes,mead_index_cards,kong_air_dog_squeakair_tennis_ball  # specify label list

# semantic segmentation example
cd examples/semantic_segmentation
labelme data_annotated/  # Open directory to annotate all images in it
labelme data_annotated/ --labels labels.txt  # specify label list with a file
```
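
Each saved annotation is a plain JSON file, so it can be inspected directly. Below is a minimal sketch of reading one in Python; the field names (`imagePath`, `shapes`, `label`, `points`, `shape_type`) follow labelme's usual output, but verify them against a file produced by your own version.

```python
# Minimal sketch: inspect an annotation file written by labelme.
# Field names follow labelme's usual JSON output; check them against a
# file produced by your own labelme version.
import json

with open("apc2016_obj3.json") as f:
    ann = json.load(f)

print(ann["imagePath"], ann["imageHeight"], ann["imageWidth"])
for shape in ann["shapes"]:
    # Each shape carries a label, a shape_type (polygon, rectangle, circle,
    # line, point, ...) and a list of [x, y] points in image coordinates.
    print(shape["label"], shape["shape_type"], len(shape["points"]))
```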

### Command Line Arguments
- `--output` specifies where annotations are written. If the path ends with `.json`, a single annotation is written to that file, so only one image can be annotated; otherwise the path is treated as a directory, and each annotation is stored there under a name matching its image.
- The first time you run labelme, it creates a config file at `~/.labelmerc`. You can edit this file, and the changes are applied the next time you launch labelme. If you prefer a config file at another location, specify it with the `--config` flag. (A small config-editing sketch follows this list.)
- By default the program lists labels in alphabetical order; when run with the `--nosortlabels` flag, it displays labels in the order they are provided.
- Flags are assigned to an entire image. [Example](https://github.com/wkentaro/labelme/blob/main/examples/classification?raw=true)
- Labels are assigned to a single polygon. [Example](https://github.com/wkentaro/labelme/blob/main/examples/bbox_detection?raw=true)
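
As a rough illustration of editing that config programmatically, here is a sketch that assumes `~/.labelmerc` already exists and is YAML; the keys shown (`auto_save`, `sort_labels`, `labels`) are common ones, but check labelme's bundled default config for the authoritative list.

```python
# Sketch: tweak the labelme user config. Assumes ~/.labelmerc already
# exists and is YAML; the keys below are illustrative -- see labelme's
# bundled default config for the full list.
import os

import yaml  # pip install pyyaml

config_path = os.path.expanduser("~/.labelmerc")
with open(config_path) as f:
    config = yaml.safe_load(f) or {}

config["auto_save"] = True         # write the JSON file automatically
config["sort_labels"] = False      # same effect as --nosortlabels
config["labels"] = ["cat", "dog"]  # predefine the labels shown in the GUI

with open(config_path, "w") as f:
    yaml.safe_dump(config, f, default_flow_style=False)
```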

### FAQ

- **How to convert JSON file to numpy array?** See [examples/tutorial](https://github.com/wkentaro/labelme/blob/main/examples/tutorial#convert-to-dataset), or the rough sketch after this list.
- **How to load label PNG file?** See [examples/tutorial](https://github.com/wkentaro/labelme/blob/main/examples/tutorial#how-to-load-label-png-file).
- **How to get annotations for semantic segmentation?** See [examples/semantic_segmentation](https://github.com/wkentaro/labelme/blob/main/examples/semantic_segmentation?raw=true).
- **How to get annotations for instance segmentation?** See [examples/instance_segmentation](https://github.com/wkentaro/labelme/blob/main/examples/instance_segmentation?raw=true).
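
If you just need a quick label mask from polygon annotations without the tooling linked above, the following is a rough sketch using Pillow and NumPy. It handles polygon shapes only and assigns integer ids in the order labels are encountered; the converter described in the tutorial remains the reference approach.

```python
# Rough sketch: rasterize polygon shapes from a labelme JSON file into a
# numpy label mask. Polygons only; the converter in examples/tutorial is
# the reference approach.
import json

import numpy as np
from PIL import Image, ImageDraw

with open("apc2016_obj3.json") as f:
    ann = json.load(f)

# 32-bit integer image, initialized to 0 (background).
mask = Image.new("I", (ann["imageWidth"], ann["imageHeight"]), 0)
draw = ImageDraw.Draw(mask)

label_to_id = {}
for shape in ann["shapes"]:
    if shape["shape_type"] != "polygon":
        continue
    label_id = label_to_id.setdefault(shape["label"], len(label_to_id) + 1)
    draw.polygon([tuple(pt) for pt in shape["points"]], fill=label_id)

label_mask = np.asarray(mask, dtype=np.int32)
print(label_mask.shape, np.unique(label_mask), label_to_id)
```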


## Examples

* [Image Classification](https://github.com/wkentaro/labelme/blob/main/examples/classification?raw=true)
* [Bounding Box Detection](https://github.com/wkentaro/labelme/blob/main/examples/bbox_detection?raw=true)
* [Semantic Segmentation](https://github.com/wkentaro/labelme/blob/main/examples/semantic_segmentation?raw=true)
* [Instance Segmentation](https://github.com/wkentaro/labelme/blob/main/examples/instance_segmentation?raw=true)
* [Video Annotation](https://github.com/wkentaro/labelme/blob/main/examples/video_annotation?raw=true)

## How to develop

```bash
git clone https://github.com/wkentaro/labelme.git
cd labelme

# Install anaconda3 and labelme
curl -L https://github.com/wkentaro/dotfiles/raw/main/local/bin/install_anaconda3.sh | bash -s .
source .anaconda3/bin/activate
pip install -e .
```


### How to build standalone executable

The following shows how to build the standalone executable on macOS, Linux, and Windows.

```bash
# Setup conda
conda create --name labelme python=3.9
conda activate labelme

# Build the standalone executable
pip install .
pip install 'matplotlib<3.3'
pip install pyinstaller
pyinstaller labelme.spec
dist/labelme --version
```


### How to contribute

Make sure the tests below pass in your environment.  
See `.github/workflows/ci.yml` for more details.

```bash
pip install -r requirements-dev.txt

ruff format --check  # `ruff format` to auto-fix
ruff check  # `ruff check --fix` to auto-fix
MPLBACKEND='agg' pytest -vsx tests/
```


## Acknowledgement

This repo is a fork of [mpitid/pylabelme](https://github.com/mpitid/pylabelme).

            
