motrack

Name: motrack
Version: 0.4.1
Home page: https://github.com/Robotmurlock/Motrack
Summary: Tracking-by-detection (MOT) package
Upload time: 2024-02-26 17:56:22
Author: Momir Adzemovic
Requires Python: >=3.8, <4
Keywords: tracking-by-detection, multi-object-tracking
Requirements: hydra-core, matplotlib, numpy, omegaconf, opencv_python, pandas, PyYAML, scipy, tqdm, ultralytics, torch, motrack-motion
# Motrack: Multi-Object Tracking Library

## Introduction

Motrack is a versatile multi-object tracking library built around
the tracking-by-detection paradigm.
It supports a range of tracker algorithms and object detectors,
making it suitable for applications in various domains.

## Usage

Pseudocode for tracker utilization:

```python
from motrack.object_detection import YOLOv8Inference
from motrack.tracker import ByteTracker, TrackletState

tracker = ByteTracker()  # Default parameters
tracklets = []
yolo = YOLOv8Inference(...)

video_frames = read_video(video_path)  # Placeholder: yields video frames (see the sketch below)

for i, image in enumerate(video_frames):
  detections = yolo.predict_bboxes(image)  # Detect objects in the current frame
  tracklets = tracker.track(tracklets, detections, i)  # Update tracklets with the new detections
  active_tracklets = [t for t in tracklets if t.state == TrackletState.ACTIVE]

  foo_bar(active_tracklets)  # Placeholder for downstream processing of active tracklets
```
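
The `read_video` and `foo_bar` calls above are placeholders. A minimal sketch of a frame reader, assuming OpenCV (`cv2`, already a package dependency) is available, could look like this:

```python
from typing import Iterator

import cv2
import numpy as np


def read_video(video_path: str) -> Iterator[np.ndarray]:
    """Yield frames from a video file one by one (illustrative placeholder implementation)."""
    cap = cv2.VideoCapture(video_path)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:  # End of stream or read error
                break
            yield frame
    finally:
        cap.release()
```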

The library is flexible enough to work with any custom object detector, as sketched below.
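
For illustration only, a custom detector can be wrapped behind the same `predict_bboxes`-style interface used above. The sketch below uses torchvision's Faster R-CNN as a stand-in detector; the `FasterRCNNInference` class and its output format are assumptions, and converting raw boxes into motrack's `PredBBox` objects is omitted because that conversion depends on the library's bounding-box API.

```python
import numpy as np
import torch
import torchvision


class FasterRCNNInference:
    """Hypothetical adapter exposing a YOLOv8Inference-like predict_bboxes interface."""

    def __init__(self, score_threshold: float = 0.5):
        # Requires torchvision >= 0.13 for the `weights` keyword
        self._model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
        self._model.eval()
        self._score_threshold = score_threshold

    @torch.no_grad()
    def predict_bboxes(self, image: np.ndarray) -> np.ndarray:
        # Convert an HxWxC uint8 image into a normalized CxHxW float tensor
        tensor = torch.from_numpy(image).permute(2, 0, 1).float() / 255.0
        output = self._model([tensor])[0]
        keep = output["scores"] >= self._score_threshold
        boxes = output["boxes"][keep].cpu().numpy()    # (N, 4) xyxy boxes
        scores = output["scores"][keep].cpu().numpy()  # (N,) confidences
        # In the real pipeline these would be converted to motrack PredBBox objects
        return np.concatenate([boxes, scores[:, None]], axis=1)
```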

Implementation of custom tracker:

```python
from typing import List

from motrack.library.cv.bbox import PredBBox
from motrack.tracker import Tracker, Tracklet


class MyTracker(Tracker):
  def track(
    self,
    tracklets: List[Tracklet],
    detections: List[PredBBox],
    frame_index: int
  ) -> List[Tracklet]:
    ...  # Tracker logic: associate detections with tracklets and update their states

    return tracklets
```

Similarly, custom object detection inference, filter, association method
or dataset can also be implemented and seamlessly combined
with other components.
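
For instance, an association step in isolation can be sketched with IoU matching and the Hungarian algorithm. The snippet below is self-contained and makes no assumptions about motrack's actual association interfaces; boxes are plain `(N, 4)` xyxy NumPy arrays.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment


def iou_matrix(tracks: np.ndarray, dets: np.ndarray) -> np.ndarray:
    """Pairwise IoU between (T, 4) track boxes and (D, 4) detection boxes in xyxy format."""
    x1 = np.maximum(tracks[:, None, 0], dets[None, :, 0])
    y1 = np.maximum(tracks[:, None, 1], dets[None, :, 1])
    x2 = np.minimum(tracks[:, None, 2], dets[None, :, 2])
    y2 = np.minimum(tracks[:, None, 3], dets[None, :, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area_t = (tracks[:, 2] - tracks[:, 0]) * (tracks[:, 3] - tracks[:, 1])
    area_d = (dets[:, 2] - dets[:, 0]) * (dets[:, 3] - dets[:, 1])
    return inter / (area_t[:, None] + area_d[None, :] - inter + 1e-9)


def associate(tracks: np.ndarray, dets: np.ndarray, iou_threshold: float = 0.3):
    """Return (track_idx, det_idx) matches plus unmatched track and detection indices."""
    if len(tracks) == 0 or len(dets) == 0:
        return [], list(range(len(tracks))), list(range(len(dets)))
    cost = 1.0 - iou_matrix(tracks, dets)       # Hungarian solver minimizes cost
    rows, cols = linear_sum_assignment(cost)
    matches = [(r, c) for r, c in zip(rows, cols) if 1.0 - cost[r, c] >= iou_threshold]
    matched_t = {r for r, _ in matches}
    matched_d = {c for _, c in matches}
    unmatched_t = [i for i in range(len(tracks)) if i not in matched_t]
    unmatched_d = [j for j in range(len(dets)) if j not in matched_d]
    return matches, unmatched_t, unmatched_d
```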

## Features

### Supported tracker algorithms

| Method Name | Description                                            |
|-------------|--------------------------------------------------------|
| SORT        | [arXiv: Simple Online and Realtime Tracking](https://arxiv.org/pdf/1602.00763.pdf) |
| DeepSORT    | [arXiv: Simple Online and Realtime Tracking with a Deep Association Metric](https://arxiv.org/pdf/1703.07402.pdf) |
| MoveSORT    | SORT with an improved association method               |
| ByteTrack   | [arXiv: ByteTrack: Multi-Object Tracking by Associating Every Detection Box](https://arxiv.org/abs/2110.06864) |
| BoT-SORT    | [arXiv: BoT-SORT: Robust Associations Multi-Pedestrian Tracking](https://arxiv.org/abs/2206.14651) |
| SparseTrack | [arXiv: SparseTrack: Multi-Object Tracking by Performing Scene Decomposition based on Pseudo-Depth](https://arxiv.org/abs/2306.05238) |

Evaluation of these methods on different datasets can be found in [evaluation.md](https://github.com/Robotmurlock/Motrack/blob/main/docs/evaluation.md).

### Supported object detection algorithms

| Method Name | Description                                                              |
|-------------|--------------------------------------------------------------------------|
| YOLOX       | [arXiv: YOLOX: Exceeding YOLO Series in 2021](https://arxiv.org/abs/2107.08430) |
| YOLOv8      | [GitHub: Ultralytics YOLOv8](https://github.com/ultralytics/ultralytics) |

Use `motrack/create_yolo_format.py` to create a YOLOv8 training dataset and `motrack/create_coco_format.py`
to create a YOLOX training dataset.

### FastReID integration

Any [FastReID](https://github.com/JDAI-CV/fast-reid) model can be used for appearance matching.
The model has to be exported to ONNX; please check the [deploy documentation](https://github.com/JDAI-CV/fast-reid/tree/master/tools/deploy) for more info.
Use `scrips/create_fastreid_patches.py` to create a FastReID patch dataset for training an appearance model.
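
As a rough sketch of how an exported ReID model can be run with `onnxruntime`: the wrapper below assumes the ONNX model takes a batch of person crops and returns one embedding per crop. The class name, input size, and preprocessing are illustrative assumptions; actual values depend on the exported model.

```python
from typing import List

import cv2
import numpy as np
import onnxruntime as ort


class ReIDEmbedder:
    """Illustrative wrapper around an exported FastReID ONNX model."""

    def __init__(self, onnx_path: str, input_size=(128, 256)):
        self._session = ort.InferenceSession(onnx_path, providers=["CPUExecutionProvider"])
        self._input_name = self._session.get_inputs()[0].name
        self._input_size = input_size  # (width, height) the exported model expects

    def embed(self, crops: List[np.ndarray]) -> np.ndarray:
        """Compute L2-normalized appearance embeddings for a list of HxWx3 uint8 person crops."""
        batch = []
        for crop in crops:
            resized = cv2.resize(crop, self._input_size)
            batch.append(resized.astype(np.float32).transpose(2, 0, 1))  # HWC -> CHW
        outputs = self._session.run(None, {self._input_name: np.stack(batch)})
        embeddings = outputs[0]
        norms = np.linalg.norm(embeddings, axis=1, keepdims=True) + 1e-12
        return embeddings / norms  # Rows can be compared with cosine similarity
```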

### Supported datasets

Currently supported datasets are: MOT17, MOT20, DanceTrack and SportsMOT.

Any custom dataset can be added by extending the base dataset.

### Tools

List of script tools:

  - Inference: Perform tracker inference whose output can be directly evaluated with the TrackEval framework.
  - Postprocess: Perform offline postprocessing (linear interpolation, etc.) for more accurate tracklets (see the sketch after this list).
  - Visualize: Visualize tracker inference.
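
The linear interpolation mentioned above can be illustrated with a small, self-contained sketch that fills missing frames of a single track between observed boxes; it is independent of motrack's actual postprocessing implementation.

```python
from typing import List, Tuple

import numpy as np


def interpolate_track(frames: List[int], boxes: np.ndarray) -> Tuple[List[int], np.ndarray]:
    """Linearly interpolate missing boxes of a single track between observed frames.

    frames: sorted frame indices where the track was observed.
    boxes:  (N, 4) array of xyxy boxes corresponding to those frames.
    """
    dense_frames = list(range(frames[0], frames[-1] + 1))
    dense_boxes = np.empty((len(dense_frames), 4), dtype=np.float64)
    for k in range(4):  # Interpolate each box coordinate independently
        dense_boxes[:, k] = np.interp(dense_frames, frames, boxes[:, k])
    return dense_frames, dense_boxes
```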

### Evaluation

Evaluation of different supported methods can be found [here](https://github.com/Robotmurlock/Motrack/blob/main/docs/evaluation.md).

## Installation

Run the following command to install the package within your virtual environment or Docker container.

```bash
pip install motrack
```

The package page is available on [PyPI](https://pypi.org/project/motrack/).

### Extensions

To use `YOLOv8` for inference, install the `ultralytics` library:

```bash
pip install ultralytics
```

or install the `yolov8` extra:

```bash
pip install "motrack[yolov8]"
```

For `FastReID` inference, please install `onnxruntime` for CPU:

```bash
pip install onnxruntime
```

or GPU:

```bash
pip install onnxruntime-gpu
```

To use `motrack-motion` filters, install the `motion` extra:

```bash
pip install "motrack[motion]"
```

## Changelog

The package changelog can be found [here](https://github.com/Robotmurlock/Motrack/blob/main/docs/changelog.md).

            
