motrack

Name: motrack
Version: 0.5.0
Home page: https://github.com/Robotmurlock/Motrack
Summary: Tracking-by-detection (MOT) package
Author: Momir Adzemovic
Requires Python: <4,>=3.8
Keywords: tracking-by-detection, multi-object-tracking
Upload time: 2025-08-16 17:38:28
Requirements: hydra-core, matplotlib, numpy, omegaconf, opencv_python, pandas, PyYAML, scipy, tqdm, ultralytics, torch, motrack-motion

# Motrack: Multi-Object Tracking Library

## Introduction

Motrack is a versatile multi-object tracking library designed around
the tracking-by-detection paradigm.
It supports a range of tracker algorithms and object detectors,
making it suitable for applications in various domains.

## Usage

Pseudocode for tracker utilization:

```python
from motrack.object_detection import YOLOv8Inference
from motrack.tracker import ByteTracker, TrackletState

tracker = ByteTracker()  # Default parameters
tracklets = []
yolo = YOLOv8Inference(...)  # Configure the detector as needed

video_frames = read_video(video_path)  # Placeholder: any iterable of frames works

for i, image in enumerate(video_frames):
  detections = yolo.predict_bboxes(image)
  tracklets = tracker.track(tracklets, detections, i)
  active_tracklets = [t for t in tracklets if t.state == TrackletState.ACTIVE]

  foo_bar(active_tracklets)  # Placeholder: downstream processing of active tracklets
```
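
As a minimal sketch (not part of the library), the `read_video` placeholder above could be implemented with OpenCV, which is already a dependency of the package:

```python
from typing import Iterator

import cv2
import numpy as np


def read_video(video_path: str) -> Iterator[np.ndarray]:
  """Lazily yield frames from a video file using OpenCV."""
  cap = cv2.VideoCapture(video_path)
  try:
    while True:
      ok, frame = cap.read()
      if not ok:
        break
      yield frame  # BGR image of shape (H, W, 3)
  finally:
    cap.release()
```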

This library offers the flexibility to use any custom object detector.

Implementation of a custom tracker:

```python
from typing import List

from motrack.library.cv.bbox import PredBBox
from motrack.tracker import Tracker, Tracklet


class MyTracker(Tracker):
  def track(
    self,
    tracklets: List[Tracklet],
    detections: List[PredBBox],
    frame_index: int
  ) -> List[Tracklet]:
    ...  # Tracker logic goes here

    return tracklets
```

Similarly, custom object detection inference, filters, association methods,
or datasets can also be implemented and seamlessly combined
with the other components.
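
For instance, a custom detector only needs to mirror the `YOLOv8Inference` interface used in the usage example, i.e. expose a `predict_bboxes(image)` method that returns `PredBBox` detections. The sketch below is purely illustrative: the wrapper class and the `_to_pred_bbox` helper are hypothetical, and the exact `PredBBox` construction depends on the library's API:

```python
from typing import Any, List

import numpy as np

from motrack.library.cv.bbox import PredBBox


class MyDetectorInference:
  """Hypothetical detector wrapper exposing a `predict_bboxes` method."""

  def __init__(self, model: Any) -> None:
    self._model = model  # Any per-frame detection model

  def predict_bboxes(self, image: np.ndarray) -> List[PredBBox]:
    raw_detections = self._model(image)
    # Convert raw model outputs into PredBBox objects; the exact PredBBox
    # constructor is library-specific and not shown in this README.
    return [self._to_pred_bbox(d) for d in raw_detections]

  def _to_pred_bbox(self, detection: Any) -> PredBBox:
    raise NotImplementedError('Map a raw detection to a PredBBox here.')
```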

## Features

### Supported tracker algorithms

| Method Name | Description                                            |
|-------------|--------------------------------------------------------|
| SORT        | [arxiv: Simple Online and Realtime Tracking](https://arxiv.org/pdf/1602.00763.pdf)    | 
| DeepSORT    | [arxiv: Simple Online and Realtime Tracking with a Deep Association Metric](https://arxiv.org/pdf/1703.07402.pdf) |
| MoveSORT    | SORT with improved association method                  |
| ByteTrack   | [arxiv: ByteTrack: Multi-Object Tracking by Associating Every Detection Box](https://arxiv.org/abs/2110.06864)   |
| BoT-SORT    | [arxiv: BoT-SORT: Robust Associations Multi-Pedestrian Tracking](https://arxiv.org/abs/2206.14651)    |
| SparseTrack | [arxiv: SparseTrack: Multi-Object Tracking by Performing Scene Decomposition based on Pseudo-Depth](https://arxiv.org/abs/2306.05238) |

Evaluation of these methods on different datasets can be found in [evaluation.md](https://github.com/Robotmurlock/Motrack/blob/main/docs/evaluation.md).

### Supported object detection algorithms

| Method Name | Description                                                                        |
|-------------|------------------------------------------------------------------------------------|
| YOLOX       | [arxiv: YOLOX: Exceeding YOLO Series in 2021](https://arxiv.org/abs/2107.08430) | 
| YOLOv8      | [github: Ultralytics YOLOv8](https://github.com/ultralytics/ultralytics)                             |

Use `motrack/create_yolo_format.py` to create a YOLOv8 training dataset and `motrack/create_coco_format.py` 
to create a YOLOX training dataset.

### FastReID integration

Any [FastReID](https://github.com/JDAI-CV/fast-reid) model can be used for appearance matching.
The model has to be exported to ONNX format. Please check the [deploy documentation](https://github.com/JDAI-CV/fast-reid/tree/master/tools/deploy) for more info.
Use `scripts/create_fastreid_patches.py` to create a FastReID dataset in order to train an appearance model.
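
For reference, a rough sketch of ONNX-based appearance inference with `onnxruntime` is shown below. It is independent of Motrack's internal wiring, assumes a single-output model, and the crop size and preprocessing depend on how the FastReID model was exported:

```python
import numpy as np
import onnxruntime as ort


def extract_embeddings(onnx_path: str, crops: np.ndarray) -> np.ndarray:
  """Compute appearance embeddings for a batch of person crops.

  Assumes `crops` is float32 with shape (N, 3, H, W), already resized and
  normalized as expected by the exported model (e.g. 256x128 for FastReID).
  """
  session = ort.InferenceSession(onnx_path, providers=['CPUExecutionProvider'])
  input_name = session.get_inputs()[0].name
  embeddings, = session.run(None, {input_name: crops.astype(np.float32)})
  # L2-normalize so cosine similarity can be used for appearance matching
  return embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
```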

### Supported datasets

Currently supported datasets are: MOT17, MOT20, DanceTrack and SportsMOT.

Any custom dataset can be added by extending the base dataset.

### Tools

List of script tools:

  - Inference: Perform any tracker inference that can be directly evaluated with the TrackEval framework.
  - Postprocess: Perform offline postprocessing (linear interpolation, etc.) for more accurate tracklets.
  - Visualize: Visualize tracker inference.

### Evaluation

Evaluation of different supported methods can be found [here](https://github.com/Robotmurlock/Motrack/blob/main/docs/evaluation.md).

## Installation

Run the following command to install the package within your virtual environment or Docker container.

```bash
pip install motrack
```

Package page can be found on [PyPI](https://pypi.org/project/motrack/).

### Extensions

In order to use `YOLOv8` for inference, please install the `ultralytics` library:

```bash
pip install ultralytics
```

or install the `yolov8` extra:

```bash
pip install "motrack[yolov8]"
```

For `FastReID` inference, please install `onnxruntime` for CPU:

```bash
pip install onnxruntime
```

or GPU:

```bash
pip install onnxruntime-gpu
```

In order to use `motrack-motion` filters, install the `motion` extra:

```bash
pip install "motrack[motion]"
```

## Changelog

The package changelog can be found [here](https://github.com/Robotmurlock/Motrack/blob/main/docs/changelog.md).

            
