<div align="center">
<img src="artwork/trackastra_logo.png" alt="Optimus Prime" style="width:25%;"/>
</div>
# *Trackastra* - Tracking by Association with Transformers
*Trackastra* is a cell tracking approach that links already segmented cells in a microscopy timelapse by predicting associations with a transformer model that was trained on a diverse set of microscopy videos.

If you are using this code in your research, please cite our [paper](https://www.ecva.net/papers/eccv_2024/papers_ECCV/papers/09819.pdf):
> Benjamin Gallusser and Martin Weigert<br>*Trackastra - Transformer-based cell tracking for live-cell microscopy*<br> European Conference on Computer Vision, 2024
## Examples
Nuclei tracking | Bacteria tracking
:-: | :-:
<video src='https://github.com/weigertlab/trackastra/assets/8866751/807a8545-2f65-4697-a175-89b90dfdc435' width=180></video>| <video src='https://github.com/weigertlab/trackastra/assets/8866751/e7426d34-4407-4acb-ad79-fae3bc7ee6f9' width=180></video>
## Installation
This repository contains the Python implementation of Trackastra.
Please first set up a Python environment (with Python version 3.10 or higher), preferably via [conda](https://conda.io/projects/conda/en/latest/user-guide/install/index.html) or [mamba](https://mamba.readthedocs.io/en/latest/installation/mamba-installation.html#mamba-install).
Trackastra can then be installed from PyPI using `pip`:
```bash
pip install trackastra
```
For tracking with an integer linear program (ILP, which is optional):
```bash
conda create --name trackastra python=3.10 --no-default-packages
conda activate trackastra
conda install -c conda-forge -c gurobi -c funkelab ilpy
pip install "trackastra[ilp]"
```
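
As a quick, optional sanity check (not part of the official instructions), you can try importing the solver bindings from the freshly created environment; this assumes the commands above completed without errors:
```bash
python -c "import ilpy, motile; print('ILP dependencies available')"
```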
Notes:
- For the optional ILP linking, this will install [`motile`](https://funkelab.github.io/motile/index.html) and binaries for two discrete optimizers:
1. The [Gurobi Optimizer](https://www.gurobi.com/). This is a commercial solver, which requires a valid license. Academic licenses are provided for free, see [here](https://www.gurobi.com/academia/academic-program-and-licenses/) for how to obtain one.
2. The [SCIP Optimizer](https://www.scipopt.org/), a free and open source solver. If `motile` does not find a valid Gurobi license, it will fall back to using SCIP.
- On macOS, installing packages into the conda environment before installing `ilpy` can cause problems.
- 2024-06-07: On Apple M3 chips, you might have to use the nightly build of `torch` and `torchvision`, or worst case build them yourself.
## Usage
The input to Trackastra is a sequence of images and their corresponding cell (instance) segmentations.
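
For illustration, here is a minimal sketch of what such inputs look like as numpy arrays; the shapes and dummy data are purely hypothetical, with integer-labeled masks (0 = background) as the usual convention for instance segmentations:
```python
import numpy as np

# hypothetical 2D timelapse: 10 frames of 256 x 256 pixels
imgs = np.random.rand(10, 256, 256).astype(np.float32)

# matching instance segmentation: one integer label per cell, 0 = background
masks = np.zeros((10, 256, 256), dtype=np.uint16)
masks[:, 100:120, 100:120] = 1  # a toy "cell" present in every frame
```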
### Napari plugin
For a quick try of Trackastra on your data, please use our [napari plugin](https://github.com/weigertlab/napari-trackastra/), which already comes with pretrained models included.

### Tracking with a pretrained model
> The available pretrained models are described in detail [here](trackastra/model/pretrained.json).
Consider the following Python example script for tracking already segmented cells. All you need are the following two numpy arrays:
- `imgs`: a microscopy time lapse of shape `time,(z),y,x`.
- `masks`: corresponding instance segmentation of shape `time,(z),y,x`.
The predicted associations can then be used for linking with several modes:
- `greedy_nodiv` (greedy linking with no division) - fast, no additional dependencies
- `greedy` (greedy linking with division) - fast, no additional dependencies
- `ilp` (ILP based linking) - slower but more accurate, needs [`motile`](https://github.com/funkelab/motile)
Apart from that, no hyperparameters to choose :)
```python
import torch
from trackastra.model import Trackastra
from trackastra.tracking import graph_to_ctc, graph_to_napari_tracks
from trackastra.data import example_data_bacteria
device = "cuda" if torch.cuda.is_available() else "cpu"
# load some test data images and masks
imgs, masks = example_data_bacteria()
# Load a pretrained model
model = Trackastra.from_pretrained("general_2d", device=device)
# or from a local folder
# model = Trackastra.from_folder('path/my_model_folder/', device=device)
# Track the cells
track_graph = model.track(imgs, masks, mode="greedy") # or mode="ilp", or "greedy_nodiv"
# Write to cell tracking challenge format
ctc_tracks, masks_tracked = graph_to_ctc(
    track_graph,
    masks,
    outdir="tracked",
)
```
You then can visualize the tracks with [napari](https://github.com/napari/napari):
```python
# Visualise in napari
napari_tracks, napari_tracks_graph, _ = graph_to_napari_tracks(track_graph)
import napari
v = napari.Viewer()
v.add_image(imgs)
v.add_labels(masks_tracked)
v.add_tracks(data=napari_tracks, graph=napari_tracks_graph)
```
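
Note that when running this as a standalone script (rather than in an interactive session such as IPython), you may additionally need to start the napari event loop:
```python
napari.run()
```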
### Training a model on your own data
To run an example:
- clone this repository and go into the scripts directory with `cd trackastra/scripts`.
- download the [Fluo-N2DL-HeLa](http://data.celltrackingchallenge.net/training-datasets/Fluo-N2DL-HeLa.zip) dataset from the Cell Tracking Challenge into `data/ctc` (one possible way is shown in the snippet below).
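
One possible way to download and unpack the dataset, assuming `wget` and `unzip` are available and the commands are run from `trackastra/scripts`:
```bash
mkdir -p data/ctc
wget -P data/ctc http://data.celltrackingchallenge.net/training-datasets/Fluo-N2DL-HeLa.zip
unzip data/ctc/Fluo-N2DL-HeLa.zip -d data/ctc
```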
Now, run
```bash
python train.py --config example_config.yaml
```
Generally, training data needs to be provided in the [Cell Tracking Challenge (CTC) format](http://public.celltrackingchallenge.net/documents/Naming%20and%20file%20content%20conventions.pdf), i.e. annotations are located in a folder containing one or several subfolders named `TRA`, with masks and tracklet information.
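
For orientation, here is a sketch of what such a layout might look like for the Fluo-N2DL-HeLa example, following the usual CTC naming conventions (the exact folder names are illustrative):
```
data/ctc/Fluo-N2DL-HeLa/
├── 01/            # raw image frames (t000.tif, t001.tif, ...)
├── 01_GT/
│   └── TRA/       # man_track.txt and man_track000.tif, ... (tracking annotations)
├── 02/
└── 02_GT/
    └── TRA/
```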