# RABiTPy - Rapid Artificially Intelligent Bacterial Tracker
RABiTPy is a comprehensive package designed to track and analyze the movement of micro-organisms in video files or image sequences. This package provides tools for loading, identifying, tracking, and analyzing the movement of various organisms.
## Features
- **Capture**: Load video files or image sequences and convert them into frames.
- **Identify**: Detect and identify microorganisms using thresholding or advanced Omnipose masking.
- **Tracker**: Track identified organisms across frames, applying filters to retain meaningful tracks.
- **Stats**: Analyze tracked organisms to calculate speed, movement patterns, and other metrics.
## Installation
***NOTE: RABiTPy requires Python >=3.10 and <3.11; Python 3.10.11 is recommended.***
1. First, install the package:
```sh
pip install RABiTPy
```
2. Then install Omnipose from this specific commit:
```sh
pip install git+https://github.com/kevinjohncutler/omnipose.git@63045b1af0d52174dee7ff18e94c7cfd84ddd2ff
```
## Usage
RABiTPy consists of four main classes, typically used in sequence (see the sketch after this list):
1. **Capture**: This class is responsible for loading video files or image sequences and converting them into frames that can be processed in subsequent steps.
2. **Identify**: This class is used to identify different nodes or organisms in each frame. It provides two methods for identification:
- **Thresholding**: A simple technique that uses pixel intensity thresholds to segment organisms.
- **AI-based Masking**: A more advanced method leveraging the AI-based Omnipose algorithm for accurate segmentation of organisms.
3. **Tracker**: This class tracks each identified node across frames and filters them based on criteria such as the minimum number of frames they appear in or minimal displacement across frames. This step ensures that only meaningful tracks are retained for analysis.
4. **Stats**: This class computes various statistics about the tracked organisms, such as their speed, correlation of movements, and other relevant metrics. It records these statistics for further analysis.
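The classes are designed to be chained, with each consuming the previous class's output. The sketch below shows the intended flow; the class names come from this README, but the import path, constructor arguments, and method names are placeholders rather than the documented API. See the class documentation linked below for the actual calls.

```python
# Illustrative workflow only: the class names come from this README, but the
# import path, constructors, and method names are placeholders, not RABiTPy's
# documented API. Refer to the class documentation for the real calls.
from RABiTPy import Capture, Identify, Tracker, Stats  # import path assumed

capture = Capture()
frames = capture.load_video("bacteria.avi")        # hypothetical: video -> frames

identify = Identify(capture)
nodes = identify.apply_threshold()                 # hypothetical: frames -> detected organisms
# nodes = identify.apply_omnipose_mask()           # hypothetical: AI-based masking instead

tracker = Tracker(identify)
tracks = tracker.link_and_filter(min_frames=10)    # hypothetical: keep tracks spanning >= 10 frames

stats = Stats(tracker)
speeds = stats.compute_speed()                     # hypothetical: per-track speed and related metrics
```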
## Documentation
For detailed usage and examples, refer to the documentation for each class:
- [Capture Class Documentation](documentation/capture.md)
- [Identify Class Documentation](documentation/identify.md)
- [Tracker Class Documentation](documentation/track.md)
- [Stats Class Documentation](documentation/stats.md)
### Example Workflow
A Jupyter notebook demonstrating the basic workflow can be found [here](walkthrough.ipynb).
## Notes
1. **Capture**: The `Capture` class loads video files or images and converts them into a sequence of frames. These frames are then used as input for the next class in the workflow.
2. **Identify**: The `Identify` class processes each frame to detect and identify different nodes or organisms. This can be done using simple thresholding techniques or more advanced masking techniques with Omnipose. The identified nodes are passed on to the next class.
3. **Tracker**: The `Tracker` class takes the identified nodes from the `Identify` class and tracks their movement across frames. It applies filters to ensure that only nodes meeting certain criteria (e.g., minimum appearance in frames, minimal displacement) are kept. The tracking information is then passed to the `Stats` class.
4. **Stats**: The `Stats` class analyzes the tracked nodes to compute various statistics, such as speed and correlation of movement. These statistics are crucial for understanding the behavior and movement patterns of the organisms being studied.
Each class in the workflow passes its output to the next class, ensuring a seamless transition from loading and identifying organisms to tracking their movement and finally analyzing their behavior. The conceptual sketch below shows what a comparable pipeline looks like when built directly on the underlying libraries.
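For intuition about what these steps involve, here is a self-contained conceptual sketch built directly on the underlying libraries (scikit-image for thresholding, Trackpy for linking and filtering, pandas/NumPy for statistics) using toy data. It is not RABiTPy's implementation or API; it only illustrates the kind of operations the classes wrap.

```python
# Conceptual sketch of a threshold -> link -> filter -> speed pipeline, built
# directly on scikit-image, trackpy, pandas and numpy. This is NOT RABiTPy's
# implementation or API; it only illustrates the kind of steps the classes wrap.
import numpy as np
import pandas as pd
import trackpy as tp
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops


def toy_frames(n_frames=8, size=64):
    """Toy data: two bright spots drifting over a dark background."""
    yy, xx = np.ogrid[:size, :size]
    frames = []
    for t in range(n_frames):
        img = np.zeros((size, size))
        for x0, y0, dx, dy in [(10, 10, 3, 2), (50, 40, -2, 1)]:
            img += np.exp(-((xx - (x0 + dx * t)) ** 2 + (yy - (y0 + dy * t)) ** 2) / 8.0)
        frames.append(img)
    return frames


def detect_centroids(frames):
    """'Identify'-like step: threshold each frame and collect blob centroids."""
    rows = []
    for t, frame in enumerate(frames):
        mask = frame > threshold_otsu(frame)            # simple intensity threshold
        for region in regionprops(label(mask)):
            y, x = region.centroid
            rows.append({"frame": t, "x": x, "y": y})
    return pd.DataFrame(rows)


detections = detect_centroids(toy_frames())

# 'Tracker'-like step: join detections into trajectories, then drop short tracks.
linked = tp.link(detections, search_range=10, memory=2)
long_tracks = tp.filter_stubs(linked, threshold=5).reset_index(drop=True)

# 'Stats'-like step: mean speed per track (pixels/frame) from successive displacements.
def mean_speed(track):
    track = track.sort_values("frame")
    return np.hypot(track["x"].diff(), track["y"].diff()).mean()

speeds = long_tracks.groupby("particle").apply(mean_speed)
print(speeds)
```

On the toy data, the two recovered tracks come out at roughly 3.6 and 2.2 pixels per frame, matching the programmed drift of the two spots.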
## Authors
- Indraneel Vairagare (indraneel207@gmail.com)
- Samyabrata Sen (ssen31@asu.edu)
- Abhishek Shrivastava (ashrivastava@asu.edu)
## License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## Acknowledgements
We extend our thanks to the developers of the Omnipose and Trackpy libraries, which are essential to the functionality of this package.