LabGym

Name: LabGym
Version: 1.4
Home page: http://github.com/umyelab/LabGym
Summary: Quantification of user-defined animal behaviors
Upload time: 2022-06-23 20:21:02
Docs URL: None
Author: Yujia Hu
Requires Python: >=3.9
License: GNU General Public License v3 (GPLv3)
Keywords: quantification of user-defined behaviors
Requirements: No requirements were recorded.
Travis-CI: No Travis.
Coveralls test coverage: No coveralls.
# What is LabGym?

LabGym is a multi-animal-tracking and deep-learning based package for end-to-end classification and quantification of user-defined animal behaviors without restrictions on animal species or behavior types. It also provides users a way to generate visualizable datasets for the user-defined behaviors.

Please cite:
https://www.biorxiv.org/content/10.1101/2022.02.17.480911v3






The graphical user interface (GUI) of LabGym has 4 functional units: 'Generate Datasets', 'Train Networks', 'Test Networks', and 'Analyze Behaviors':

![alt text](https://github.com/yujiahu415/LabGym/blob/3cac15a69c386673853d91a93f73818f35726e71/Examples/Graphical_user_interface.png)






First, use the 'Generate Datasets' functional unit to generate visualizable behavior data pairs (each data pair comprises an animation and a pattern image), like:

![alt text](https://github.com/yujiahu415/LabGym/blob/a9c77cd1f25ca1edc97aadb2257dd8fc0552483d/Examples/Larvae.gif)
![alt text](https://github.com/yujiahu415/LabGym/blob/4484050e52480cdc0e0611eaff3545dfedf03908/Examples/Flies.gif)
![alt text](https://github.com/yujiahu415/LabGym/blob/6ea290e8b86b30ae882631a8301ef6c80545f802/Examples/Mice.gif)
![alt text](https://github.com/yujiahu415/LabGym/blob/6ea290e8b86b30ae882631a8301ef6c80545f802/Examples/Rats.gif)






The duration of the animation is user-definable.

Next, manually sort these data pairs into folders named after the behaviors you define. Then input all the folders into LabGym so it can generate a labeled training dataset for training a 'Categorizer' in the 'Train Networks' functional unit. Categorizers of various complexity levels are available to suit different behavior datasets. This is the end-to-end process by which you 'teach' LabGym to recognize the behaviors you define.
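As a rough illustration, the sorted examples might be organized like the layout below before being input into LabGym. The folder names, file names, and file extensions here are placeholders chosen for this sketch, not requirements imposed by LabGym; each behavior folder simply holds the data pairs you judged to show that behavior.

    sorted_examples/
    ├── walking/
    │   ├── example_001.avi      # animation of a data pair
    │   ├── example_001.jpg      # matching pattern image
    │   └── ...
    ├── grooming/
    │   └── ...
    └── resting/
        └── ...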

After the Categorizer is trained, you can use the 'Test Networks' functional unit to test it in an unbiased manner, and the trained Categorizer will appear in the 'Analyze Behaviors' functional unit. You can select it to analyze behavior videos and output annotated videos showing the behavior name (and % confidence) in each frame, like:

![alt text](https://github.com/yujiahu415/LabGym/blob/6ea290e8b86b30ae882631a8301ef6c80545f802/Examples/Categorizer_larvae.gif)
![alt text](https://github.com/yujiahu415/LabGym/blob/6ea290e8b86b30ae882631a8301ef6c80545f802/Examples/Categorizer_mice_1.gif)
![alt text](https://github.com/yujiahu415/LabGym/blob/6ea290e8b86b30ae882631a8301ef6c80545f802/Examples/Categorizer_mice_2.gif)
![alt text](https://github.com/yujiahu415/LabGym/blob/6ea290e8b86b30ae882631a8301ef6c80545f802/Examples/Categorizer_rats_1.gif)
![alt text](https://github.com/yujiahu415/LabGym/blob/6ea290e8b86b30ae882631a8301ef6c80545f802/Examples/Categorizer_rats_2.gif)
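To make concrete what per-frame annotation amounts to, a behavior label and confidence could be drawn on a frame with standard OpenCV calls. This is only a minimal sketch, not LabGym's actual rendering code; the file names and label text are hypothetical.

    import cv2

    # Hypothetical example: overlay a behavior name and confidence on a frame
    frame = cv2.imread('frame_0001.png')
    label = 'rearing 92%'   # behavior name plus confidence for this frame
    cv2.putText(frame, label, (10, 30), cv2.FONT_HERSHEY_SIMPLEX,
                1.0, (0, 255, 0), 2)
    cv2.imwrite('frame_0001_annotated.png', frame)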






Notably, LabGym calculates diverse behavioral parameters to provide quantitative measurements of the intensity and dynamics of each user-defined behavior, as well as the animal's movement kinetics during a behavior, like:

![alt text](https://github.com/yujiahu415/LabGym/blob/6ea290e8b86b30ae882631a8301ef6c80545f802/Examples/Quantify%20behavior_1.jpg)
![alt text](https://github.com/yujiahu415/LabGym/blob/6ea290e8b86b30ae882631a8301ef6c80545f802/Examples/Quantify%20behavior_2.jpg)
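To illustrate the idea of such parameters, a simple kinetic measure like mean speed can be computed from the per-frame centroid positions of a tracked animal. The sketch below is only a conceptual example under that assumption, not the calculation LabGym performs internally:

    import numpy as np

    def mean_speed(centroids, fps):
        # centroids: per-frame (x, y) positions of one tracked animal, in pixels
        centroids = np.asarray(centroids, dtype=float)
        # distance travelled between consecutive frames
        step_lengths = np.linalg.norm(np.diff(centroids, axis=0), axis=1)
        return step_lengths.mean() * fps   # pixels per second

    print(mean_speed([(10, 10), (12, 11), (15, 13)], fps=30))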






The outputs of the analysis are:

![alt text](https://github.com/yujiahu415/LabGym/blob/6ea290e8b86b30ae882631a8301ef6c80545f802/Examples/Analysis_output.jpg)










# How to use LabGym:

LabGym does not require labeling or training to track the animals. However, it does have a preferred video recording setting: LabGym works best for videos with a stable background and stable illumination (the illumination can have dark-to-bright or bright-to-dark transitions, but it needs to be stable before and after the transitions). Animals are expected to show some positional change rather than being completely immobile throughout a recording. Users need to specify a time window during which the animals are moving, which is used for background extraction (the shorter the time window, the less processing time it takes).
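To illustrate what such a time window can be used for, one common way to extract a static background is to take a per-pixel median over frames sampled from a window in which the animals are moving, so the animals average away. The sketch below shows this idea with standard OpenCV and NumPy calls; it is not LabGym's internal implementation, and the function and parameter names are made up for this example.

    import cv2
    import numpy as np

    def estimate_background(video_path, start_second, end_second, step=5):
        # Sample every `step`-th frame within the time window and take the
        # per-pixel median, so moving animals are averaged out and only the
        # static background remains.
        cap = cv2.VideoCapture(video_path)
        fps = cap.get(cv2.CAP_PROP_FPS)
        frames = []
        for i in range(int(start_second * fps), int(end_second * fps), step):
            cap.set(cv2.CAP_PROP_POS_FRAMES, i)
            ok, frame = cap.read()
            if ok:
                frames.append(frame)
        cap.release()
        return np.median(np.stack(frames), axis=0).astype(np.uint8)

A longer window means more frames to sample, which is consistent with the note above that a shorter time window takes less processing time.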






To use LabGym:

First, install Python 3 (version >= 3.9).

Then, in your terminal or command prompt, type one of the following:

    pip install LabGym

or

    pip3 install LabGym

or

    python3 -m pip install LabGym

or

    py -m pip install LabGym
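Optionally, you can install LabGym into a virtual environment to keep its dependencies separate from other Python packages. This uses only standard Python tooling; the environment name below is arbitrary:

    python3 -m venv labgym-env
    source labgym-env/bin/activate   # on Windows: labgym-env\Scripts\activate
    python3 -m pip install LabGym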

After LabGym is installed, start a Python interactive shell by typing 'python3' or 'py' in the terminal or command prompt.

Then type:

    from LabGym import gui

Then type:

    gui.gui()

Now the graphical user interface is launched and ready to use.
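If you prefer, the import and launch can also be done in one step from the terminal, using Python's standard -c flag with the same gui.gui() entry point shown above:

    python3 -c "from LabGym import gui; gui.gui()"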



The video tutorials are in the /Tutorials/ folder (https://github.com/umyelab/LabGym/tree/master/Tutorials).

A manual explaining all the buttons in the GUI, along with tips for use, is coming soon.










# If you encounter any issues when using LabGym:

Please first check the issues page (https://github.com/umyelab/LabGym/issues?q=) to see whether your issue has already been addressed. If not, please contact the author: Yujia Hu (henryhu@umich.edu).










# Change logs:



v1.4:

1. Made the time points in the output time-series sheets more precise.
2. Fixed an error when using the 'load background image' option.



v1.3:

1. Improved background subtraction; tracking is more accurate.
2. LabGym now works not only for videos with dark-to-bright illumination transitions but also for those with bright-to-dark transitions.



v1.2:

1. LabGym can now also categorize binary behaviors (yes-or-no behaviors, or behaviors with only two categories).
2. Fixed a bug that caused a path error when users did not select any behavior parameters for quantification.
3. Users now have an option to choose whether to relink newly detected animals to deregistered IDs.



v1.1:

Fixed a typo in setup.



v1.0:

Initial release.










            
