# ndx-pose Extension for NWB
[![PyPI version](https://badge.fury.io/py/ndx-pose.svg)](https://badge.fury.io/py/ndx-pose)
ndx-pose is a standardized format for storing pose estimation data in NWB, such as from
[DeepLabCut](http://www.mackenziemathislab.org/deeplabcut) and [SLEAP](https://sleap.ai/).
Please post an issue or PR to suggest or add support for another pose estimation tool.
This extension consists of several new neurodata types:
- `Skeleton` which stores the relationship between the body parts (nodes and edges).
- `Skeletons` which is a container that stores multiple `Skeleton` objects.
- `PoseEstimationSeries` which stores the estimated positions (x, y) or (x, y, z) of a body part over time as well as
the confidence/likelihood of the estimated positions.
- `PoseEstimation` which stores the estimated position data (`PoseEstimationSeries`) for multiple body parts,
computed from the same video(s) with the same tool/algorithm.
- `SkeletonInstance` which stores the estimated positions and visibility of the body parts for a single frame.
- `TrainingFrame` which stores the ground truth data for a single frame. It contains `SkeletonInstance` objects and
references a frame of a source video (`ImageSeries`). The source videos can be stored internally as data arrays or
externally as files referenced by relative file path.
- `TrainingFrames` which is a container that stores multiple `TrainingFrame` objects.
- `SourceVideos` which is a container that stores multiple `ImageSeries` objects representing source videos used in training.
- `PoseTraining` which is a container that stores the ground truth data (`TrainingFrames`) and source videos (`SourceVideos`)
used to train the pose estimation model.
It is recommended to place the `Skeletons`, `PoseEstimation`, and `PoseTraining` objects in an NWB processing module
named "behavior", as shown below.
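The `Skeleton` nodes/edges convention described above (edges as pairs of indices into the node list) can be sketched without any NWB dependencies. The body part names and connections below are purely illustrative:

```python
# Illustrative sketch (not the ndx-pose API): building the nodes and edges
# arrays that a `Skeleton` stores. Each edge is a pair of indices into the
# node list, matching `edges : array[uint; dims [edge, [node, node]]]`.
nodes = ["nose", "left_ear", "right_ear", "tail_base"]

# Connections named by body part, for readability.
named_edges = [("nose", "left_ear"), ("nose", "right_ear"), ("nose", "tail_base")]

# Convert name pairs to index pairs.
index = {name: i for i, name in enumerate(nodes)}
edges = [[index[a], index[b]] for a, b in named_edges]

print(edges)  # [[0, 1], [0, 2], [0, 3]]
```

The resulting `nodes` and `edges` arrays are what a `Skeleton` object would hold; see the linked usage examples for the actual constructor calls.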
## Installation
`pip install ndx-pose`
## Usage examples
1. [Example writing pose estimates (keypoints) to an NWB file](examples/write_pose_estimates_only.py).
2. [Example writing training data to an NWB file](examples/write_pose_training.py).
## Handling pose estimates for multiple subjects
NWB files are designed to store data from a single subject and have only one root-level `Subject` object.
As a result, ndx-pose was designed to store pose estimates from a single subject.
Pose estimation data from different subjects should be stored in separate NWB files.
Training images, however, can involve multiple skeletons, and the same images may be shared across subjects and
therefore across NWB files. Until multi-subject support is added to NWB and ndx-pose, such training images should
be duplicated between files. See https://github.com/rly/ndx-pose/pull/3
## Resources
Utilities to convert DLC output to/from NWB: https://github.com/DeepLabCut/DLC2NWB
- For multi-animal projects, one NWB file is created per animal. The NWB file contains only a `PoseEstimation` object
under `/processing/behavior`. That `PoseEstimation` object contains `PoseEstimationSeries` objects, one for each
body part, and general metadata about the pose estimation process, skeleton, and videos. The
`PoseEstimationSeries` objects contain the estimated positions for that body part for a particular animal.
Utilities to convert SLEAP pose tracking data to/from NWB: https://github.com/talmolab/sleap-io
- Used by SLEAP (`sleap.io.dataset.Labels.export_nwb`)
- See also https://github.com/talmolab/sleap/blob/develop/sleap/io/format/ndx_pose.py
Keypoint MoSeq: https://github.com/dattalab/keypoint-moseq
- Supports reading `PoseEstimation` objects from NWB files.
NeuroConv: https://neuroconv.readthedocs.io/en/main/conversion_examples_gallery/conversion_example_gallery.html#behavior
- NeuroConv supports converting data from DeepLabCut (using `dlc2nwb` described above),
SLEAP (using `sleap_io` described above), FicTrac, and LightningPose to NWB. It supports appending pose estimation data to an existing NWB file.
Ethome: Tools for machine learning of animal behavior: https://github.com/benlansdell/ethome
- Supports reading `PoseEstimation` objects from NWB files.
Related work:
- https://github.com/ndx-complex-behavior
- https://github.com/datajoint/element-deeplabcut
Several NWB datasets use ndx-pose 0.1.1:
- [A detailed behavioral, videographic, and neural dataset on object recognition in mice](https://dandiarchive.org/dandiset/000231)
- [IBL Brain Wide Map](https://dandiarchive.org/dandiset/000409)
Several [open-source conversion scripts on GitHub](https://github.com/search?q=ndx-pose&type=code&p=1)
also use ndx-pose.
## Diagram of non-training-related types
```mermaid
%%{init: {'theme': 'base', 'themeVariables': {'primaryColor': '#ffffff', 'primaryBorderColor': '#144E73', 'lineColor': '#D96F32'}}}%%
classDiagram
direction LR
namespace ndx-pose {
class PoseEstimationSeries{
<<SpatialSeries>>
name : str
description : str
timestamps : array[float; dims [frame]]
data : array[float; dims [frame, [x, y]] or [frame, [x, y, z]]]
confidence : array[float; dims [frame]]
reference_frame: str
}
class PoseEstimation {
<<NWBDataInterface>>
name : str
description : str, optional
original_videos : array[str; dims [file]], optional
labeled_videos : array[str; dims [file]], optional
dimensions : array[uint, dims [file, [width, height]]], optional
scorer : str, optional
scorer_software : str, optional
scorer_software__version : str, optional
PoseEstimationSeries
Skeleton, link
Device, link
}
class Skeleton {
<<NWBDataInterface>>
name : str
nodes : array[str; dims [body part]]
edges : array[uint; dims [edge, [node, node]]]
}
}
class Device
PoseEstimation --o PoseEstimationSeries : contains 0 or more
PoseEstimation --> Skeleton : links to
PoseEstimation --> Device : links to
```
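The shape contract shown in the diagram can be made concrete with a small dependency-free check: for one body part over `n_frames` frames, `data` has one (x, y) or (x, y, z) row per frame, while `timestamps` and `confidence` each have one value per frame. The function and values below are illustrative, not part of the ndx-pose API:

```python
# Illustrative sketch of the PoseEstimationSeries shape contract:
# data is [frame, 2] or [frame, 3]; timestamps and confidence are [frame].
def check_series_shapes(data, timestamps, confidence):
    n_frames = len(timestamps)
    assert len(data) == n_frames, "one position per frame"
    assert all(len(row) in (2, 3) for row in data), "(x, y) or (x, y, z)"
    assert len(confidence) == n_frames, "one confidence value per frame"
    return n_frames

n = check_series_shapes(
    data=[[10.0, 20.0], [11.5, 19.8], [12.1, 19.5]],
    timestamps=[0.0, 0.033, 0.066],
    confidence=[0.98, 0.95, 0.99],
)
print(n)  # 3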
## Diagram of all types
```mermaid
%%{init: {'theme': 'base', 'themeVariables': {'primaryColor': '#ffffff', 'primaryBorderColor': '#144E73', 'lineColor': '#D96F32'}}}%%
classDiagram
direction LR
namespace ndx-pose {
class PoseEstimationSeries{
<<SpatialSeries>>
name : str
description : str
timestamps : array[float; dims [frame]]
data : array[float; dims [frame, [x, y]] or [frame, [x, y, z]]]
confidence : array[float; dims [frame]]
reference_frame: str
}
class PoseEstimation {
<<NWBDataInterface>>
name : str
description : str, optional
original_videos : array[str; dims [file]], optional
labeled_videos : array[str; dims [file]], optional
dimensions : array[uint, dims [file, [width, height]]], optional
scorer : str, optional
scorer_software : str, optional
scorer_software__version : str, optional
PoseEstimationSeries
Skeleton, link
Device, link
}
class Skeleton {
<<NWBDataInterface>>
name : str
nodes : array[str; dims [body part]]
edges : array[uint; dims [edge, [node, node]]]
}
class TrainingFrame {
<<NWBDataInterface>>
name : str
annotator : str, optional
source_video_frame_index : uint, optional
skeleton_instances : SkeletonInstances
source_video : ImageSeries, link, optional
source_frame : Image, link, optional
}
class SkeletonInstance {
<<NWBDataInterface>>
id: uint, optional
node_locations : array[float; dims [body part, [x, y]] or [body part, [x, y, z]]]
node_visibility : array[bool; dims [body part]], optional
Skeleton, link
}
class TrainingFrames {
<<NWBDataInterface>>
TrainingFrame
}
class SkeletonInstances {
<<NWBDataInterface>>
SkeletonInstance
}
class SourceVideos {
<<NWBDataInterface>>
ImageSeries
}
class Skeletons {
<<NWBDataInterface>>
Skeleton
}
class PoseTraining {
<<NWBDataInterface>>
training_frames : TrainingFrames, optional
source_videos : SourceVideos, optional
}
}
class Device
class ImageSeries
class Image
PoseEstimation --o PoseEstimationSeries : contains 0 or more
PoseEstimation --> Skeleton : links to
PoseEstimation --> Device : links to
PoseTraining --o TrainingFrames : contains
PoseTraining --o SourceVideos : contains
TrainingFrames --o TrainingFrame : contains 0 or more
TrainingFrame --o SkeletonInstances : contains
TrainingFrame --> ImageSeries : links to
TrainingFrame --> Image : links to
SkeletonInstances --o SkeletonInstance : contains 0 or more
SkeletonInstance --o Skeleton : links to
SourceVideos --o ImageSeries : contains 0 or more
Skeletons --o Skeleton : contains 0 or more
```
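How `node_locations` and `node_visibility` in a `SkeletonInstance` relate can be sketched the same way: both are indexed by body part, and the visibility array masks the locations. Names and values below are illustrative only:

```python
# Illustrative sketch: masking SkeletonInstance node_locations by
# node_visibility. Both arrays are indexed by body part.
nodes = ["nose", "left_ear", "right_ear"]
node_locations = [[10.0, 20.0], [8.0, 22.0], [12.0, 22.0]]
node_visibility = [True, True, False]  # right_ear occluded in this frame

# Keep only the body parts marked visible.
visible = {
    name: loc
    for name, loc, vis in zip(nodes, node_locations, node_visibility)
    if vis
}
print(visible)  # {'nose': [10.0, 20.0], 'left_ear': [8.0, 22.0]}
```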
## Contributors
- @rly
- @bendichter
- @AlexEMG
- @roomrys
- @CBroz1
- @h-mayorquin
- @talmo
- @eberrigan
This extension was created using [ndx-template](https://github.com/nwb-extensions/ndx-template).