Name | upc-pymotion |
Version | 0.2.0 |
Summary | A Python library for working with motion data in NumPy or PyTorch. |
home_page | None |
upload_time | 2025-03-09 19:13:56 |
maintainer | None |
docs_url | None |
author | None |
requires_python | >=3.0 |
license | MIT License
Copyright (c) 2025 Jose Luis Ponton
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE. |
keywords | blender, dual quaternion, forward kinematics, motion, numpy, pytorch, quaternion, rotation matrix, skeleton |
requirements | No requirements were recorded. |
Travis-CI | No Travis. |
coveralls test coverage | No coveralls. |
# PyMotion: A Python Library for Motion Data
PyMotion is a Python library that provides various functions for manipulating and processing motion data in NumPy or PyTorch. It is designed to facilitate the development of neural networks for character animation.
Some features of PyMotion are:
- A comprehensive set of quaternion operations and conversions to other rotation representations, such as rotation matrices, axis-angle, Euler angles, and the 6D representation
- A dual quaternion representation for rigid displacements, which can help neural networks better understand poses, as proposed by [Andreou et al. [2022]](https://doi.org/10.1111/cgf.14632) and later adopted by [Ponton et al. [2023]](https://upc-virvig.github.io/SparsePoser/)
- A continuous 6D rotation representation, as introduced by [Zhou et al. [2019]](https://doi.org/10.1109/CVPR.2019.00589)
- A BVH file reader and preprocessor for loading and transforming motion data
- Skeletal operations such as Forward Kinematics for computing global joint positions from local joint rotations
- A plotly-based visualizer for debugging and visualizing character animation directly in Python
- [**Experimental**] PyMotion to Blender automatic communication for debugging and visualizing character animation
- NumPy and PyTorch implementations and tests for all functions
## Contents
1. [Installation](#installation)
2. [Examples](#examples)
3. [Roadmap](#roadmap)
4. [License](#license)
## Installation
1. **[Optional]** Install PyTorch with pip as instructed on their [website](https://pytorch.org/get-started/locally/).
2. Install PyMotion:
```bash
pip install upc-pymotion
```
3. **[Optional]** Install Plotly and Dash for the visualizer (not needed for Blender visualization):
```bash
pip install "upc-pymotion[viewer]"
```
## Examples
<details>
<summary> Read and save a BVH file </summary>
```python
import numpy as np
from pymotion.io.bvh import BVH
bvh = BVH()
bvh.load("test.bvh")
print(bvh.data["names"])
# Example Output: ['Hips', 'LeftHip', 'LeftKnee', 'LeftAnkle', 'LeftToe', 'RightHip', 'RightKnee', 'RightAnkle', 'RightToe', 'Chest', 'Chest3', 'Chest4', 'Neck', 'Head', 'LeftCollar', 'LeftShoulder', 'LeftElbow', 'LeftWrist', 'RightCollar', 'RightShoulder', 'RightElbow', 'RightWrist']
# Move root joint to (0, 0, 0)
local_rotations, local_positions, parents, offsets, end_sites, end_sites_parents = bvh.get_data()
local_positions[:, 0, :] = np.zeros((local_positions.shape[0], 3))
bvh.set_data(local_rotations, local_positions)
# Scale the skeleton
bvh.set_scale(0.75)
bvh.save("test_out.bvh")
```
</details>
<details>
<summary> Compute world positions and rotations from a BVH file </summary> <br/>
**NumPy**
```python
from pymotion.io.bvh import BVH
from pymotion.ops.forward_kinematics import fk
bvh = BVH()
bvh.load("test.bvh")
local_rotations, local_positions, parents, offsets, end_sites, end_sites_parents = bvh.get_data()
global_positions = local_positions[:, 0, :] # root joint
pos, rotmats = fk(local_rotations, global_positions, offsets, parents)
```
**PyTorch**
```python
from pymotion.io.bvh import BVH
from pymotion.ops.forward_kinematics_torch import fk
import torch
bvh = BVH()
bvh.load("test.bvh")
local_rotations, local_positions, parents, offsets, end_sites, end_sites_parents = bvh.get_data()
global_positions = local_positions[:, 0, :] # root joint
pos, rotmats = fk(
    torch.from_numpy(local_rotations),
    torch.from_numpy(global_positions),
    torch.from_numpy(offsets),
    torch.from_numpy(parents),
)
```
</details>
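As intuition for what `fk` computes, forward kinematics composes each joint's local rotation with its parent's accumulated global transform. Below is a minimal NumPy sketch on a toy three-joint chain using rotation matrices; `fk_toy` is purely illustrative and is not PyMotion's implementation:

```python
import numpy as np

def fk_toy(local_rotmats, root_pos, offsets, parents):
    """Toy forward kinematics: local rotation matrices -> global joint data.

    local_rotmats: (J, 3, 3), offsets: (J, 3), parents: (J,) with parents[0] = -1.
    """
    n = len(parents)
    global_rot = np.empty((n, 3, 3))
    global_pos = np.empty((n, 3))
    global_rot[0] = local_rotmats[0]
    global_pos[0] = root_pos
    for j in range(1, n):
        p = parents[j]
        # child global rotation = parent global rotation composed with child local rotation
        global_rot[j] = global_rot[p] @ local_rotmats[j]
        # child position = parent position + parent rotation applied to the bone offset
        global_pos[j] = global_pos[p] + global_rot[p] @ offsets[j]
    return global_pos, global_rot

# three-joint chain along +Y; the middle joint is bent 90 degrees about Z
rz90 = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
local = np.stack([np.eye(3), rz90, np.eye(3)])
offsets = np.array([[0.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 1.0, 0.0]])
parents = np.array([-1, 0, 1])
pos, rots = fk_toy(local, np.zeros(3), offsets, parents)
print(pos[2])  # end joint at [-1, 1, 0]
```

PyMotion's `fk` follows the same parent-to-child accumulation, but operates on quaternions and batched arrays.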
<details>
<summary> Quaternion conversion to other representations </summary> <br/>
**NumPy**
```python
import pymotion.rotations.quat as quat
import numpy as np
angles = np.array([np.pi / 2, np.pi, np.pi / 4])[..., np.newaxis]
# angles.shape = [3, 1]
axes = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1]])
# axes.shape = [3, 3]
q = quat.from_angle_axis(angles, axes)
rotmats = quat.to_matrix(q)
euler = quat.to_euler(q, np.array([["x", "y", "z"], ["z", "y", "x"], ["y", "z", "x"]]))
euler_degrees = np.degrees(euler)
scaled_axis = quat.to_scaled_angle_axis(q)
```
**PyTorch**
```python
import pymotion.rotations.quat_torch as quat
import numpy as np
import torch
angles = torch.Tensor([torch.pi / 2, torch.pi, torch.pi / 4]).unsqueeze(-1)
# angles.shape = [3, 1]
axes = torch.Tensor([[1, 0, 0], [0, 1, 0], [0, 0, 1]])
# axes.shape = [3, 3]
q = quat.from_angle_axis(angles, axes)
rotmats = quat.to_matrix(q)
euler = quat.to_euler(q, np.array([["x", "y", "z"], ["z", "y", "x"], ["y", "z", "x"]]))
euler_degrees = torch.rad2deg(euler)
scaled_axis = quat.to_scaled_angle_axis(q)
```
</details>
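As a sanity check on the math behind these conversions: PyMotion's examples use the scalar-first quaternion convention (`[1, 0, 0, 0]` is the identity in the Blender example below), and the angle-axis and matrix formulas can be verified with plain NumPy. This standalone sketch is not PyMotion's implementation:

```python
import numpy as np

def quat_from_angle_axis(angle, axis):
    # scalar-first quaternion [w, x, y, z]
    axis = axis / np.linalg.norm(axis)
    half = angle / 2.0
    return np.concatenate([[np.cos(half)], np.sin(half) * axis])

def quat_to_matrix(q):
    # standard unit-quaternion -> rotation-matrix formula
    w, x, y, z = q
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z), 2 * (x * z + w * y)],
        [2 * (x * y + w * z), 1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y), 2 * (y * z + w * x), 1 - 2 * (x * x + y * y)],
    ])

# a 90-degree rotation about Z maps the x-axis to the y-axis
q = quat_from_angle_axis(np.pi / 2, np.array([0.0, 0.0, 1.0]))
v = quat_to_matrix(q) @ np.array([1.0, 0.0, 0.0])
print(np.round(v, 6))  # ≈ [0, 1, 0]
```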
<details>
<summary> Root-centered dual quaternions from a BVH file </summary> <br/>
**NumPy**
```python
from pymotion.io.bvh import BVH
import pymotion.ops.skeleton as sk
import numpy as np
bvh = BVH()
bvh.load("test.bvh")
local_rotations, local_positions, parents, offsets, end_sites, end_sites_parents = bvh.get_data()
root_dual_quats = sk.to_root_dual_quat(
    local_rotations, local_positions[:, 0, :], parents, offsets
)
local_translations, local_rotations = sk.from_root_dual_quat(root_dual_quats, parents)
global_positions = local_translations[:, 0, :]
offsets = local_translations.copy()
offsets[:, 0, :] = np.zeros((offsets.shape[0], 3))
```
**PyTorch**
```python
from pymotion.io.bvh import BVH
import pymotion.ops.skeleton_torch as sk
import torch
bvh = BVH()
bvh.load("test.bvh")
local_rotations, local_positions, parents, offsets, end_sites, end_sites_parents = bvh.get_data()
root_dual_quats = sk.to_root_dual_quat(
    torch.from_numpy(local_rotations),
    torch.from_numpy(local_positions[:, 0, :]),
    torch.from_numpy(parents),
    torch.from_numpy(offsets),
)
local_translations, local_rotations = sk.from_root_dual_quat(root_dual_quats, parents)
global_positions = local_translations[:, 0, :]
offsets = local_translations.clone()
offsets[:, 0, :] = torch.zeros((offsets.shape[0], 3))
```
</details>
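For intuition on the dual quaternion encoding itself: a rigid transform with unit rotation quaternion q and translation t maps to a real part q_r = q and a dual part q_d = ½ t ⊗ q_r, and the translation is recovered as the vector part of 2 q_d ⊗ q_r*. A standalone scalar-first NumPy sketch of that identity (not PyMotion's implementation):

```python
import numpy as np

def qmul(a, b):
    # Hamilton product for scalar-first quaternions [w, x, y, z]
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([
        aw * bw - ax * bx - ay * by - az * bz,
        aw * bx + ax * bw + ay * bz - az * by,
        aw * by - ax * bz + ay * bw + az * bx,
        aw * bz + ax * by - ay * bx + az * bw,
    ])

def to_dual_quat(q, t):
    # real part encodes rotation; dual part encodes translation: q_d = 0.5 * t ⊗ q
    t_quat = np.concatenate([[0.0], t])
    return q, 0.5 * qmul(t_quat, q)

def translation_from_dual_quat(q_r, q_d):
    # invert the encoding: t = vector part of 2 * q_d ⊗ conj(q_r)
    conj = q_r * np.array([1.0, -1.0, -1.0, -1.0])
    return 2.0 * qmul(q_d, conj)[1:]

q = np.array([np.cos(np.pi / 8), 0.0, 0.0, np.sin(np.pi / 8)])  # 45 deg about Z
t = np.array([1.0, 2.0, 3.0])
q_r, q_d = to_dual_quat(q, t)
print(translation_from_dual_quat(q_r, q_d))  # ≈ [1, 2, 3]
```

`sk.to_root_dual_quat` applies this kind of encoding per joint, expressed relative to the root.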
<details>
<summary> 6D representation from a BVH file </summary> <br/>
**NumPy**
```python
from pymotion.io.bvh import BVH
import pymotion.rotations.ortho6d as sixd
bvh = BVH()
bvh.load("test.bvh")
local_rotations, _, _, _, _, _ = bvh.get_data()
continuous = sixd.from_quat(local_rotations)
local_rotations = sixd.to_quat(continuous)
```
**PyTorch**
```python
from pymotion.io.bvh import BVH
import pymotion.rotations.ortho6d_torch as sixd
import torch
bvh = BVH()
bvh.load("test.bvh")
local_rotations, _, _, _, _, _ = bvh.get_data()
continuous = sixd.from_quat(torch.from_numpy(local_rotations))
local_rotations = sixd.to_quat(continuous)
```
</details>
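The 6D representation of [Zhou et al. [2019]](https://doi.org/10.1109/CVPR.2019.00589) keeps the first two columns of the rotation matrix and recovers a valid rotation via Gram-Schmidt re-orthonormalization, which is what makes it continuous and robust to noisy network outputs. A standalone NumPy sketch of that idea (not PyMotion's implementation):

```python
import numpy as np

def rotmat_to_6d(r):
    # drop the third column; the first two fully determine the rotation
    return np.concatenate([r[:, 0], r[:, 1]])

def sixd_to_rotmat(d):
    # Gram-Schmidt: re-orthonormalize the two columns, cross product gives the third
    a, b = d[:3], d[3:]
    c0 = a / np.linalg.norm(a)
    b = b - np.dot(c0, b) * c0
    c1 = b / np.linalg.norm(b)
    c2 = np.cross(c0, c1)
    return np.stack([c0, c1, c2], axis=1)

# decoding re-orthonormalizes, so a perturbed encoding still yields a valid rotation
rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
d = rotmat_to_6d(rz) + 1e-3 * np.ones(6)  # small noise, e.g. from a network output
r = sixd_to_rotmat(d)
print(np.allclose(r @ r.T, np.eye(3)))  # True: always a valid rotation
```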
<details>
<summary> Skeleton local rotations from root-centered positions </summary> <br/>
**NumPy**
```python
import numpy as np
from pymotion.io.bvh import BVH
from pymotion.ops.forward_kinematics import fk
from pymotion.ops.skeleton import from_root_positions
bvh = BVH()
bvh.load("test.bvh")
local_rotations, local_positions, parents, offsets, _, _ = bvh.get_data()
pos, _ = fk(local_rotations, np.zeros((local_positions.shape[0], 3)), offsets, parents)
pred_rots = from_root_positions(pos, parents, offsets)
bvh.set_data(pred_rots, local_positions)
bvh.save("test_out.bvh")  # joint positions should be similar to those in test.bvh
```
**PyTorch**
```python
import torch
from pymotion.io.bvh import BVH
from pymotion.ops.forward_kinematics_torch import fk
from pymotion.ops.skeleton_torch import from_root_positions
bvh = BVH()
bvh.load("test.bvh")
local_rotations, local_positions, parents, offsets, _, _ = bvh.get_data()
offsets = torch.from_numpy(offsets)
parents = torch.from_numpy(parents)
pos, _ = fk(
    torch.from_numpy(local_rotations),
    torch.zeros((local_positions.shape[0], 3)),
    offsets,
    parents,
)
pred_rots = from_root_positions(pos, parents, offsets)
bvh.set_data(pred_rots.numpy(), local_positions)
bvh.save("test_out.bvh")  # joint positions should be similar to those in test.bvh
```
</details>
<details>
<summary> Visualize motion in Python </summary> <br/>
```python
from pymotion.render.viewer import Viewer
from pymotion.io.bvh import BVH
from pymotion.ops.forward_kinematics import fk
bvh = BVH()
bvh.load("test.bvh")
local_rotations, local_positions, parents, offsets, _, _ = bvh.get_data()
global_positions = local_positions[:, 0, :] # root joint
pos, rotmats = fk(local_rotations, global_positions, offsets, parents)
viewer = Viewer(use_reloader=True, xy_size=5)
viewer.add_skeleton(pos, parents)
# add additional info using add_sphere(...) and/or add_line(...), examples:
# viewer.add_sphere(sphere_pos, color="green")
# viewer.add_line(start_pos, end_pos, color="green")
viewer.add_floor()
viewer.run()
```
</details>
<details>
<summary> Visualize a pose in Blender </summary> <br/>
```python
import numpy as np
from pymotion.io.bvh import BVH
from pymotion.render.blender import BlenderConnection
with BlenderConnection() as conn:
    conn.clear_scene()
    conn.render_checkerboard_floor()
    conn.render_points(
        np.array([[0, -3, 0], [1, 2, 3]]),
        np.array([[0, 0, 1], [0, 1, 0]]),
        radius=np.array([[0.25], [0.05]]),
    )
    conn.render_orientations(
        np.array([[1, 0, 0, 0], [np.cos(np.pi / 4.0), np.sin(np.pi / 4.0), 0, 0]]),
        np.array([[0, -3, 0], [1, 2, 3]]),
        scale=np.array([[0.5], [0.25]]),
    )
    # BVH files can be rendered directly from a file path
    path = "test.bvh"
    conn.render_bvh_from_path(
        path,
        np.array([0, 0, 1]),
        end_joints=["RightWrist", "LeftWrist", "RightToe", "LeftToe", "Head"],
    )
    # or by using a BVH object
    bvh = BVH()
    path = "test2.bvh"
    bvh.load(path)
    conn.render_bvh(
        bvh,
        np.array([0, 1, 0]),
        end_joints=["RightWrist", "LeftWrist", "RightToe", "LeftToe", "Head"],
    )
```
</details>
## Roadmap
This repository is authored and maintained by [Jose Luis Ponton](https://github.com/JLPM22) as part of his Ph.D.
Features will be added as new operations or rotation representations are needed in the development of research projects. Here is a list of possible future features and improvements:
- Extend documentation and add examples to the description of each function
- Include new animation importers such as FBX
- Include useful operations for data augmentation, such as animation mirroring
- Create an Inverse Kinematics module
## License
This work is licensed under the MIT license. Please see the [LICENSE](LICENSE) file for further details.
Raw data
{
"_id": null,
"home_page": null,
"name": "upc-pymotion",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.0",
"maintainer_email": null,
"keywords": "blender, dual quaternion, forward kinematics, motion, numpy, pytorch, quaternion, rotation matrix, skeleton",
"author": null,
"author_email": "Jose Luis Ponton <jose.luis.ponton@upc.edu>",
"download_url": "https://files.pythonhosted.org/packages/e1/25/70fd472a66a545807b629c300c649f5ac2a5ec6cb36da13ffceb4a131ed8/upc_pymotion-0.2.0.tar.gz",
"platform": null,
"bugtrack_url": null,
"summary": "A Python library for working with motion data in NumPy or PyTorch.",
"version": "0.2.0",
"project_urls": {
"Homepage": "https://github.com/UPC-ViRVIG/pymotion"
},
"split_keywords": [
"blender",
" dual quaternion",
" forward kinematics",
" motion",
" numpy",
" pytorch",
" quaternion",
" rotation matrix",
" skeleton"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "2eb0e76fa81388ed4c59b2f11454cf69893b1a74a5510e9216139a11734b2ef9",
"md5": "571554a8b23c4e67b815a59bacb6918f",
"sha256": "20f2bf98ed935fd4e5e2577289b56178de19a8402c020fe78a8be290b23deb17"
},
"downloads": -1,
"filename": "upc_pymotion-0.2.0-py3-none-any.whl",
"has_sig": false,
"md5_digest": "571554a8b23c4e67b815a59bacb6918f",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.0",
"size": 49246,
"upload_time": "2025-03-09T19:13:55",
"upload_time_iso_8601": "2025-03-09T19:13:55.324393Z",
"url": "https://files.pythonhosted.org/packages/2e/b0/e76fa81388ed4c59b2f11454cf69893b1a74a5510e9216139a11734b2ef9/upc_pymotion-0.2.0-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "e12570fd472a66a545807b629c300c649f5ac2a5ec6cb36da13ffceb4a131ed8",
"md5": "fa8c4490d089bf04e6987de71335592c",
"sha256": "b4a5cca1c17f310b40f8e0a91f3bc95f0b4fa15f7d070a2af761b11f60ff07eb"
},
"downloads": -1,
"filename": "upc_pymotion-0.2.0.tar.gz",
"has_sig": false,
"md5_digest": "fa8c4490d089bf04e6987de71335592c",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.0",
"size": 35901,
"upload_time": "2025-03-09T19:13:56",
"upload_time_iso_8601": "2025-03-09T19:13:56.972342Z",
"url": "https://files.pythonhosted.org/packages/e1/25/70fd472a66a545807b629c300c649f5ac2a5ec6cb36da13ffceb4a131ed8/upc_pymotion-0.2.0.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-03-09 19:13:56",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "UPC-ViRVIG",
"github_project": "pymotion",
"travis_ci": false,
"coveralls": false,
"github_actions": true,
"lcname": "upc-pymotion"
}