upc-pymotion

Name: upc-pymotion
Version: 0.1.4
Summary: A Python library for working with motion data in NumPy or PyTorch.
Homepage: https://github.com/UPC-ViRVIG/pymotion
Author email: Jose Luis Ponton <jose.luis.ponton@upc.edu>
Requires Python: >=3.0
License: MIT License, Copyright (c) 2024 Jose Luis Ponton
Keywords: blender, dual quaternion, forward kinematics, motion, numpy, pytorch, quaternion, rotation matrix, skeleton
Upload time: 2024-04-03 16:42:44
# PyMotion: A Python Library for Motion Data

PyMotion is a Python library that provides various functions for manipulating and processing motion data in NumPy or PyTorch. It is designed to facilitate the development of neural networks for character animation.

Some features of PyMotion are:

- A comprehensive set of quaternion operations and conversions to other rotation representations, such as rotation matrices, axis-angle, Euler angles, and the 6D representation
- A dual quaternion representation for rigid displacements, which can help neural networks better understand poses, as proposed by [Andreou et al. [2022]](https://doi.org/10.1111/cgf.14632) and later adopted by [Ponton et al. [2023]](https://upc-virvig.github.io/SparsePoser/)
- A continuous 6D rotation representation, as introduced by [Zhou et al. [2019]](https://doi.org/10.1109/CVPR.2019.00589)
- A BVH file reader and preprocessor for loading and transforming motion data
- Skeletal operations such as Forward Kinematics for computing global joint positions from local joint rotations
- [Beta] A Plotly-based visualizer for debugging and visualizing character animation directly in Python
- [Beta] A Blender visualizer for debugging and visualizing character animation
- NumPy and PyTorch implementations and tests for all functions

## Contents

1. [Installation](#installation)
2. [Examples](#examples)
3. [Roadmap](#roadmap)
4. [License](#license)

## Installation
1. **[Optional]** Install PyTorch with pip as instructed on its [website](https://pytorch.org/get-started/locally/).

2. Install PyMotion:
```bash
pip install upc-pymotion
```

3. **[Optional]** Install Plotly and Dash for the visualizer:
```bash
pip install upc-pymotion[viewer]
```
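
A quick way to verify the installation is to import one of the modules used below and run a tiny conversion (a minimal sketch; the argument shapes follow the quaternion example in the Examples section):

```python
import numpy as np
import pymotion.rotations.quat as quat

# One 90-degree rotation around the Y axis (angles shape [1, 1], axes shape [1, 3])
q = quat.from_angle_axis(np.array([[np.pi / 2]]), np.array([[0.0, 1.0, 0.0]]))
print(q.shape)  # a single quaternion with 4 components
```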

## Examples

<details>
<summary> Read and save a BVH file </summary>

```python
import numpy as np
from pymotion.io.bvh import BVH

bvh = BVH()
bvh.load("test.bvh")

print(bvh.data["names"])
# Example Output: ['Hips', 'LeftHip', 'LeftKnee', 'LeftAnkle', 'LeftToe', 'RightHip', 'RightKnee', 'RightAnkle', 'RightToe', 'Chest', 'Chest3', 'Chest4', 'Neck', 'Head', 'LeftCollar', 'LeftShoulder', 'LeftElbow', 'LeftWrist', 'RightCollar', 'RightShoulder', 'RightElbow', 'RightWrist']


# Move root joint to (0, 0, 0)
local_rotations, local_positions, parents, offsets, end_sites, end_sites_parents = bvh.get_data()
local_positions[:, 0, :] = np.zeros((local_positions.shape[0], 3))
bvh.set_data(local_rotations, local_positions)

# Scale the skeleton
bvh.set_scale(0.75)

bvh.save("test_out.bvh")
```
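
The arrays returned by `get_data()` are frame-major, as the root indexing above suggests; their exact sizes depend on the loaded file, and a quick way to inspect them is (a minimal sketch continuing the snippet above):

```python
# Shapes follow a (frames, joints, components) layout
print(local_rotations.shape)  # local joint rotations as quaternions, e.g. (frames, joints, 4)
print(local_positions.shape)  # local joint positions, e.g. (frames, joints, 3)
print(parents)                # parent index of each joint in the skeleton hierarchy
```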

</details>

<details>
<summary> Compute world positions and rotations from a BVH file </summary> <br/>

**NumPy**
```python
from pymotion.io.bvh import BVH
from pymotion.ops.forward_kinematics import fk

bvh = BVH()
bvh.load("test.bvh")

local_rotations, local_positions, parents, offsets, end_sites, end_sites_parents = bvh.get_data()
global_positions = local_positions[:, 0, :]  # root joint
pos, rotmats = fk(local_rotations, global_positions, offsets, parents)
```

**PyTorch**
```python
from pymotion.io.bvh import BVH
from pymotion.ops.forward_kinematics_torch import fk
import torch

bvh = BVH()
bvh.load("test.bvh")

local_rotations, local_positions, parents, offsets, end_sites, end_sites_parents = bvh.get_data()
global_positions = local_positions[:, 0, :]  # root joint
pos, rotmats = fk(
    torch.from_numpy(local_rotations),
    torch.from_numpy(global_positions),
    torch.from_numpy(offsets),
    torch.from_numpy(parents),
)
```
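
The global joint positions returned by `fk` can be post-processed with plain tensor operations. For example, finite-difference joint velocities (a sketch assuming `pos` has shape `[frames, joints, 3]` and a known frame time; the same code works with the NumPy variant above):

```python
frame_time = 1.0 / 30.0  # assumption: 30 fps; use the frame time stored in the BVH file if available
velocities = (pos[1:] - pos[:-1]) / frame_time  # shape [frames - 1, joints, 3]
```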

</details>

<details>
<summary> Quaternion conversion to other representations </summary> <br/>

**NumPy**
```python
import pymotion.rotations.quat as quat
import numpy as np

angles = np.array([np.pi / 2, np.pi, np.pi / 4])[..., np.newaxis]
# angles.shape = [3, 1]
axes = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1]])
# axes.shape = [3, 3]

q = quat.from_angle_axis(angles, axes)

rotmats = quat.to_matrix(q)

euler = quat.to_euler(q, np.array([["x", "y", "z"], ["z", "y", "x"], ["y", "z", "x"]]))
euler_degrees = np.degrees(euler)

scaled_axis = quat.to_scaled_angle_axis(q)
```

**PyTorch**
```python
import pymotion.rotations.quat_torch as quat
import numpy as np
import torch

angles = torch.Tensor([torch.pi / 2, torch.pi, torch.pi / 4]).unsqueeze(-1)
# angles.shape = [3, 1]
axes = torch.Tensor([[1, 0, 0], [0, 1, 0], [0, 0, 1]])
# axes.shape = [3, 3]

q = quat.from_angle_axis(angles, axes)

rotmats = quat.to_matrix(q)

euler = quat.to_euler(q, np.array([["x", "y", "z"], ["z", "y", "x"], ["y", "z", "x"]]))
euler_degrees = torch.rad2deg(euler)

scaled_axis = quat.to_scaled_angle_axis(q)
```
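
As a sanity check (plain PyTorch, not part of the library API; it assumes `rotmats` stacks 3x3 matrices in its last two dimensions), the resulting rotation matrices should be orthonormal with determinant +1:

```python
# Orthonormality: R @ R^T should be the identity for every matrix in the batch
assert (rotmats @ rotmats.transpose(-1, -2) - torch.eye(3)).abs().max() < 1e-5
# Proper rotations have determinant +1
assert torch.allclose(torch.linalg.det(rotmats), torch.ones(rotmats.shape[:-2]), atol=1e-5)
```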

</details>

<details>
<summary> Root-centered dual quaternions from a BVH file </summary> <br/>

**NumPy**
```python
from pymotion.io.bvh import BVH
import pymotion.ops.skeleton as sk
import numpy as np

bvh = BVH()
bvh.load("test.bvh")

local_rotations, local_positions, parents, offsets, end_sites, end_sites_parents = bvh.get_data()

root_dual_quats = sk.to_root_dual_quat(
    local_rotations, local_positions[:, 0, :], parents, offsets
)

local_translations, local_rotations = sk.from_root_dual_quat(root_dual_quats, parents)
global_positions = local_translations[:, 0, :]
offsets = local_translations.copy()
offsets[:, 0, :] = np.zeros((offsets.shape[0], 3))
```

**PyTorch**
```python
from pymotion.io.bvh import BVH
import pymotion.ops.skeleton_torch as sk
import torch

bvh = BVH()
bvh.load("test.bvh")

local_rotations, local_positions, parents, offsets, end_sites, end_sites_parents = bvh.get_data()

root_dual_quats = sk.to_root_dual_quat(
    torch.from_numpy(local_rotations),
    torch.from_numpy(local_positions[:, 0, :]),
    torch.from_numpy(parents),
    torch.from_numpy(offsets),
)

local_translations, local_rotations = sk.from_root_dual_quat(root_dual_quats, parents)
global_positions = local_translations[:, 0, :]
offsets = local_translations.clone()
offsets[:, 0, :] = torch.zeros((offsets.shape[0], 3))
```
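
Root-centered dual quaternions are typically flattened into per-frame feature vectors before being fed to a network; a minimal sketch, assuming `root_dual_quats` has shape `[frames, joints, 8]` (eight components per dual quaternion):

```python
frames = root_dual_quats.shape[0]
features = root_dual_quats.reshape(frames, -1)  # shape [frames, joints * 8]
```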

</details>

<details>
<summary> 6D representation from a BVH file </summary> <br/>

**NumPy**
```python
from pymotion.io.bvh import BVH
import pymotion.rotations.ortho6d as sixd

bvh = BVH()
bvh.load("test.bvh")

local_rotations, _, _, _, _, _ = bvh.get_data()

continuous = sixd.from_quat(local_rotations)

local_rotations = sixd.to_quat(continuous)
```

**PyTorch**
```python
from pymotion.io.bvh import BVH
import pymotion.rotations.ortho6d_torch as sixd
import torch

bvh = BVH()
bvh.load("test.bvh")

local_rotations, _, _, _, _, _ = bvh.get_data()

continuous = sixd.from_quat(torch.from_numpy(local_rotations))

local_rotations = sixd.to_quat(continuous)
```
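
Because a quaternion `q` and its negation `-q` encode the same rotation, a round-trip check is best done by comparing rotation matrices rather than raw quaternions; a minimal sketch reusing the quaternion module shown earlier:

```python
import pymotion.rotations.quat_torch as quat

original_quats = torch.from_numpy(bvh.get_data()[0])
assert torch.allclose(
    quat.to_matrix(local_rotations).float(), quat.to_matrix(original_quats).float(), atol=1e-5
)
```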

</details>

<details>
<summary> Visualize motion in Python </summary> <br/>

```python
from pymotion.render.viewer import Viewer
from pymotion.io.bvh import BVH
from pymotion.ops.forward_kinematics import fk

bvh = BVH()
bvh.load("test.bvh")

local_rotations, local_positions, parents, offsets, _, _ = bvh.get_data()
global_positions = local_positions[:, 0, :]  # root joint
pos, rotmats = fk(local_rotations, global_positions, offsets, parents)

viewer = Viewer(use_reloader=True, xy_size=5)
viewer.add_skeleton(pos, parents)
# add additional info using add_sphere(...) and/or add_line(...), examples:
# viewer.add_sphere(sphere_pos, color="green")
# viewer.add_line(start_pos, end_pos, color="green")
viewer.add_floor()
viewer.run()
```

</details>

<details>
<summary> Visualize a pose in Blender </summary> <br/>

1. Open the Text Editor window in Blender
2. Open the file ```blender/pymotion_blender.py```, which can be found in this repository
3. Run the script (Blender will freeze while the server is running)
![Blender script image](docs/img/blender_script.png)

4. Run the following Python code in a separate environment:
```python
from pymotion.io.bvh import BVH
from pymotion.ops.forward_kinematics import fk
from pymotion.visualizer.blender import BlenderConnection

bvh = BVH()
bvh.load("test.bvh")

local_rotations, local_positions, parents, offsets, end_sites, end_sites_parents = bvh.get_data()
global_positions = local_positions[:, 0, :]  # root joint
pos, _ = fk(local_rotations, global_positions, offsets, parents)

# Render the joint positions of a single frame
frame = 0
conn = BlenderConnection("127.0.0.1", 2222)
conn.render_points(pos[frame])
conn.close()
```

5. Press the ESC key in Blender to stop the server

</details>

## Roadmap

This repository is authored and maintained by [Jose Luis Ponton](https://github.com/JLPM22) as part of his Ph.D.

Features will be added as new operations or rotation representations are needed in the development of research projects. Here is a list of possible features and improvements for the future:

- Extend the documentation and add examples to the description of each function
- Include new animation importers such as FBX
- Improve the usability of the Blender visualization workflow
- Include useful operations for data augmentation such as animation mirroring
- Create an Inverse Kinematics module

## License

This work is licensed under the MIT license. Please see the [LICENSE](LICENSE) file for further details.

            
