# adam
[![adam](https://github.com/ami-iit/ADAM/actions/workflows/tests.yml/badge.svg?branch=main)](https://github.com/ami-iit/ADAM/actions/workflows/tests.yml)
[![](https://img.shields.io/badge/License-BSD--3--Clause-blue.svg)](https://github.com/ami-iit/ADAM/blob/main/LICENSE)
**Automatic Differentiation for rigid-body-dynamics AlgorithMs**
**adam** implements a collection of algorithms for computing rigid-body dynamics for **floating-base** robots, in _mixed_ and _body-fixed_ representations (see [Traversaro's A Unified View of the Equations of Motion used for Control Design of Humanoid Robots](https://www.researchgate.net/publication/312200239_A_Unified_View_of_the_Equations_of_Motion_used_for_Control_Design_of_Humanoid_Robots)) using:
- [Jax](https://github.com/google/jax)
- [CasADi](https://web.casadi.org/)
- [PyTorch](https://github.com/pytorch/pytorch)
- [NumPy](https://numpy.org/)
**adam** employs the **automatic differentiation** capabilities of these frameworks to compute, if needed, gradients, Jacobians, and Hessians of rigid-body dynamics quantities. This approach enables the design of optimal control and reinforcement learning strategies in robotics.
**adam** is based on Roy Featherstone's Rigid Body Dynamics Algorithms.
### Table of contents
- [🐍 Dependencies](#-dependencies)
- [💾 Installation](#-installation)
  - [🐍 Installation with pip](#-installation-with-pip)
  - [📦 Installation with conda](#-installation-with-conda)
    - [Installation from conda-forge package](#installation-from-conda-forge-package)
  - [🔨 Installation from repo](#-installation-from-repo)
- [🚀 Usage](#-usage)
  - [Jax interface](#jax-interface)
  - [CasADi interface](#casadi-interface)
  - [PyTorch interface](#pytorch-interface)
  - [PyTorch Batched interface](#pytorch-batched-interface)
- [🦸‍♂️ Contributing](#️-contributing)
- [Todo](#todo)
## 🐍 Dependencies
- [`python3`](https://wiki.python.org/moin/BeginnersGuide)
The other requirements are:
- `urdf_parser_py`
- `jax`
- `casadi`
- `pytorch`
- `numpy`
- `jax2torch`
They will be installed automatically during the installation step.
## 💾 Installation
The installation can be done either using the Python provided by apt (on Debian-based distros) or via conda (on Linux and macOS).
### 🐍 Installation with pip
Install `python3`, if it is not already installed (on **Ubuntu 20.04**):
```bash
sudo apt install python3.8
```
Create a [virtual environment](https://docs.python.org/3/library/venv.html#venv-def), if you prefer. For example:
```bash
pip install virtualenv
python3 -m venv your_virtual_env
source your_virtual_env/bin/activate
```
Inside the virtual environment, install the library from pip:
- Install **Jax** interface:
```bash
pip install adam-robotics[jax]
```
- Install **CasADi** interface:
```bash
pip install adam-robotics[casadi]
```
- Install **PyTorch** interface:
```bash
pip install adam-robotics[pytorch]
```
- Install **ALL** interfaces:
```bash
pip install adam-robotics[all]
```
If you want the latest version:
```bash
pip install adam-robotics[selected-interface]@git+https://github.com/ami-iit/ADAM
```
or clone the repo and install:
```bash
git clone https://github.com/ami-iit/adam.git
cd adam
pip install .[selected-interface]
```
### 📦 Installation with conda
#### Installation from conda-forge package
```bash
mamba create -n adamenv -c conda-forge adam-robotics
```
If you want to use `jax` or `pytorch`, install the corresponding package as well, as shown below.
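For example, to add both backends to the environment created above (a minimal sketch; pick only the packages you need):

```bash
# hypothetical example: add the JAX and PyTorch backends to the adamenv environment
mamba install -n adamenv -c conda-forge jax pytorch
```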
> [!NOTE]
> Check also the conda JAX installation guide [here](https://jax.readthedocs.io/en/latest/installation.html#conda-community-supported)
### 🔨 Installation from repo
Install the required dependencies in a conda environment:
- **Jax** interface dependencies:
```bash
mamba create -n adamenv -c conda-forge jax numpy lxml prettytable matplotlib urdfdom-py
```
- **CasADi** interface dependencies:
```bash
mamba create -n adamenv -c conda-forge casadi numpy lxml prettytable matplotlib urdfdom-py
```
- **PyTorch** interface dependencies:
```bash
mamba create -n adamenv -c conda-forge pytorch numpy lxml prettytable matplotlib urdfdom-py jax2torch
```
- **ALL** interfaces dependencies:
```bash
mamba create -n adamenv -c conda-forge jax casadi pytorch numpy lxml prettytable matplotlib urdfdom-py jax2torch
```
Activate the environment, clone the repo and install the library:
```bash
mamba activate adamenv
git clone https://github.com/ami-iit/adam.git
cd adam
pip install --no-deps .
```
## 🚀 Usage
The following are small snippets showing how to use **adam**. More examples are coming!
Also have a look at the `tests` folder.
### Jax interface
> [!NOTE]
> Check also the Jax installation guide [here](https://jax.readthedocs.io/en/latest/installation.html#)
```python
import adam
from adam.jax import KinDynComputations
import icub_models
import numpy as np
import jax.numpy as jnp
from jax import jit, vmap
# you can use icub-models (https://github.com/robotology/icub-models) to retrieve the urdf
model_path = icub_models.get_model_file("iCubGazeboV2_5")
# The joint list
joints_name_list = [
    'torso_pitch', 'torso_roll', 'torso_yaw', 'l_shoulder_pitch',
    'l_shoulder_roll', 'l_shoulder_yaw', 'l_elbow', 'r_shoulder_pitch',
    'r_shoulder_roll', 'r_shoulder_yaw', 'r_elbow', 'l_hip_pitch', 'l_hip_roll',
    'l_hip_yaw', 'l_knee', 'l_ankle_pitch', 'l_ankle_roll', 'r_hip_pitch',
    'r_hip_roll', 'r_hip_yaw', 'r_knee', 'r_ankle_pitch', 'r_ankle_roll'
]
kinDyn = KinDynComputations(model_path, joints_name_list)
# choose the representation, e.g. the body-fixed representation
kinDyn.set_frame_velocity_representation(adam.Representations.BODY_FIXED_REPRESENTATION)
# or the mixed representation (the default)
kinDyn.set_frame_velocity_representation(adam.Representations.MIXED_REPRESENTATION)
w_H_b = np.eye(4)
joints = np.ones(len(joints_name_list))
M = kinDyn.mass_matrix(w_H_b, joints)
print(M)
w_H_f = kinDyn.forward_kinematics('frame_name', w_H_b, joints)
# IMPORTANT! Jax interface functions can be slow to execute! We suggest jitting them.
# For example:
def frame_forward_kinematics(w_H_b, joints):
    # wrapping the call fixes the frame name, since a str is not a valid JAX type
    return kinDyn.forward_kinematics('frame_name', w_H_b, joints)
jitted_frame_fk = jit(frame_forward_kinematics)
w_H_f = jitted_frame_fk(w_H_b, joints)
# In the same way, the functions can also be vmapped
vmapped_frame_fk = vmap(frame_forward_kinematics, in_axes=(0, 0))
# which can also be jitted
jitted_vmapped_frame_fk = jit(vmapped_frame_fk)
# and called on a batch of data
joints_batch = jnp.tile(joints, (1024, 1))
w_H_b_batch = jnp.tile(w_H_b, (1024, 1, 1))
w_H_f_batch = jitted_vmapped_frame_fk(w_H_b_batch, joints_batch)
```
> [!NOTE]
> The first call of a jitted function can be slow, since JAX needs to compile it. Subsequent calls will be faster!
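Since the Jax interface returns plain JAX arrays, it also composes with the other JAX transformations. The following is a minimal sketch (not taken from the library examples) that reuses `kinDyn`, `w_H_b`, and `joints` from the snippet above to get the derivative of the mass matrix with respect to the joint positions via `jax.jacfwd`:

```python
from jax import jacfwd

# a sketch: differentiate the mass matrix with respect to the joint positions,
# reusing kinDyn, w_H_b and joints defined in the snippet above
def mass_matrix_of_joints(joints):
    return kinDyn.mass_matrix(w_H_b, joints)

# for a floating-base model with n joints, dM is expected to have shape (6 + n, 6 + n, n)
dM = jit(jacfwd(mass_matrix_of_joints))(joints)
print(dM.shape)
```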
### CasADi interface
```python
import adam
from adam.casadi import KinDynComputations
import casadi as cs
import icub_models
import numpy as np

# you can use icub-models (https://github.com/robotology/icub-models) to retrieve the urdf
model_path = icub_models.get_model_file("iCubGazeboV2_5")
# The joint list
joints_name_list = [
    'torso_pitch', 'torso_roll', 'torso_yaw', 'l_shoulder_pitch',
    'l_shoulder_roll', 'l_shoulder_yaw', 'l_elbow', 'r_shoulder_pitch',
    'r_shoulder_roll', 'r_shoulder_yaw', 'r_elbow', 'l_hip_pitch', 'l_hip_roll',
    'l_hip_yaw', 'l_knee', 'l_ankle_pitch', 'l_ankle_roll', 'r_hip_pitch',
    'r_hip_roll', 'r_hip_yaw', 'r_knee', 'r_ankle_pitch', 'r_ankle_roll'
]
kinDyn = KinDynComputations(model_path, joints_name_list)
# choose the representation, e.g. the body-fixed representation
kinDyn.set_frame_velocity_representation(adam.Representations.BODY_FIXED_REPRESENTATION)
# or the mixed representation (the default)
kinDyn.set_frame_velocity_representation(adam.Representations.MIXED_REPRESENTATION)
w_H_b = np.eye(4)
joints = np.ones(len(joints_name_list))
M = kinDyn.mass_matrix_fun()
print(M(w_H_b, joints))
# If you want to use the symbolic version
w_H_b = cs.SX.eye(4)
joints = cs.SX.sym('joints', len(joints_name_list))
M = kinDyn.mass_matrix_fun()
print(M(w_H_b, joints))
# This also works with casadi.MX
w_H_b = cs.MX.eye(4)
joints = cs.MX.sym('joints', len(joints_name_list))
M = kinDyn.mass_matrix_fun()
print(M(w_H_b, joints))
```
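The objects returned by the `*_fun()` methods are standard `casadi.Function`s, so they can be embedded in larger CasADi expressions and solvers. Below is a minimal sketch (a toy problem, not taken from the library examples) that reuses `kinDyn` and `joints_name_list` from above inside the `Opti` stack:

```python
# a toy sketch: embed the mass-matrix Function in a CasADi Opti problem
opti = cs.Opti()
q = opti.variable(len(joints_name_list))  # joint positions as decision variables
M_fun = kinDyn.mass_matrix_fun()
M_q = M_fun(np.eye(4), q)                 # base at the identity, symbolic joints
opti.minimize(cs.trace(M_q))              # toy cost built from the mass matrix
opti.subject_to(opti.bounded(-1, q, 1))   # keep the joints in a box
opti.solver('ipopt')                      # assumes IPOPT is available
sol = opti.solve()
print(sol.value(q))
```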
### PyTorch interface
```python
import adam
from adam.pytorch import KinDynComputations
import icub_models
import numpy as np
# you can use icub-models (https://github.com/robotology/icub-models) to retrieve the urdf
model_path = icub_models.get_model_file("iCubGazeboV2_5")
# The joint list
joints_name_list = [
    'torso_pitch', 'torso_roll', 'torso_yaw', 'l_shoulder_pitch',
    'l_shoulder_roll', 'l_shoulder_yaw', 'l_elbow', 'r_shoulder_pitch',
    'r_shoulder_roll', 'r_shoulder_yaw', 'r_elbow', 'l_hip_pitch', 'l_hip_roll',
    'l_hip_yaw', 'l_knee', 'l_ankle_pitch', 'l_ankle_roll', 'r_hip_pitch',
    'r_hip_roll', 'r_hip_yaw', 'r_knee', 'r_ankle_pitch', 'r_ankle_roll'
]
kinDyn = KinDynComputations(model_path, joints_name_list)
# choose the representation, e.g. the body-fixed representation
kinDyn.set_frame_velocity_representation(adam.Representations.BODY_FIXED_REPRESENTATION)
# or the mixed representation (the default)
kinDyn.set_frame_velocity_representation(adam.Representations.MIXED_REPRESENTATION)
w_H_b = np.eye(4)
joints = np.ones(len(joints_name_list))
M = kinDyn.mass_matrix(w_H_b, joints)
print(M)
```
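Since the interface is built on PyTorch operations, gradients can in principle be obtained with `torch.autograd`. The following is only a sketch under the assumption that `kinDyn.mass_matrix` accepts `torch.Tensor` inputs (of a compatible dtype) and propagates gradients through them:

```python
import torch

# assumption: the interface accepts torch tensors and tracks gradients through them
joints_t = torch.ones(len(joints_name_list), requires_grad=True)
w_H_b_t = torch.eye(4)

M = kinDyn.mass_matrix(w_H_b_t, joints_t)
loss = M.trace()      # toy scalar built from the mass matrix
loss.backward()
print(joints_t.grad)  # gradient of the toy loss w.r.t. the joint positions
```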
### PyTorch Batched interface
> [!NOTE]
> When using this interface, the first call of each function can be slow, since JAX (used under the hood via `jax2torch`) needs to compile it. Subsequent calls will be faster!
```python
import adam
from adam.pytorch import KinDynComputationsBatch
import icub_models
import numpy as np
import torch

# you can use icub-models (https://github.com/robotology/icub-models) to retrieve the urdf
model_path = icub_models.get_model_file("iCubGazeboV2_5")
# The joint list
joints_name_list = [
    'torso_pitch', 'torso_roll', 'torso_yaw', 'l_shoulder_pitch',
    'l_shoulder_roll', 'l_shoulder_yaw', 'l_elbow', 'r_shoulder_pitch',
    'r_shoulder_roll', 'r_shoulder_yaw', 'r_elbow', 'l_hip_pitch', 'l_hip_roll',
    'l_hip_yaw', 'l_knee', 'l_ankle_pitch', 'l_ankle_roll', 'r_hip_pitch',
    'r_hip_roll', 'r_hip_yaw', 'r_knee', 'r_ankle_pitch', 'r_ankle_roll'
]
kinDyn = KinDynComputationsBatch(model_path, joints_name_list)
# choose the representation, e.g. the body-fixed representation
kinDyn.set_frame_velocity_representation(adam.Representations.BODY_FIXED_REPRESENTATION)
# or the mixed representation (the default)
kinDyn.set_frame_velocity_representation(adam.Representations.MIXED_REPRESENTATION)
w_H_b = np.eye(4)
joints = np.ones(len(joints_name_list))
num_samples = 1024
w_H_b_batch = torch.tensor(np.tile(w_H_b, (num_samples, 1, 1)), dtype=torch.float32)
joints_batch = torch.tensor(np.tile(joints, (num_samples, 1)), dtype=torch.float32)
M = kinDyn.mass_matrix(w_H_b_batch, joints_batch)
w_H_f = kinDyn.forward_kinematics('frame_name', w_H_b_batch, joints_batch)
```
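As a quick sanity check (assuming the outputs are batched along the first dimension, consistently with the batched inputs above), for a floating-base model with `n` joints you can expect shapes like:

```python
n = len(joints_name_list)
print(M.shape)      # expected: (num_samples, 6 + n, 6 + n)
print(w_H_f.shape)  # expected: (num_samples, 4, 4)
```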
## 🦸‍♂️ Contributing
**adam** is an open-source project. Contributions are very welcome!
Open an issue with your feature request or if you spot a bug. Pull requests are also very welcome! :rocket:
> [!WARNING]
> REPOSITORY UNDER DEVELOPMENT! We cannot guarantee a stable API.
## Todo
- [x] Center of Mass position
- [x] Jacobians
- [x] Forward kinematics
- [x] Mass Matrix via CRBA
- [x] Centroidal Momentum Matrix via CRBA
- [x] Recursive Newton-Euler algorithm (accelerations are not yet handled, since the algorithm is currently used only to compute the bias force)
- [ ] Articulated Body algorithm