![CI](https://github.com/dfki-ric/movement_primitives/actions/workflows/python-package.yml/badge.svg)
[![codecov](https://codecov.io/gh/dfki-ric/movement_primitives/branch/main/graph/badge.svg?token=EFHUC81DBL)](https://codecov.io/gh/dfki-ric/movement_primitives)
[![Paper DOI](https://joss.theoj.org/papers/10.21105/joss.06695/status.svg)](https://doi.org/10.21105/joss.06695)
[![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.6491361.svg)](https://doi.org/10.5281/zenodo.6491361)

# Movement Primitives

> Dynamical movement primitives (DMPs), probabilistic movement primitives
> (ProMPs), and spatially coupled bimanual DMPs for imitation learning.

Movement primitives are a common representation of movements in robotics for
imitation learning, reinforcement learning, and black-box optimization of
behaviors. There are many types and variations. The Python library
`movement_primitives` focuses on imitation learning, generalization, and
adaptation of movement primitives in Cartesian space. It implements dynamical
movement primitives, probabilistic movement primitives, as well as Cartesian
and dual Cartesian movement primitives with coupling terms to constrain
relative movements in bimanual manipulation. They are implemented in Cython to
speed up online execution and batch processing in an offline setting. In
addition, the library provides tools for data analysis and movement evaluation.
It can be installed directly from
[PyPI](https://pypi.org/project/movement-primitives/).

<img src="https://raw.githubusercontent.com/dfki-ric/movement_primitives/main/doc/source/_static/summary.png" width="100%" />
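
As a quick impression of the API, the following minimal sketch imitates a
synthetic demonstration with a DMP and adapts it to a new goal (class and
argument names follow the API documentation linked below):

```python
import numpy as np
from movement_primitives.dmp import DMP

# synthetic 2D demonstration: 101 samples over one second
T = np.linspace(0.0, 1.0, 101)
Y = np.column_stack((np.cos(np.pi * T), np.sin(np.pi * T)))

dmp = DMP(n_dims=2, execution_time=1.0, dt=0.01, n_weights_per_dim=10)
dmp.imitate(T, Y)                            # learn the forcing term
dmp.configure(goal_y=np.array([-1.0, 0.5]))  # adapt to a new goal
T_new, Y_new = dmp.open_loop()               # roll out the adapted DMP
```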

## Content

* [Statement of Need](#statement-of-need)
* [Features](#features)
* [API Documentation](#api-documentation)
* [Install Library](#install-library)
* [Examples](#examples)
* [Build API Documentation](#build-api-documentation)
* [Test](#test)
* [Contributing](#contributing)
* [Non-public Extensions](#non-public-extensions)
* [Related Publications](#related-publications)
* [Citation](#citation)
* [Funding](#funding)

## Statement of Need

Movement primitives are a common group of policy representations in robotics.
They are able to represent complex movement patterns, allow temporal and
spatial modification, offer stability guarantees, and are suitable for
imitation learning without complicated hyperparameter tuning. These are
advantages over general function approximators like neural networks. Movement
primitives are white-box models for movement generation and allow control of
several aspects of the movement. Some types of dynamical movement
primitives allow direct control of the goal in state space, the final
velocity, or the relative pose of two robotic end-effectors. Probabilistic
movement primitives capture distributions of movements adequately and allow
conditioning in state space and blending of multiple movements. The main
disadvantage of movement primitives in comparison to general function
approximators is that they are limited in their capacity to represent behavior
that takes into account complex sensor data during execution. Nevertheless,
various types of movement primitives have proven to be a reliable and effective
tool in robot learning. A reliable tool deserves a similarly reliable open
source implementation. However, there are only a few actively maintained,
documented, and easy-to-use implementations. One of these is the library
*movement_primitives*. It combines several types of dynamical movement
primitives and probabilistic movement primitives in a single library with
a focus on Cartesian and bimanual movements.

## Features

* Dynamical Movement Primitives (DMPs) for
    * positions (with fast Runge-Kutta integration)
    * Cartesian position and orientation (with fast Cython implementation)
    * Dual Cartesian position and orientation (with fast Cython implementation)
* Coupling terms for synchronization of position and/or orientation of dual Cartesian DMPs
* Propagation of DMP weight distribution to state space distribution
* Probabilistic Movement Primitives (ProMPs)

<img src="https://raw.githubusercontent.com/dfki-ric/movement_primitives/main/doc/source/_static/dual_cart_dmp_rh5_with_panel.gif" height="200px" /><img src="https://raw.githubusercontent.com/dfki-ric/movement_primitives/main/doc/source/_static/dmp_ur5_minimum_jerk.gif" height="200px" />

Left: Example of dual Cartesian DMP with [RH5 Manus](https://robotik.dfki-bremen.de/en/research/robot-systems/rh5-manus/).
Right: Example of joint space DMP with UR5.

## API Documentation

The API documentation is available
[here](https://dfki-ric.github.io/movement_primitives/).

## Install Library

This library requires Python 3.6 or later; pip is recommended for the
installation. In the following instructions, we assume that the command
`python` refers to Python 3. If you use the system's Python version, you
might have to add the flag `--user` to any installation command.

### PyPI (Recommended)

I recommend installing the library via pip from the Python Package Index
(PyPI):

```bash
python -m pip install movement_primitives[all]
```

If you don't want to have all dependencies installed, just omit `[all]`.

This will install the latest release. If you want to install the latest
development version, you have to install from git.

### Git + Editable Mode

Editable mode means that you don't have to reinstall the library after
editing the source code: changes are directly available in the installed
library because pip links to your working copy.

You can clone the git repository and install it in editable mode with pip:

```bash
git clone https://github.com/dfki-ric/movement_primitives.git
python -m pip install -e .[all]
```

If you don't want to have all dependencies installed, just omit `[all]`.

### Git

Alternatively, you can install the library and its dependencies without pip
from the git repository:

```bash
git clone https://github.com/dfki-ric/movement_primitives.git
python setup.py install
```

### Build Cython Extension

You could also just build the Cython extension with

```bash
python setup.py build_ext --inplace
```

### Dependencies

An alternative way to install the dependencies is via the `requirements.txt`
file in the main folder of the git repository:

```bash
python -m pip install -r requirements.txt
```

## Examples

You will find a lot of examples in the subfolder
[`examples/`](https://github.com/dfki-ric/movement_primitives/tree/main/examples).
Here are just some highlights to showcase the library.

### Potential Field of 2D DMP

<img src="https://raw.githubusercontent.com/dfki-ric/movement_primitives/main/doc/source/_static/dmp_potential_field.png" width="800px" />

A Dynamical Movement Primitive defines a potential field that superimposes
several components: transformation system (goal-directed movement), forcing
term (learned shape), and coupling terms (e.g., obstacle avoidance).

[Script](https://github.com/dfki-ric/movement_primitives/blob/main/examples/plot_dmp_potential_field.py)
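
One of these components, the coupling term, can be added at rollout time. A
sketch under the assumption that `CouplingTermObstacleAvoidance2D` takes the
obstacle position as its first constructor argument:

```python
import numpy as np
from movement_primitives.dmp import DMP, CouplingTermObstacleAvoidance2D

dmp = DMP(n_dims=2, execution_time=1.0, dt=0.01)
dmp.configure(start_y=np.zeros(2), goal_y=np.ones(2))

# obstacle halfway between start and goal; the coupling term adds a
# repulsive acceleration during the rollout
ct = CouplingTermObstacleAvoidance2D(np.array([0.5, 0.5]))
T, Y = dmp.open_loop(coupling_term=ct)
```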

### DMP with Final Velocity

<img src="https://raw.githubusercontent.com/dfki-ric/movement_primitives/main/doc/source/_static/dmp_with_final_velocity.png" width="800px" />

Not all DMP formulations allow a final velocity > 0. Here we analyze the
effect of different final velocities in a variation of the DMP formulation
that allows setting the final velocity.

[Script](https://github.com/dfki-ric/movement_primitives/blob/main/examples/plot_dmp_with_final_velocity.py)
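
A minimal sketch of this formulation, assuming the `DMPWithFinalVelocity`
class from the API documentation and a synthetic demonstration:

```python
import numpy as np
from movement_primitives.dmp import DMPWithFinalVelocity

T = np.linspace(0.0, 1.0, 101)
Y = np.column_stack((np.sin(np.pi * T), T ** 2))

dmp = DMPWithFinalVelocity(n_dims=2, execution_time=1.0)
dmp.imitate(T, Y)
dmp.configure(goal_yd=np.array([0.5, 0.5]))  # request a nonzero final velocity
T_new, Y_new = dmp.open_loop()
```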

### ProMPs

<img src="https://raw.githubusercontent.com/dfki-ric/movement_primitives/main/doc/source/_static/promp_lasa.png" width="800px" />

The LASA Handwriting dataset learned with ProMPs. The dataset consists of
2D handwriting motions. The first and third columns of the plot show
demonstrations and the second and fourth columns show the imitated ProMPs
with a 1-sigma interval.

[Script](https://github.com/dfki-ric/movement_primitives/blob/main/examples/plot_promp_lasa.py)
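
A sketch of the imitation step with synthetic demonstrations instead of the
LASA data (class and method names as in the API documentation):

```python
import numpy as np
from movement_primitives.promp import ProMP

n_demos, n_steps = 10, 101
T = np.linspace(0.0, 1.0, n_steps)
Ts = np.tile(T, (n_demos, 1))            # one time array per demonstration
rng = np.random.default_rng(0)
Ys = (np.sin(np.pi * Ts)                 # noisy variants of one 1D motion
      + 0.05 * rng.normal(size=(n_demos, n_steps)))[:, :, np.newaxis]

promp = ProMP(n_dims=1, n_weights_per_dim=10)
promp.imitate(Ts, Ys)
mean = promp.mean_trajectory(T)          # for plotting the mean
var = promp.var_trajectory(T)            # for the 1-sigma band
```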

### Conditional ProMPs

<img src="https://raw.githubusercontent.com/dfki-ric/movement_primitives/main/doc/source/_static/conditional_promps.png" width="800px" />

Probabilistic Movement Primitives (ProMPs) define distributions over
trajectories that can be conditioned on viapoints. In this example, we
plot the resulting posterior distribution after conditioning on varying
start positions.

[Script](https://github.com/dfki-ric/movement_primitives/blob/main/examples/plot_conditional_promp.py)
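
Conditioning continues the ProMP sketch above; I assume `condition_position`
returns a new, conditioned ProMP as described in the API documentation:

```python
import numpy as np  # promp and T as defined in the ProMP sketch above

# condition on a start position of 0.2 at t = 0
cpromp = promp.condition_position(np.array([0.2]), t=0.0)
mean_cond = cpromp.mean_trajectory(T)
```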

### Cartesian DMPs

<img src="https://raw.githubusercontent.com/dfki-ric/movement_primitives/main/doc/source/_static/cart_dmp_ur5.png" width="100%" />

A trajectory is created manually, imitated with a Cartesian DMP, converted
to a joint trajectory by inverse kinematics, and executed with a UR5.

[Script](https://github.com/dfki-ric/movement_primitives/blob/main/examples/vis_cartesian_dmp.py)
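
A Cartesian DMP state is a position plus a unit quaternion (7 values,
scalar-first quaternion convention). A minimal sketch with a synthetic
demonstration:

```python
import numpy as np
from movement_primitives.dmp import CartesianDMP

T = np.linspace(0.0, 1.0, 101)
P = np.column_stack((T, np.zeros_like(T), np.zeros_like(T)))  # straight line
Q = np.tile([1.0, 0.0, 0.0, 0.0], (len(T), 1))                # identity rotation
Y = np.hstack((P, Q))                                         # (n_steps, 7)

dmp = CartesianDMP(execution_time=1.0, dt=0.01, n_weights_per_dim=10)
dmp.imitate(T, Y)
T_new, Y_new = dmp.open_loop()
```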

### Contextual ProMPs

<img src="https://raw.githubusercontent.com/dfki-ric/movement_primitives/main/doc/source/_static/contextual_promps_kuka_panel_width_open3d.png" width="60%" /><img src="https://raw.githubusercontent.com/dfki-ric/movement_primitives/main/doc/source/_static/contextual_promps_kuka_panel_width_open3d2.png" width="40%" />

We use a dataset of [Mronga and Kirchner (2021)](https://doi.org/10.1016/j.robot.2021.103779),
in which a dual-arm robot rotates panels of varying widths. Ten demonstrations
were recorded for three different panel widths through kinesthetic teaching.
The panel width is the context over which we generalize with contextual ProMPs.
We learn a joint distribution of contexts and ProMP weights, and then
condition this distribution on a given context to obtain an adapted ProMP.
Each color in the above visualizations corresponds to a ProMP for a different
context.

[Script](https://github.com/dfki-ric/movement_primitives/blob/main/examples/external_dependencies/vis_contextual_promp_distribution.py)
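
The core operation here is conditioning a joint Gaussian over contexts and
ProMP weights on an observed context. A generic NumPy sketch of that step
(not the library's API, and independent of the non-public dataset):

```python
import numpy as np

def condition_on_context(mu, Sigma, n_c, c):
    """Condition N(mu, Sigma) over (context, weights) on context value c.

    The first n_c dimensions of mu / Sigma belong to the context,
    the remaining dimensions to the ProMP weights.
    """
    mu_c, mu_w = mu[:n_c], mu[n_c:]
    S_cc, S_wc = Sigma[:n_c, :n_c], Sigma[n_c:, :n_c]
    S_ww = Sigma[n_c:, n_c:]
    gain = S_wc @ np.linalg.inv(S_cc)     # Kalman-style gain
    mu_w_cond = mu_w + gain @ (c - mu_c)  # conditional mean
    S_ww_cond = S_ww - gain @ S_wc.T      # conditional covariance
    return mu_w_cond, S_ww_cond
```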

**Dependencies that are not publicly available:**

* Dataset: panel rotation dataset of
  [Mronga and Kirchner (2021)](https://doi.org/10.1016/j.robot.2021.103779)
* MoCap library
* URDF of dual arm Kuka system from
  [DFKI RIC's HRC lab](https://robotik.dfki-bremen.de/en/research/research-facilities-labs/hrc-lab):
  ```bash
  git clone git@git.hb.dfki.de:models-robots/kuka_lbr.git
  ```

### Dual Cartesian DMP

<img src="https://raw.githubusercontent.com/dfki-ric/movement_primitives/main/doc/source/_static/dual_cart_dmps_rh5_open3d.png" width="50%" /><img src="https://raw.githubusercontent.com/dfki-ric/movement_primitives/main/doc/source/_static/dual_cart_dmps_rh5_pybullet.png" width="50%" />

This library implements specific dual Cartesian DMPs to control dual-arm
robotic systems like humanoid robots.

Scripts: [Open3D](https://github.com/dfki-ric/movement_primitives/blob/main/examples/external_dependencies/vis_solar_panel.py), [PyBullet](https://github.com/dfki-ric/movement_primitives/blob/main/examples/external_dependencies/sim_solar_panel.py)
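
A dual Cartesian DMP stacks both end-effector poses into a 14D state: two
positions and two scalar-first quaternions. A minimal sketch with a synthetic
demonstration (class name per the API documentation):

```python
import numpy as np
from movement_primitives.dmp import DualCartesianDMP

T = np.linspace(0.0, 1.0, 101)
quat = np.tile([1.0, 0.0, 0.0, 0.0], (len(T), 1))  # fixed identity orientation
left = np.hstack((np.column_stack((T, np.zeros_like(T), np.zeros_like(T))), quat))
right = np.hstack((np.column_stack((T, np.ones_like(T), np.zeros_like(T))), quat))
Y = np.hstack((left, right))                       # (n_steps, 14)

dmp = DualCartesianDMP(execution_time=1.0, dt=0.01, n_weights_per_dim=10)
dmp.imitate(T, Y)
T_new, Y_new = dmp.open_loop()
```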

**Dependencies that are not publicly available:**

* MoCap library
* URDF of [DFKI RIC's RH5 robot](https://www.youtube.com/watch?v=jjGQNstmLvY):
  ```bash
  git clone git@git.hb.dfki.de:models-robots/rh5_models/pybullet-only-arms-urdf.git --recursive
  ```
* URDF of solar panel:
  ```bash
  git clone git@git.hb.dfki.de:models-objects/solar_panels.git
  ```

### Coupled Dual Cartesian DMP

<img src="https://raw.githubusercontent.com/dfki-ric/movement_primitives/main/doc/source/_static/coupled_dual_cart_dmps_gripper_open3d.png" width="60%" /><img src="https://raw.githubusercontent.com/dfki-ric/movement_primitives/main/doc/source/_static/coupled_dual_cart_dmps_rh5_pybullet.png" width="40%" />

We can introduce a coupling term in a dual Cartesian DMP to constrain the
relative position, orientation, or pose of two end-effectors of a dual-arm
robot.

Scripts: [Open3D](https://github.com/dfki-ric/movement_primitives/blob/main/examples/external_dependencies/vis_cartesian_dual_dmp.py), [PyBullet](https://github.com/dfki-ric/movement_primitives/blob/main/examples/external_dependencies/sim_cartesian_dual_dmp.py)
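
Coupling terms are passed at rollout time, continuing the dual Cartesian DMP
sketch above. `CouplingTermDualCartesianDistance` is listed in the API
documentation, but the constructor arguments below are assumptions; see the
example scripts for working settings:

```python
from movement_primitives.dmp import CouplingTermDualCartesianDistance

# keep the end-effectors at an (assumed) desired distance of 0.2 m;
# lf is an assumed leader/follower weighting of the two arms
ct = CouplingTermDualCartesianDistance(desired_distance=0.2, lf=(1.0, 0.0))
T_c, Y_c = dmp.open_loop(coupling_term=ct)
```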

**Dependencies that are not publicly available:**

* URDF of DFKI RIC's gripper:
  ```bash
  git clone git@git.hb.dfki.de:motto/abstract-urdf-gripper.git --recursive
  ```
* URDF of [DFKI RIC's RH5 robot](https://www.youtube.com/watch?v=jjGQNstmLvY):
  ```bash
  git clone git@git.hb.dfki.de:models-robots/rh5_models/pybullet-only-arms-urdf.git --recursive
  ```

### Propagation of DMP Distribution to State Space

<img src="https://raw.githubusercontent.com/dfki-ric/movement_primitives/main/doc/source/_static/dmp_state_space_distribution_kuka_peginhole_matplotlib.png" width="60%" /><img src="https://raw.githubusercontent.com/dfki-ric/movement_primitives/main/doc/source/_static/dmp_state_space_distribution_kuka_peginhole_open3d.png" width="40%" />

If we have a distribution over DMP parameters, we can propagate it to state
space through an unscented transform.
On the left we see the original demonstration of a dual-arm movement in state
space (two 3D positions and two quaternions) and the distribution of several
DMP weight vectors projected to the state space.
On the right side we see several dual-arm trajectories sampled from the
distribution in state space.

[Script](https://github.com/dfki-ric/movement_primitives/blob/main/examples/external_dependencies/vis_dmp_to_state_variance.py)
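
The unscented transform itself is generic: push deterministically chosen
sigma points of the weight distribution through the nonlinear mapping (here:
DMP weights to a rolled-out trajectory) and re-estimate mean and covariance
from the outputs. A plain NumPy sketch of the idea, not the library's
internal implementation:

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate N(mean, cov) through the nonlinear function f."""
    n = len(mean)
    lam = alpha ** 2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * cov)
    sigma = np.vstack([mean, mean + L.T, mean - L.T])  # 2n+1 sigma points
    wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))   # mean weights
    wc = wm.copy()                                     # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + 1.0 - alpha ** 2 + beta
    Y = np.array([f(s) for s in sigma])                # e.g., DMP rollouts
    mean_y = wm @ Y
    diff = Y - mean_y
    return mean_y, (wc[:, None] * diff).T @ diff
```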

**Dependencies that are not publicly available:**

* Dataset: panel rotation dataset of
  [Mronga and Kirchner (2021)](https://doi.org/10.1016/j.robot.2021.103779)
* MoCap library
* URDF of dual arm
  [Kuka system](https://robotik.dfki-bremen.de/en/research/robot-systems/imrk/)
  from
  [DFKI RIC's HRC lab](https://robotik.dfki-bremen.de/en/research/research-facilities-labs/hrc-lab):
  ```bash
  git clone git@git.hb.dfki.de:models-robots/kuka_lbr.git
  ```

## Build API Documentation

You can build the API documentation with Sphinx.
You can install all required dependencies with

```bash
python -m pip install movement_primitives[doc]
```

... and build the documentation from the folder `doc/` with

```bash
make html
```

It will be located at `doc/build/html/index.html`.

## Test

To run the tests, some Python libraries are required:

```bash
python -m pip install -e .[test]
```

The tests are located in the folder `test/` and can be executed with
`python -m pytest`.

This command discovers all files matching `test_*.py` and executes the
functions whose names start with `test_`. You will find a test coverage report
at `htmlcov/index.html`.

## Contributing

You can report bugs in the [issue tracker](https://github.com/dfki-ric/movement_primitives/issues).
If you have questions about the software, please use the [discussions
section](https://github.com/dfki-ric/movement_primitives/discussions).
To add new features or documentation, or to fix bugs, you can open a pull request
on [GitHub](https://github.com/dfki-ric/movement_primitives). Directly pushing
to the main branch is not allowed.

The recommended workflow to add a new feature, add documentation, or fix a bug
is the following:

* Push your changes to a branch (e.g., feature/x, doc/y, or fix/z) of your fork
  of the repository.
* Open a pull request to the main branch of the main repository.

This is a checklist for new features:

- Are there unit tests?
- Does it have docstrings?
- Is it included in the API documentation?
- Does it pass flake8 and pylint?
- Should it be part of the readme?
- Should it be included in any example script?

## Non-public Extensions

Scripts from the subfolder `examples/external_dependencies/` require access to
git repositories (URDF files or optional dependencies) and datasets that are
not publicly available. They are available on request (email
alexander.fabisch@dfki.de).

Note that the library itself does not have any non-public dependencies! They
are only required to run the examples in `examples/external_dependencies/`.

### MoCap Library

```bash
# untested: pip install git+https://git.hb.dfki.de/dfki-interaction/mocap.git
git clone git@git.hb.dfki.de:dfki-interaction/mocap.git
cd mocap
python -m pip install -e .
cd ..
```

### Get URDFs

```bash
# RH5
git clone git@git.hb.dfki.de:models-robots/rh5_models/pybullet-only-arms-urdf.git --recursive
# RH5v2
git clone git@git.hb.dfki.de:models-robots/rh5v2_models/pybullet-urdf.git --recursive
# Kuka
git clone git@git.hb.dfki.de:models-robots/kuka_lbr.git
# Solar panel
git clone git@git.hb.dfki.de:models-objects/solar_panels.git
# RH5 Gripper
git clone git@git.hb.dfki.de:motto/abstract-urdf-gripper.git --recursive
```

### Data

Most scripts assume that your data is located in the folder `data/`. You can
create a symlink there (e.g., `ln -s /path/to/your/data data`) that points to
your actual data folder.

## Related Publications

This library implements several types of dynamical movement primitives and
probabilistic movement primitives. These are described in detail in the
following papers.

[1] Ijspeert, A. J., Nakanishi, J., Hoffmann, H., Pastor, P., Schaal, S. (2013).
    Dynamical Movement Primitives: Learning Attractor Models for Motor
    Behaviors, Neural Computation 25 (2), 328-373. DOI: 10.1162/NECO_a_00393,
    https://homes.cs.washington.edu/~todorov/courses/amath579/reading/DynamicPrimitives.pdf

[2] Pastor, P., Hoffmann, H., Asfour, T., Schaal, S. (2009).
    Learning and Generalization of Motor Skills by Learning from Demonstration.
    In 2009 IEEE International Conference on Robotics and Automation,
    (pp. 763-768). DOI: 10.1109/ROBOT.2009.5152385,
    https://h2t.iar.kit.edu/pdf/Pastor2009.pdf

[3] Muelling, K., Kober, J., Kroemer, O., Peters, J. (2013).
    Learning to Select and Generalize Striking Movements in Robot Table Tennis.
    International Journal of Robotics Research 32 (3), 263-279.
    https://www.ias.informatik.tu-darmstadt.de/uploads/Publications/Muelling_IJRR_2013.pdf

[4] Ude, A., Nemec, B., Petric, T., Morimoto, J. (2014).
    Orientation in Cartesian space dynamic movement primitives.
    In IEEE International Conference on Robotics and Automation (ICRA)
    (pp. 2997-3004). DOI: 10.1109/ICRA.2014.6907291,
    https://acat-project.eu/modules/BibtexModule/uploads/PDF/udenemecpetric2014.pdf

[5] Gams, A., Nemec, B., Zlajpah, L., Wächter, M., Asfour, T., Ude, A. (2013).
    Modulation of Motor Primitives using Force Feedback: Interaction with
    the Environment and Bimanual Tasks. In 2013 IEEE/RSJ International
    Conference on Intelligent Robots and Systems (pp. 5629-5635). DOI:
    10.1109/IROS.2013.6697172,
    https://h2t.anthropomatik.kit.edu/pdf/Gams2013.pdf

[6] Vidakovic, J., Jerbic, B., Sekoranja, B., Svaco, M., Suligoj, F. (2019).
    Task Dependent Trajectory Learning from Multiple Demonstrations Using
    Movement Primitives.
    In International Conference on Robotics in Alpe-Adria Danube Region (RAAD)
    (pp. 275-282). DOI: 10.1007/978-3-030-19648-6_32,
    https://link.springer.com/chapter/10.1007/978-3-030-19648-6_32

[7] Paraschos, A., Daniel, C., Peters, J., Neumann, G. (2013).
    Probabilistic movement primitives, In C.J. Burges and L. Bottou and M.
    Welling and Z. Ghahramani and K.Q. Weinberger (Eds.), Advances in Neural
    Information Processing Systems, 26,
    https://papers.nips.cc/paper/2013/file/e53a0a2978c28872a4505bdb51db06dc-Paper.pdf

[8] Maeda, G. J., Neumann, G., Ewerton, M., Lioutikov, R., Kroemer, O.,
    Peters, J. (2017). Probabilistic movement primitives for coordination of
    multiple human–robot collaborative tasks. Autonomous Robots, 41, 593-612.
    DOI: 10.1007/s10514-016-9556-2,
    https://link.springer.com/article/10.1007/s10514-016-9556-2

[9] Paraschos, A., Daniel, C., Peters, J., Neumann, G. (2018).
    Using probabilistic movement primitives in robotics. Autonomous Robots, 42,
    529-551. DOI: 10.1007/s10514-017-9648-7,
    https://www.ias.informatik.tu-darmstadt.de/uploads/Team/AlexandrosParaschos/promps_auro.pdf

[10] Lazaric, A., Ghavamzadeh, M. (2010).
     Bayesian Multi-Task Reinforcement Learning. In Proceedings of the 27th
     International Conference on Machine Learning (ICML'10) (pp. 599-606).
     https://hal.inria.fr/inria-00475214/document

## Citation

If you use `movement_primitives` for a scientific publication, I would
appreciate a citation of the following paper:

Fabisch, A., (2024). movement_primitives: Imitation Learning of Cartesian
Motion with Movement Primitives. Journal of Open Source Software, 9(97), 6695,
[![Paper DOI](https://joss.theoj.org/papers/10.21105/joss.06695/status.svg)](https://doi.org/10.21105/joss.06695)

BibTeX entry:

```bibtex
@article{Fabisch2024,
  doi = {10.21105/joss.06695},
  url = {https://doi.org/10.21105/joss.06695},
  year = {2024},
  publisher = {The Open Journal},
  volume = {9},
  number = {97},
  pages = {6695},
  author = {Alexander Fabisch},
  title = {movement_primitives: Imitation Learning of Cartesian Motion with Movement Primitives},
  journal = {Journal of Open Source Software}
}
```

## Funding

This library was initially developed at the
[Robotics Innovation Center](https://robotik.dfki-bremen.de/en/startpage.html)
of the German Research Center for Artificial Intelligence (DFKI GmbH) in
Bremen. During this phase, the work was supported by a grant from the German
Federal Ministry of Economic Affairs and Energy (BMWi, FKZ 50 RA 1701).

<img src="https://raw.githubusercontent.com/dfki-ric/movement_primitives/main/doc/source/_static/DFKI_Logo.jpg" height="150px" /><img src="https://raw.githubusercontent.com/dfki-ric/movement_primitives/main/doc/source/_static/241-logo-bmwi-jpg.jpg" height="150px" />

```{toctree}
:hidden:
api
```
