ivy-robot
=========

:Version: 1.1.9
:Summary: Functions and classes for gradient-based robot motion planning, written in Ivy.
:Home page: https://ivy-dl.org/robot
:Author: Ivy Team
:License: Apache 2.0
:Uploaded: 2021-12-01 16:23:51

.. image:: https://github.com/ivy-dl/robot/blob/master/docs/partial_source/logos/logo.png?raw=true
   :width: 100%



**Functions and classes for gradient-based robot motion planning, written in Ivy.**



.. image:: https://github.com/ivy-dl/ivy-dl.github.io/blob/master/img/externally_linked/logos/supported/frameworks.png?raw=true
   :width: 100%

Contents
--------

* `Overview`_
* `Run Through`_
* `Interactive Demos`_
* `Get Involved`_

Overview
--------

.. _docs: https://ivy-dl.org/robot

**What is Ivy Robot?**

Ivy Robot provides functions and classes for gradient-based motion planning and trajectory optimization.
Classes are provided both for mobile robots and robot manipulators.  Check out the docs_ for more info!

The library is built on top of the Ivy machine learning framework.
This means all functions and classes simultaneously support:
JAX, TensorFlow, PyTorch, MXNet, and NumPy.

**Ivy Libraries**

There is a host of derived libraries written in Ivy, in the areas of mechanics, 3D vision, robotics, gym environments,
neural memory, pre-trained models + implementations, and builder tools with trainers, data loaders and more. Click on
the icons below to learn more!



.. image:: https://github.com/ivy-dl/ivy-dl.github.io/blob/master/img/externally_linked/ivy_libraries.png?raw=true
   :width: 100%













**Quick Start**

Ivy Robot can be installed like so: ``pip install ivy-robot``

.. _demos: https://github.com/ivy-dl/robot/tree/master/ivy_robot_demos
.. _interactive: https://github.com/ivy-dl/robot/tree/master/ivy_robot_demos/interactive

To quickly see the different aspects of the library, we suggest you check out the demos_!
We recommend starting with the script ``run_through.py``,
and reading the "Run Through" section below, which explains this script.

For more interactive demos, we suggest you run either
``drone_spline_planning.py`` or ``manipulator_spline_planning.py`` in the interactive_ demos folder.

Run Through
-----------

We run through some of the different parts of the library via a simple ongoing example script.
The full script is available in the demos_ folder, as file ``run_through.py``.
First, we select a random backend framework to use for the examples, from the options
``ivy.jax``, ``ivy.tensorflow``, ``ivy.torch``, ``ivy.mxnet`` or ``ivy.numpy``,
and use this to set the ivy backend framework.

.. code-block:: python

    # the later examples also use numpy, ivy_mech, and the ivy_robot classes
    import ivy
    import ivy_mech
    import ivy_robot
    import numpy as np
    from ivy_robot import Manipulator, RigidMobile
    from ivy_demo_utils.framework_utils import choose_random_framework
    ivy.set_framework(choose_random_framework())

**Spline Planning**

We now show how a spline path can be generated from a set of spline anchor points,
using the function ``ivy_robot.sample_spline_path``.
In this example, we generate a spline path representing full 6DOF motion from a starting pose to a target pose.
However, for simplicity, we fix the z translation and the 3DOF rotation to zeros in this case.
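Before the library call below, it may help to pin down the shape contract: ``sample_spline_path`` maps anchor parameter values (num_anchors x 1) and anchor poses (num_anchors x d) to interpolated poses at the query parameter values (num_samples x d). A minimal numpy stand-in, using per-dimension linear interpolation rather than the library's actual spline (``sample_path_linear`` is purely illustrative, not part of ivy_robot):

```python
import numpy as np

def sample_path_linear(anchor_points, anchor_poses, query_points):
    # anchor_points: (num_anchors, 1), anchor_poses: (num_anchors, d),
    # query_points: (num_samples, 1) -> interpolated (num_samples, d)
    return np.stack(
        [np.interp(query_points[:, 0], anchor_points[:, 0], anchor_poses[:, i])
         for i in range(anchor_poses.shape[1])], axis=-1)

# five anchors in 2D, sampled at 100 points along the path
anchor_points = np.linspace(0., 1., 5)[:, None]
anchor_poses = np.array([[0., 0.], [0.6, 0.2], [0.5, 0.5], [0.4, 0.8], [1., 1.]])
query_points = np.linspace(0., 1., 100)[:, None]
path = sample_path_linear(anchor_points, anchor_poses, query_points)
```

Unlike this stand-in, the library's spline produces a smooth, differentiable path through the anchors, which is what makes the gradient-based optimization described later on this page possible.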

.. code-block:: python

    # config
    num_free_anchors = 3
    num_samples = 100
    constant_rot_vec = ivy.array([[0., 0., 0.]])
    constant_z = ivy.array([[0.]])

    # xy positions

    # 1 x 2
    start_xy = ivy.array([[0., 0.]])
    target_xy = ivy.array([[1., 1.]])

    # 1 x 2
    anchor1_xy = ivy.array([[0.6, 0.2]])
    anchor2_xy = ivy.array([[0.5, 0.5]])
    anchor3_xy = ivy.array([[0.4, 0.8]])

    # as 6DOF poses

    # 1 x 6
    start_pose = ivy.concatenate((start_xy, constant_z, constant_rot_vec), -1)
    anchor1_pose = ivy.concatenate((anchor1_xy, constant_z, constant_rot_vec), -1)
    anchor2_pose = ivy.concatenate((anchor2_xy, constant_z, constant_rot_vec), -1)
    anchor3_pose = ivy.concatenate((anchor3_xy, constant_z, constant_rot_vec), -1)
    target_pose = ivy.concatenate((target_xy, constant_z, constant_rot_vec), -1)

    num_anchors = num_free_anchors + 2

    # num_anchors x 6
    anchor_poses = ivy.concatenate((start_pose, anchor1_pose, anchor2_pose, anchor3_pose, target_pose), 0)

    # uniform sampling for spline

    # num_anchors x 1
    anchor_points = ivy.expand_dims(ivy.linspace(0., 1., num_anchors), -1)

    # num_samples x 1
    query_points = ivy.expand_dims(ivy.linspace(0., 1., num_samples), -1)

    # interpolated spline poses

    # num_samples x 6
    interpolated_poses = ivy_robot.sample_spline_path(anchor_points, anchor_poses, query_points)

    # xy motion

    # num_anchors x 2
    anchor_xy_positions = anchor_poses[..., 0:2]

    # num_samples x 2
    interpolated_xy_positions = interpolated_poses[..., 0:2]

The interpolated xy positions and anchor positions from the path are shown below in the x-y plane.

.. image:: https://github.com/ivy-dl/robot/blob/master/docs/partial_source/images/interpolated_drone_poses.png?raw=true
   :width: 100%

**Rigid Mobile Class**

We now introduce the ``RigidMobile`` robot class,
which can be used to represent rigid jointless robots that are able to move freely.
Here, we consider a drone executing 6DOF motion in a scene.

The body of the drone is specified by a number of relative body points.
In this case, we represent the drone with 5 points: one in the centre, and one in each of the four corners.

We assume the same target position in the x-y plane as before,
but this time with a self-rotation of 180 degrees about the z-axis.
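The rotation vectors here are axis-angle representations; ``ivy_mech.rot_vec_pose_to_mat_pose`` (used further below) converts such poses to matrix form. As a hedged aside, the rotation part of that conversion is Rodrigues' formula, sketched here in plain numpy (``rot_vec_to_mat`` is an illustrative helper, not a library function):

```python
import numpy as np

def rot_vec_to_mat(rot_vec):
    # Rodrigues' formula: rotation vector (axis * angle) -> 3x3 rotation matrix
    angle = np.linalg.norm(rot_vec)
    if angle < 1e-12:
        return np.eye(3)
    axis = rot_vec / angle
    K = np.array([[0., -axis[2], axis[1]],
                  [axis[2], 0., -axis[0]],
                  [-axis[1], axis[0], 0.]])
    return np.eye(3) + np.sin(angle) * K + (1. - np.cos(angle)) * (K @ K)

# a 180 degree self-rotation about z flips the x and y axes:
R = rot_vec_to_mat(np.array([0., 0., np.pi]))  # ~ diag(-1, -1, 1)
```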

.. code-block:: python

    # drone relative body points
    rel_body_points = ivy.array([[0., 0., 0.],
                                 [-0.1, -0.1, 0.],
                                 [-0.1, 0.1, 0.],
                                 [0.1, -0.1, 0.],
                                 [0.1, 0.1, 0.]])

    # create drone as ivy rigid mobile robot
    drone = RigidMobile(rel_body_points)

    # rotation vectors

    # 1 x 3
    start_rot_vec = ivy.array([[0., 0., 0.]])
    target_rot_vec = ivy.array([[0., 0., np.pi]])

    # 1 x 3
    anchor1_rot_vec = ivy.array([[0., 0., np.pi/4]])
    anchor2_rot_vec = ivy.array([[0., 0., 2*np.pi/4]])
    anchor3_rot_vec = ivy.array([[0., 0., 3*np.pi/4]])

    # as 6DOF poses

    # 1 x 6
    start_pose = ivy.concatenate((start_xy, constant_z, start_rot_vec), -1)
    anchor1_pose = ivy.concatenate((anchor1_xy, constant_z, anchor1_rot_vec), -1)
    anchor2_pose = ivy.concatenate((anchor2_xy, constant_z, anchor2_rot_vec), -1)
    anchor3_pose = ivy.concatenate((anchor3_xy, constant_z, anchor3_rot_vec), -1)
    target_pose = ivy.concatenate((target_xy, constant_z, target_rot_vec), -1)

    # num_anchors x 6
    anchor_poses = ivy.concatenate((start_pose, anchor1_pose, anchor2_pose, anchor3_pose, target_pose), 0)

    # interpolated spline poses

    # num_samples x 6
    interpolated_poses = ivy_robot.sample_spline_path(anchor_points, anchor_poses, query_points)

    # as matrices

    # num_anchors x 3 x 4
    anchor_matrices = ivy_mech.rot_vec_pose_to_mat_pose(anchor_poses)

    # num_samples x 3 x 4
    interpolated_matrices = ivy_mech.rot_vec_pose_to_mat_pose(interpolated_poses)

    # sample drone body

    # num_anchors x num_body_points x 3
    anchor_body_points = drone.sample_body(anchor_matrices)

    # num_samples x num_body_points x 3
    interpolated_body_points = drone.sample_body(interpolated_matrices)

The sampled drone body xy positions during motion are shown below in the x-y plane.
By tracing the body points for each of the four corners of the drone,
we can see how the drone performs the 180 degree self-rotation about the z-axis during the motion.

.. image:: https://github.com/ivy-dl/robot/blob/master/docs/partial_source/images/sampled_drone_body_positions.png?raw=true
   :width: 100%

**Manipulator Class**

We now introduce the ``Manipulator`` robot class,
which can be used to represent arbitrary robot manipulators.
Here, we consider a very simple 2-link manipulator,
which is constrained to move in the x-y plane.

The manipulator is specified by its Denavit–Hartenberg parameters,
as outlined in the newly derived class below.
We assume a manipulator with two 0.5m links,
where a configuration with both joint angles at 0 degrees represents an upright link configuration.
We specify a new set of target joint angles corresponding to
a forward-reaching motion in the positive x direction.
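As background, the textbook Denavit–Hartenberg link transform can be sketched in plain numpy. Note this is the standard convention; ivy_robot's internal handling of ``dh_joint_scales`` and ``dh_joint_offsets`` may differ, and ``dh_matrix`` and ``fk_2link`` here are illustrative helpers, not library functions:

```python
import numpy as np

def dh_matrix(a, d, alpha, theta):
    # standard Denavit-Hartenberg link transform:
    # Rot_z(theta) @ Trans_z(d) @ Trans_x(a) @ Rot_x(alpha)
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.,       sa,       ca,      d],
                     [0.,       0.,       0.,     1.]])

def fk_2link(a_s, q_s):
    # end-effector position of a planar chain with link lengths a_s
    # and joint angles q_s (d = alpha = 0 throughout)
    T = np.eye(4)
    for a, q in zip(a_s, q_s):
        T = T @ dh_matrix(a, 0., 0., q)
    return T[:3, 3]
```

For example, with two 0.5m links and both joint angles at zero, the chain extends 1m along the base x-axis in this convention.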

.. code-block:: python

    class SimpleManipulator(Manipulator):

        def __init__(self, base_inv_ext_mat=None):
            a_s = ivy.array([0.5, 0.5])
            d_s = ivy.array([0., 0.])
            alpha_s = ivy.array([0., 0.])
            dh_joint_scales = ivy.ones((2,))
            dh_joint_offsets = ivy.array([-np.pi/2, 0.])
            super().__init__(a_s, d_s, alpha_s, dh_joint_scales, dh_joint_offsets, base_inv_ext_mat)

    # create manipulator as ivy manipulator
    manipulator = SimpleManipulator()

    # joint angles

    # 1 x 2
    start_joint_angles = ivy.array([[0., 0.]])
    target_joint_angles = ivy.array([[-np.pi/4, -np.pi/4]])

    # 1 x 2
    anchor1_joint_angles = -ivy.array([[0.2, 0.6]])*np.pi/4
    anchor2_joint_angles = -ivy.array([[0.5, 0.5]])*np.pi/4
    anchor3_joint_angles = -ivy.array([[0.8, 0.4]])*np.pi/4

    # num_anchors x 2
    anchor_joint_angles = ivy.concatenate(
        (start_joint_angles, anchor1_joint_angles, anchor2_joint_angles, anchor3_joint_angles,
         target_joint_angles), 0)

    # interpolated joint angles

    # num_samples x 2
    interpolated_joint_angles = ivy_robot.sample_spline_path(anchor_points, anchor_joint_angles, query_points)

The interpolated joint angles are presented below.

.. image:: https://github.com/ivy-dl/robot/blob/master/docs/partial_source/images/interpolated_manipulator_joint_angles.png?raw=true
   :width: 100%

In a similar fashion to how the drone body was sampled in the previous example,
we next use these interpolated joint angles to sample the link positions for the manipulator.

.. code-block:: python

    # sample links

    # num_anchors x num_link_points x 3
    anchor_link_points = manipulator.sample_links(anchor_joint_angles, samples_per_metre=5)

    # num_samples x num_link_points x 3
    interpolated_link_points = manipulator.sample_links(interpolated_joint_angles, samples_per_metre=5)

We show the sampled link positions during the forward-reaching motion in the x-y plane below.

.. image:: https://github.com/ivy-dl/robot/blob/master/docs/partial_source/images/sampled_manipulator_links.png?raw=true
   :width: 100%

Interactive Demos
-----------------

The main benefit of the library is not simply the ability to sample paths, but to optimize these paths using gradients.
For example, the body or link sample positions can be used to query the signed distance function (SDF) of a 3D scene in batch.
Then, assuming the spline anchor points to be free variables,
the gradients of the mean sampled SDF and a path-length metric can be computed with respect to the anchor points.
The anchor points can then be incrementally updated using gradient descent on this loss function.
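As a toy sketch of this optimization loop, with a spherical-obstacle penetration penalty standing in for the mean scene SDF term, and finite-difference gradients standing in for backpropagation (all names and numbers here are illustrative):

```python
import numpy as np

# fixed path endpoints, one spherical obstacle
start, target = np.array([0., 0.]), np.array([1., 1.])
obstacle, radius = np.array([0.5, 0.5]), 0.2

def loss(free_anchors):
    # path length plus a penalty for points penetrating the obstacle
    pts = np.vstack([start, free_anchors.reshape(-1, 2), target])
    length = np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))
    sdf = np.linalg.norm(pts - obstacle, axis=1) - radius  # sphere SDF
    return length + 10. * np.sum(np.maximum(0., -sdf))

x = np.array([0.45, 0.45, 0.55, 0.55])  # two free anchors, inside the obstacle
initial_loss = loss(x)
for _ in range(300):
    # finite-difference gradient; the real demos backpropagate instead
    grad = np.array([(loss(x + 1e-5 * e) - loss(x - 1e-5 * e)) / 2e-5
                     for e in np.eye(x.size)])
    x -= 0.005 * grad
```

In the actual demos, the SDF comes from the 3D scene and the gradients flow through the spline sampling and body/link sampling functions themselves.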

We provide two further demo scripts which outline this gradient-based path optimization in a 3D scene.
Rather than presenting the code here, we show visualizations of the demos.
The scripts for these demos can be found in the interactive_ demos folder.

**RigidMobile Planning**

The first demo uses the ``RigidMobile`` class to optimize the motion of a drone to a target pose,
making use of functions ``ivy_robot.sample_spline_path`` and ``ivy_robot.RigidMobile.sample_body``.



.. image:: https://github.com/ivy-dl/ivy-dl.github.io/blob/master/img/externally_linked/ivy_robot/demo_a.png?raw=true
   :width: 100%

**Manipulator Planning**

The second demo uses the ``MicoManipulator`` class, derived from ``Manipulator``,
to optimize the motion of a Mico robot manipulator to a set of target joint angles,
making use of the functions ``ivy_robot.sample_spline_path`` and ``ivy_robot.Manipulator.sample_links``.



.. image:: https://github.com/ivy-dl/ivy-dl.github.io/blob/master/img/externally_linked/ivy_robot/demo_b.png?raw=true
   :width: 100%

Get Involved
------------

We hope the functions in this library are useful to a wide range of machine learning developers.
However, there are many more areas of gradient-based motion planning and broader robotics
which could be covered by this library.

If there are any particular robotics functions you feel are missing,
and your needs are not met by the functions currently on offer,
then we are very happy to accept pull requests!

We look forward to working with the community on expanding and improving the Ivy robot library.

Citation
--------

::

    @article{lenton2021ivy,
      title={Ivy: Unified Machine Learning for Inter-Framework Portability},
      author={Lenton, Daniel and Pardo, Fabio and Falck, Fabian and James, Stephen and Clark, Ronald},
      journal={arXiv preprint arXiv:2102.02886},
      year={2021}
    }


            
