Vision-Based Trajectory Control for Humanoid Navigation

Consider the problem of robustly tracking a desired workspace trajectory with a humanoid robot. In [1], we propose a solution based on the definition of a suitable controlled output, which represents an averaged motion of the torso after cancellation of the sway oscillation. The torso motion is reconstructed using the vision-based odometric localization method previously presented in [2] and described here. For control design purposes, a unicycle-like model is associated with the evolution of the output signal. The following block scheme summarizes the developed control paradigm.
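To fix ideas, the unicycle-like model mentioned above has state (x, y, θ) driven by a driving velocity v and a steering velocity ω, and can be integrated numerically. The sketch below is purely illustrative and is not the controller of [1]:

```python
import math

def unicycle_step(x, y, theta, v, omega, dt):
    """One Euler integration step of the unicycle model:
    v is the driving velocity, omega the steering velocity."""
    return (x + v * math.cos(theta) * dt,
            y + v * math.sin(theta) * dt,
            theta + omega * dt)

# Drive straight for 1 s at 1 m/s, sampled at 100 Hz.
state = (0.0, 0.0, 0.0)
for _ in range(100):
    state = unicycle_step(*state, v=1.0, omega=0.0, dt=0.01)
```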

Sway motion cancellation

Two different techniques have been used to cancel the sway motion, i.e., the natural transversal oscillation of the torso during locomotion. The first proceeds from the observation that swaying is a relatively high-frequency phenomenon, so it can be removed by a suitable low-pass filter (with a cut-off frequency of 0.8 Hz). The second technique uses a geometric projection, based on a kinematic computation, to cancel the lateral movement of the torso during locomotion. The plots below show the results of the two techniques on the same torso motion (left: low-pass filtering; right: geometric projection). Both are effective in isolating an averaged torso motion.
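As a sketch of the first technique, a discrete low-pass filter with the stated 0.8 Hz cut-off can be implemented as a first-order exponential smoother; the filter order actually used in [1] may differ:

```python
import math

def low_pass(signal, dt=0.01, cutoff_hz=0.8):
    """First-order low-pass filter (exponential smoother)
    applied sample by sample to remove high-frequency sway."""
    tau = 1.0 / (2.0 * math.pi * cutoff_hz)  # filter time constant
    alpha = dt / (tau + dt)                  # smoothing coefficient
    out, y = [], signal[0]
    for s in signal:
        y += alpha * (s - y)
        out.append(y)
    return out

# Synthetic lateral torso motion: constant offset plus 2.5 Hz sway.
t = [k * 0.01 for k in range(300)]
raw = [0.5 + 0.05 * math.sin(2 * math.pi * 2.5 * tk) for tk in t]
filtered = low_pass(raw)
```

The filtered signal retains the slow, averaged component while the oscillation at 2.5 Hz is strongly attenuated.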

Experiments

To validate the proposed trajectory control scheme, we performed experiments on the humanoid robot NAO (version 4.0) by Aldebaran Robotics. In our implementation, the controller updates the robot's driving and steering velocity inputs at 100 Hz. These commands are then sent to the robot using the NAO APIs, in particular the built-in move function. Since the most recent command overrides all previous ones, this function can be called at an arbitrary rate, providing a convenient mechanism for the real-time implementation of a high-level control loop.
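The loop structure can be sketched as follows. Here MoveInterface is a hypothetical stand-in for the NAO move call that reproduces its latest-command-overrides semantics, and the proportional heading law is purely illustrative, not the controller of [1]:

```python
import math

class MoveInterface:
    """Stand-in for NAO's built-in move function: each call
    simply overrides the previously commanded velocities."""
    def __init__(self):
        self.v = 0.0      # driving velocity (m/s)
        self.omega = 0.0  # steering velocity (rad/s)

    def move(self, v, omega):
        self.v, self.omega = v, omega

def run_loop(robot, theta_ref, k_omega=2.0, dt=0.01, steps=300):
    """100 Hz high-level loop: steer the heading theta toward
    theta_ref with a proportional law at constant drive speed."""
    theta = 0.0
    for _ in range(steps):
        omega = k_omega * (theta_ref - theta)  # P control on heading
        robot.move(0.1, omega)                 # latest command wins
        theta += robot.omega * dt              # simulated robot response
    return theta
```

Because each move call overrides the previous one, the loop never needs to wait for a command to complete.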

Desired trajectory: line

In the first tracking experiment, the desired trajectory is a line.

These are the results obtained using low-pass filtering for sway motion cancellation.

In particular, the left plot shows the desired trajectory vs. the actual trajectory of the torso, as estimated by our odometric localization algorithm, whereas the right plot shows the controlled variable vs. the reference signal. The root mean square (RMS) of the Cartesian error is 0.0330 m in this case.
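For reference, the RMS of the Cartesian error can be computed from sampled desired and actual trajectories as follows (a sketch; the exact error definition used in the experiments is given in [1]):

```python
import math

def rms_cartesian_error(desired, actual):
    """RMS of the point-wise Euclidean distance between two
    planar trajectories given as lists of (x, y) samples."""
    assert len(desired) == len(actual) and desired
    sq = [(xd - xa) ** 2 + (yd - ya) ** 2
          for (xd, yd), (xa, ya) in zip(desired, actual)]
    return math.sqrt(sum(sq) / len(sq))
```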

For comparison, here are the corresponding results obtained using geometric projection for sway motion cancellation. The RMS error is larger in this case (0.0808 m).

Desired trajectory: sigmoid

In the second experiment, the desired trajectory is sigmoidal.
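A sigmoidal reference of this kind can be generated, for illustration, with a logistic profile; the shape parameters below are hypothetical and not those of the actual experiment:

```python
import math

def sigmoid_reference(s, length=2.0, lateral=0.5, steepness=6.0):
    """Point on a sigmoidal reference path for path parameter
    s in [0, 1]: x advances linearly, y follows a logistic curve."""
    x = length * s
    y = lateral / (1.0 + math.exp(-steepness * (s - 0.5)))
    return x, y
```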

As shown below, results are satisfactory for both sway cancellation methods, again with a slight advantage for low-pass filtering (first row, RMS error 0.0186 m), which achieves a slightly smoother motion than geometric projection (second row, RMS error 0.0191 m).

Video clip

The following clip illustrates the experiments.

Documents

[1] G. Oriolo, A. Paolillo, L. Rosa and M. Vendittelli, **Vision-Based Trajectory Control for Humanoid Navigation**. 2013 IEEE-RAS Int. Conf. on Humanoid Robots, Atlanta, GA, Oct 2013 (pdf).

[2] G. Oriolo, A. Paolillo, L. Rosa and M. Vendittelli, **Vision-Based Odometric Localization for Humanoids using a Kinematic EKF**. 2012 IEEE-RAS Int. Conf. on Humanoid Robots, Osaka, Japan, Nov-Dec 2012 (pdf).
