The extrinsic calibration of a sensor is the process of computing its pose with respect to a given reference frame (in this case, the robot reference frame). This is important because it makes it possible to accurately express sensor measurements in different reference frames. The procedure for performing this process with our software is described below.

Calibration log (ROS bag) acquisition

In order to perform the extrinsic sensor calibration, assuming n RGB-D cameras are mounted on the robot, at least 2 logs are required. In particular, choosing one of the cameras as the reference, we need:

  1. at least 1 log containing the stream of all n cameras, for computing the pose of the n − 1 RGB-D sensors with respect to the reference one (all the cameras should see, at least once, the same part of the environment, while always satisfying the condition stated in the previous point);
  2. 1 log containing the robot odometry (or laser data) and the camera stream, for computing the pose between the robot and the reference RGB-D sensor (the robot should slowly translate and rotate while the reference sensor sees at least 3 planes, each of them non-parallel to all the others). Example recording commands are sketched below.
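
As an illustration, the two logs can be acquired with rosbag record. The commands below are only a sketch: they assume two cameras publishing on /camera_1/depth/image_raw and /camera_2/depth/image_raw and the robot odometry available through /tf, so adapt the topic names (and add any camera_info or other topics your setup needs) accordingly.

# log 1: the stream of all the cameras
rosbag record -O calibration.bag /camera_1/depth/image_raw /camera_2/depth/image_raw
# log 2: the reference camera stream plus the robot odometry (here assumed to be published via /tf)
rosbag record -O calib_sensors_base1.bag /camera_1/depth/image_raw /tf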

Dump creation

Creating dumps makes it possible to perform the calibration offline, without worrying about dropped messages. In this way, our software can process the data in an offline fashion and load the next message only after the current one has been processed (i.e., without skipping any messages, unlike ROS nodes, which usually drop data when their queue is full). A dump consists of a text file (e.g., ./dump_calib_front_left1.txt) and a folder (e.g., ./dump_calib_front_left1.txt/*) containing the dumped data, and it can be generated by running the following commands (one per terminal):

roscore
rosrun fps_mapper fps_message_dumper_node -t /camera_1/depth/image_raw ... -t /camera_n/depth/image_raw -o dumpname.txt
rosbag play ./calibration.bag
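
For example, for a hypothetical setup with only two cameras, camera_1 and camera_2, the dumper would be invoked as follows (note that each topic needs its own -t flag):

rosrun fps_mapper fps_message_dumper_node -t /camera_1/depth/image_raw -t /camera_2/depth/image_raw -o dumpname.txt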

Since the calibration procedure also requires a dump for each individual sensor, a separate text file can easily be obtained for each of them from the full dump file. In particular, the grep command-line tool allows us to do this:

grep /camera_1/depth ./dumpname.txt > dump_sensor_1.txt

… (for each sensor)

grep /camera_n/depth ./dumpname.txt > dump_sensor_n.txt

No additional effort is required.
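
If the sensors follow a common naming scheme, the same per-sensor files can also be produced with a small shell loop. The sketch below assumes 3 cameras named camera_1 … camera_3; adapt the range and topic names to your setup.

# extract one dump text file per sensor from the full dump
for i in 1 2 3; do
  grep "/camera_${i}/depth" ./dumpname.txt > "dump_sensor_${i}.txt"
done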

Calibration among sensors

The extrinsic calibration is performed by choosing a reference sensor and determining the pose of all the others with respect to it. Once this is done, the offset of the reference sensor with respect to the robot base can be computed in order to obtain a full transformation tree.

[Figure: transformation tree relating the robot base and the sensor frames]

To do this, we first have to create a “local map” (a point cloud) for each sensor i, with i = 1 … n (including the reference sensor):

roscore
rosrun fps_mapper fps_tracker_gui_app -cam_only -t /camera_i/depth/image_raw -bpr 1 -o output_i dump_sensor_i.txt

While creating the local map, you can save it by pressing the W key. The point cloud will be saved to a file named “output_i-n.cloud”, where n is the number of clouds saved so far.
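
For instance, for the first sensor (hypothetical topic name, matching the dump files created above) the invocation would be:

rosrun fps_mapper fps_tracker_gui_app -cam_only -t /camera_1/depth/image_raw -bpr 1 -o output_1 dump_sensor_1.txt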

After that, we can align the local map of each sensor with that of the reference sensor, to obtain their relative pose (translation and rotation offset). This is simply achieved by overlapping their point clouds. To do this, for each sensor k (excluding the reference one, r) run:

roscore
rosrun fps_mapper fps_aligner_gui_app output_k-n.cloud output_r-m.cloud

In the viewer window, you can (un-)select a cloud by clicking on it while holding SHIFT. Moreover, pressing M toggles between mouse and keyboard mode. This is important, since the clouds can be moved with the arrow and page-up/down keys (or rotated, by additionally holding CTRL) in order to provide a decent initial guess for the automatic aligner. The aligner can be started with the X key. Finally, the P key prints the current transformations to the console.

Note:

  • Make sure to move only the non-reference cloud, because the transformations printed to the console are relative to the starting positions.
  • In order to use the aligner, two clouds must be selected. The order of selection matters: the first cloud will be moved, while the second stays in place.

Example results:
camera_k -> camera_r:

0x1745640 -> 0 0 0 0 0 0
0x1745670 -> -0.608002 0.0789886 -0.0411948 -0.213886 0.449015 -0.866391

camera_j -> camera_r:

0x17c7640 -> 0 0 0 0 0 0
0x17c7670 -> -0.215366 -0.00689921 0.0139181 -0.0269052 0.358078 0.160278

 

Save extrinsics to file

Create a text file and copy the transformations obtained in the previous step (for each pair of clouds, the non-zero line printed by the aligner). Then add a rough guess for the transformation from the base frame to the reference sensor, as in the example below:

STATIC_TRANSFORM /camera_left_rgb /camera_front_rgb tq -0.608002 0.0789886 -0.0411948 -0.213886 0.449015 -0.866391
STATIC_TRANSFORM /camera_left_rgb /camera_right_rgb tq -0.215366 -0.00689921 0.0139181 -0.0269052 0.358078 0.160278
STATIC_TRANSFORM /base_link /camera_left_rgb Rt 0 0 1 0.19 -1 0 0 0 0 -1 0 0

Note:

  • The “tq” format above contains six floating point numbers. The first three are the translation (x, y, z); the last three are the x, y, z components of the unit quaternion that defines the orientation (the scalar component w is implied by the unit-norm constraint).
  • The “Rt” format contains 12 floating point numbers, which represent a (homogeneous) 4×4 transformation matrix with the bottom row omitted.
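
As an illustration, assuming the 12 “Rt” values are listed row by row, the STATIC_TRANSFORM line from /base_link to /camera_left_rgb above corresponds to the homogeneous matrix

 0  0  1  0.19
-1  0  0  0
 0 -1  0  0
 0  0  0  1     (implicit bottom row)

i.e., the first three columns form the rotation and the last column is the translation.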

 

Align camera frames to base frame

For the last step, we align the reference camera frame with the base frame of the robot.

Create a dump from the bag file:

rosrun fps_mapper fps_message_dumper_node -t /camera_front/depth/image_raw -t /camera_left/depth/image_raw -t /camera_right/depth/image_raw -base_link_frame_id /base_link -o ext_calib_dump.txt
rosbag play calib_sensors_base1.bag --pause

Run the application:

rosrun fps_mapper fps_extrinsic_calibrator_gui_app -t /camera_left/depth/image_raw -t /camera_front/depth/image_raw -t /camera_right/depth/image_raw -o ext_calib ext_calib_dump.txt