Part 1: creating the dump files
Create dump files from the bag file, storing the calibrated RGBD sensor data.
Start the odom_tf_publisher_node, which publishes the tf topics synchronized with the odometry topic. Make sure that transforms.txt contains the calibrated transformations from the base link to the sensor frames.
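A launch of the tf publisher might look like the sketch below. The package name and the flag names (-transforms, -odom_topic) are assumptions, not the node's documented interface; check the node's own help output for the actual options.

```shell
# Sketch only: package and flag names are illustrative, verify them
# against the node's help output before use.
rosrun <tf_publisher_package> odom_tf_publisher_node \
  -transforms transforms.txt \
  -odom_topic /odom
```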
The thin_scan_matcher publishes the odometry topic based on localization via laser range finder sensors. We use this because it provides a higher accuracy than the robot's wheel-encoder-based odometry.
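An invocation could look as follows; the package name, node name, and flags here are assumptions for illustration and need to be checked against the actual thin_scan_matcher documentation:

```shell
# Sketch only: node and flag names are illustrative. The scan matcher
# consumes the laser topic and publishes the refined odometry topic.
rosrun thin_scan_matcher thin_scan_matcher_node \
  -laser_topic /scan \
  -odom_topic /odom
```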
The dumper node itself simply writes the sensor topics to the dump files.
-t /calib_camera_front/depth/image_raw \
-t /calib_camera_left/depth/image_raw \
-t /calib_camera_right/depth/image_raw \
-t /camera_front/rgb/image_raw \
-t /camera_left/rgb/image_raw \
-t /camera_right/rgb/image_raw \
-base_link_frame_id /base_link \
The driver_node from the easydepthcalibration package will publish the calibrated sensor topics. We need one node for each RGBD sensor.
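Per sensor, this could be wired up roughly as below. The flag names and the calibration file name are assumptions; only the topic names are taken from the dump configuration above.

```shell
# Sketch only: flag names and the calibration file are illustrative.
# One driver_node per RGBD sensor, remapping the raw depth topic to
# its calibrated counterpart used by the dumper.
rosrun easydepthcalibration driver_node \
  -calibration camera_front.calib \
  -input_topic  /camera_front/depth/image_raw \
  -output_topic /calib_camera_front/depth/image_raw &
# repeat analogously for camera_left and camera_right
```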
Finally, we can play the bag file. Make sure to set the rate via -r to a value at which your computer can process all data without dropping any frames.
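For example, to play the recording at half speed (the bag file name here is a placeholder):

```shell
# Half the original rate; lower -r further if frames are still dropped.
rosbag play -r 0.5 recording.bag
```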
After the playback of the bag file has finished, we have a complete dump of all relevant topics and can use our offline tools for further processing.
Part 2: creating the map
The fps_local_mapper_gui_app will create a map based on the odometry and RGBD sensor data. We can use the -shrink and -skip command-line options to speed up the creation of the map.
-o mapper.house_good1.rgb.txt \
-t /camera_front/depth/image_raw \
-rgbt /camera_front/rgb/image_raw \
-t /camera_left/depth/image_raw \
-rgbt /camera_left/rgb/image_raw \
-t /camera_right/depth/image_raw \
-rgbt /camera_right/rgb/image_raw \
-tbb 2.5 -obb 4 -shrink 2 -skip 2 -max_distance 3.5 \