# yak

yak (yet another kinfu) is a library and ROS wrapper for Truncated Signed Distance Fields (TSDFs).

A TSDF is a probabilistic representation of a solid surface in 3D space. It's a useful tool for combining many noisy, incomplete sensor readings into a single smooth and complete model.

To break down the name:

- **Distance field:** Each voxel in the volume contains a value that represents its metric distance from the closest point on the surface. Voxels very far from the surface have high-magnitude distance values, while those near the surface have values approaching zero.

- **Signed:** Voxels outside the surface have positive distances, while voxels inside the surface have negative distances. This allows the representation of solid objects. The distance field becomes a gradient that shifts from positive to negative as it crosses the surface.

- **Truncated:** Only the distance values of voxels very close to the surface are regularly updated. Distances beyond a certain threshold have their values capped at +/- 1. This decreases the cost of integrating new readings, since not every voxel in the volume needs to be updated.
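
To make this concrete, here is a minimal sketch of the per-voxel update rule used by KinFu-style TSDF fusion. This is plain NumPy, not yak's actual CUDA implementation; the truncation distance, weighting scheme, and weight cap are illustrative assumptions:

```python
import numpy as np

def update_voxel(tsdf, weight, sdf, trunc_dist, max_weight=100.0):
    """Fuse one new signed-distance observation into a voxel.

    tsdf, weight: the voxel's current truncated distance and integration weight.
    sdf:          signed distance of this voxel from the observed surface
                  (positive outside the surface, negative inside).
    trunc_dist:   truncation threshold in meters.
    """
    # Truncate: distances beyond the threshold are capped at +/- 1.
    d = np.clip(sdf / trunc_dist, -1.0, 1.0)
    # Weighted running average smooths noise across many readings.
    new_tsdf = (tsdf * weight + d) / (weight + 1.0)
    new_weight = min(weight + 1.0, max_weight)
    return new_tsdf, new_weight

# Repeated noisy observations of a surface 2 cm away converge toward
# d ~ 0.67 for a 3 cm truncation distance.
tsdf, w = 0.0, 0.0
for sdf in np.random.normal(0.02, 0.005, 50):
    tsdf, w = update_voxel(tsdf, w, sdf, trunc_dist=0.03)
```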

yak handles two very different use cases. It can reconstruct from an RGBD camera moved around by a human, without any knowledge of the camera's pose relative to the global frame. It can also reconstruct from a sensor mounted on a robot arm, using pose hints provided by TF and robot kinematics. The idea is that this second case doesn't need to deduce sensor motion by comparing the most recent reading to previous readings via ICP, so it should work better in situations with incomplete sensor readings.
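
In the robot-mounted case, the pose hint amounts to a TF lookup of the camera frame at the time of each depth image. A minimal rospy sketch of that lookup follows; the frame names here are placeholders, not necessarily the frames yak uses:

```python
import rospy
import tf

rospy.init_node('pose_hint_example')
listener = tf.TransformListener()
# Frame names are illustrative; substitute your robot's base and camera frames.
listener.waitForTransform('base_link', 'camera_depth_optical_frame',
                          rospy.Time(0), rospy.Duration(4.0))
# Translation (x, y, z) and rotation quaternion (x, y, z, w) of the camera
# in the robot base frame, derived from joint states via robot kinematics.
trans, rot = listener.lookupTransform('base_link', 'camera_depth_optical_frame',
                                      rospy.Time(0))
```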

The human-guided situation should work out of the box without any other packages. You might need to force the sensor to reset to get the volume positioned in a desirable orientation around your target. The easiest way to do this is to cover and uncover the camera lens.

The robot-assisted situation is currently partially hardcoded to use the sensors and work cell models from the Godel blending project. This will change soon to make it more general!

## yak_meshing

*Aluminum part reconstructed with yak and meshed with yak_meshing*

yak_meshing is a ROS package to mesh TSDF volumes generated by Kinect Fusion-like packages.

Meshing happens through the `/get_mesh` service, which in turn calls the kinfu_ros `/get_tsdf` service. `yak_meshing_node` expects a serialized TSDF voxel volume: a list of TSDF values and weights for every occupied voxel, along with a list of the coordinates of each occupied voxel. OpenVDB's voxel meshing algorithm generates a triangular mesh along the zero-value isosurface of the TSDF volume. The mesh is saved as a `.obj` file, which can be viewed and manipulated in a program like MeshLab or Blender.
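
The meshing step itself is conceptually simple. Here is a minimal sketch using OpenVDB's Python bindings (pyopenvdb), assuming you already have the occupied-voxel coordinates and TSDF values described above; the function name and parameters are illustrative, not yak_meshing's actual API:

```python
import pyopenvdb as vdb

def mesh_tsdf(coords, values, out_path='mesh.obj'):
    """Mesh the zero isosurface of a sparse TSDF and write a Wavefront .obj.

    coords: iterable of (i, j, k) voxel indices; values: TSDF value per voxel.
    """
    # A background of +1.0 marks unobserved space as "outside" the surface.
    grid = vdb.FloatGrid(1.0)
    acc = grid.getAccessor()
    for ijk, d in zip(coords, values):
        acc.setValueOn(ijk, d)
    # Extract a polygon mesh along the zero-crossing of the distance field.
    points, triangles, quads = grid.convertToPolygons(isovalue=0.0)
    with open(out_path, 'w') as f:
        for p in points:
            f.write('v {} {} {}\n'.format(p[0], p[1], p[2]))
        for t in triangles:  # .obj face indices are 1-based
            f.write('f {} {} {}\n'.format(t[0] + 1, t[1] + 1, t[2] + 1))
        for q in quads:
            f.write('f {} {} {} {}\n'.format(q[0] + 1, q[1] + 1,
                                             q[2] + 1, q[3] + 1))
```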

## nbv_planner

*Candidate poses generated and evaluated by nbv_planner*

nbv_planner is a ROS package that performs Next-Best-View (NBV) analysis using data provided by RGBD cameras like the Asus Xtion. It uses octomap to track voxel occupancy and integrate new readings.

Call the `/get_nbv` service to get a sorted list (best to worst) of candidate poses near the volume that could expose unknown voxels. Currently, poses are evaluated by casting rays corresponding to the camera's field of view into the octomap. More hits on unknown voxels = a better view.
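
The scoring idea can be sketched in a few lines. This toy version steps rays through a voxel grid stored as a dict (missing keys = unknown, mirroring octomap's treatment of unknown space) and counts unknown-voxel hits per candidate pose; the helper name and parameters are illustrative, not nbv_planner's actual API:

```python
import numpy as np

def score_view(origin, directions, known_voxels, voxel_size=0.05, max_range=2.0):
    """Count unknown voxels hit by rays cast from a candidate camera pose.

    known_voxels: dict mapping (i, j, k) -> occupancy probability; voxels
    absent from the dict are unknown, as in an octomap.
    """
    hits = 0
    steps = int(max_range / voxel_size)
    for d in directions:                      # one ray per sampled pixel
        d = d / np.linalg.norm(d)
        for s in range(1, steps + 1):
            p = origin + d * s * voxel_size
            key = tuple((p / voxel_size).astype(int))
            if key not in known_voxels:
                hits += 1                     # unknown voxel: information gain
            elif known_voxels[key] > 0.5:
                break                         # occupied voxel blocks the ray
    return hits

# Rank candidate poses best-to-worst by unknown-voxel hits, e.g.:
# ranked = sorted(candidates, key=lambda c: score_view(c[0], c[1], vox),
#                 reverse=True)
```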

Running `rosrun nbv_planner exploration_controller_node` starts an exploration routine that tries to move the robot to views that expose unknown regions of a user-specified volume, using the NBV evaluation described above. The octomap server must already be running.

## Operating Instructions for Human-Guided Reconstruction

1. Start the TSDF/KinFu processes: `roslaunch yak launch_xtion_default.launch`
2. Launch the drivers for the RGBD camera. For the Asus Xtion, this is `roslaunch openni2_launch openni2.launch`.
3. Start mapping! Since yak doesn't have any way to relate the pose of the camera to the global frame, the volume will initially be centered in front of the camera. You might have to force the camera to reset a few times to get the volume positioned where you need it.
4. When you decide that the reconstruction is good enough: `rosservice call /get_mesh`

## Operating Instructions for Autonomous Exploration and Reconstruction

1. If you intend to use an actual robot, make sure that its state and motion servers are running, and that autonomous motion is allowed (deadman switch engaged, or auto mode).
2. roslaunch a MoveIt planning/execution launch file. My command looks like: `roslaunch godel_irb2400_moveit_config moveit_planning_execution.launch robot_ip:=192.168.125.1 sim:=False use_ftp:=False`. Wait for RViz and MoveIt to start up.
3. Launch the TSDF reconstruction nodes. For example: `roslaunch yak launch_xtion_robot.launch`
4. Launch the drivers for the RGBD camera. For the Asus Xtion, this is `roslaunch openni2_launch openni2.launch`.
5. Start the octomap server: `roslaunch nbv_planner octomap_mapping.launch`
6. When you want to start exploration: `rosrun nbv_planner exploration_controller_node`
7. When you decide that the reconstruction is good enough: `rosservice call /get_mesh`

## Build with Docker

```bash
nvidia-docker run -v "<absolute path to your yak workspace>:/yak_ws" rosindustrial/yak:kinetic catkin build --workspace /yak_ws -DCMAKE_LIBRARY_PATH=/usr/local/nvidia/lib64/
```