User Manual
This procedure is repeated four times in total. Following the suggestion to observe the grid from above,
left, front, and right, as sketched in Fig. 6.7.6, in this example the following camera images
were sent to the hand-eye calibration component together with their associated robot poses:
Fig. 6.7.8: Recorded camera images as input for the calibration procedure
Step 3: Calculating and saving the calibration transformation
The final step in the hand-eye calibration routine consists of requesting the desired calibration transformation to be
computed from the collected poses and camera images. The REST-API offers this functionality via the calibrate
service call (see Services, Section 6.7.5). Depending on the way the rc_visard is mounted, this service computes
and returns the transformation (i.e., the pose) between the camera frame and either the user-defined robot frame
(robot-mounted sensor) or the user-defined external reference frame ext (statically mounted sensor); see Sensor
mounting (Section 6.7.2).
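As an illustration, the calibrate service could be invoked from a small script. The following Python sketch is only an example under explicit assumptions: the endpoint URL and the field names of the JSON reply are placeholders and must be checked against the actual service interface documented in Services (Section 6.7.5).

```python
import json
import urllib.request


def call_calibrate(host):
    """Issue the calibrate service call via the REST-API.

    NOTE: the endpoint path below is an assumption; see Services
    (Section 6.7.5) for the exact URL of the hand-eye calibration
    component on your device.
    """
    url = ("http://%s/api/v1/nodes/rc_hand_eye_calibration"
           "/services/calibrate" % host)
    req = urllib.request.Request(
        url, data=b"{}",
        headers={"Content-Type": "application/json"},
        method="PUT")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def extract_result(reply):
    """Pull the computed pose and the calibration error (in pixels)
    out of a calibrate reply; the field names used here are assumed,
    not authoritative."""
    response = reply["response"]
    return response["pose"], response["error"]
```

The returned pose is the calibration transformation described above; the error field is the reprojection error discussed next.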
To enable users to judge the quality of the resulting calibration transformation, the component also reports a calibration
error. This value is measured in pixels and denotes the root mean square of the reprojection error, averaged
over all calibration slots and all corners of the calibration grid. However, for a more intuitive understanding, this
measurement can be normalized using the rc_visard's focal length 𝑓 in pixels:

𝐸 = 𝐸_camera / 𝑓.
Note: The rc_visard reports a focal length factor via its various interfaces. It is relative to the image width in
order to support different image resolutions. The focal length 𝑓 in pixels is easily obtained by multiplying the
focal length factor by the image width in pixels.
The value 𝐸 can now be interpreted as an object-related error in meters in the 3D world. Given that the distance
between the calibration grid and the rc_visard is one meter, the average accuracy associated with transforming
the grid’s coordinates from the camera frame to the target frame is 1 · 𝐸 m; assuming a distance of 0.5 meters, it
measures 0.5 · 𝐸 m, etc.
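This normalization can be sketched in a few lines of Python. The numeric inputs below (a focal length factor of 0.844 at an image width of 1280 pixels) are purely illustrative assumptions, not values read from an actual device:

```python
def calibration_accuracy(e_camera_px, focal_length_factor,
                         image_width_px, distance_m):
    """Convert the reported reprojection error (pixels) into an
    object-related error (meters) at a given working distance."""
    f = focal_length_factor * image_width_px  # focal length in pixels
    e = e_camera_px / f                       # normalized, dimensionless error
    return distance_m * e                     # object-space error in meters


# Illustrative values: a 0.4 px error with an assumed focal length
# factor of 0.844 at 1280 px image width, evaluated at 1 m and 0.5 m.
err_1m = calibration_accuracy(0.4, 0.844, 1280, 1.0)
err_05m = calibration_accuracy(0.4, 0.844, 1280, 0.5)
```

As expected from the formula, halving the working distance halves the object-space error.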
Web GUI example: The Web GUI automatically triggers computation of the calibration result immediately after
taking the last of the four pictures. The user just needs to click the Next button to proceed to the result. In
this example with a statically mounted rc_visard, the resulting output is the pose of the sensor’s left camera
in the world coordinate system of the robot – represented in the pose format as specified in step 1 of the
calibration routine.
The reported error of 𝐸_camera = 0.4 pixels in Fig. 6.7.9 transforms into a calibration accuracy of
𝐸 = 𝐸_camera / 𝑓 ≈ 0.4 / 1081.46 ≈ 0.00036, which is 0.36 mm at 1 meter distance – submillimeter accuracy for this
calibration run.