Datasheet
2.3. Sensors
The robot is equipped with perception, navigation, interaction, environment and low-level safety
sensors. For perception, the robot carries cameras providing medium-range perception (<25 m) and
short-range perception (1 to 2 m). For navigation, the robot uses encoders to control the velocity of
the motors, an inertial sensor with GPS and a stereo camera to determine its position and orientation
in the environment, lasers to detect obstacles and to build maps, and a standalone camera for guidance
missions. For interaction, the robot uses a standalone camera for people tracking, face analysis and
body gesture recognition, as well as microphones. For environmental sensing, the robot will be
equipped with temperature, humidity and rain sensors. Finally, the bumpers and sonar sensors will
provide low-level safety sensing.
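As a rough illustration of how the bumpers and sonar sensors can feed a low-level safety check, the sketch below requests an emergency stop when a bumper is pressed or any sonar range falls under a threshold. The threshold, function name and input format are illustrative assumptions and are not part of the FROG platform software.

# Illustrative low-level safety check; threshold and interfaces are assumed,
# not taken from the FROG platform software.

SONAR_STOP_DISTANCE_M = 0.30  # assumed stop threshold in metres


def safety_stop_required(sonar_ranges_m, bumper_pressed):
    """Return True if the platform should perform an emergency stop."""
    if bumper_pressed:
        return True
    # Ignore zero/negative readings, which usually mean "no echo received".
    return any(0.0 < r < SONAR_STOP_DISTANCE_M for r in sonar_ranges_m)


print(safety_stop_required([1.2, 0.8, 0.25], bumper_pressed=False))  # True
print(safety_stop_required([1.2, 0.8, 0.90], bumper_pressed=False))  # False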
The sensors used on board are listed below.
2.3.1. Perception Sensors
The robot uses several cameras for navigation in the environment, feature identification, pedestrian
detection and tracking, body orientation estimation, face analysis and body gesture recognition. The
cameras can also be used to detect changes in the surrounding environment.
• Front Stereo Vision Cameras: Dalsa Genie-HM1400 XDR¹ (2x)
• Function: localization, obstacle detection, pedestrian detection and body orientation estimation
• Position on Robot Platform: looking ahead; 1.2m high
• Front Vision Camera: Dalsa Genie-HM1400 XDR (1x)
• Function: face analysis and fine body gesture recognition
• Position on Robot Platform: looking ahead; 1.2m high
• Rear Vision Camera: HD Webcam
• Function: AR and people tracking for guidance mission
• Position on Robot Platform: looking back; 1.2m high
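As an illustration of how the front stereo pair can provide the range information used for obstacle and pedestrian detection, the sketch below computes a disparity map with OpenCV block matching and converts it to metric depth. The image file names, focal length and baseline are placeholder assumptions, not the calibration values of the Genie-HM1400 XDR pair.

import cv2
import numpy as np

FOCAL_LENGTH_PX = 1400.0  # assumed focal length in pixels (placeholder)
BASELINE_M = 0.20         # assumed distance between the two front cameras (placeholder)

# Rectified left/right frames; file names are placeholders.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
assert left is not None and right is not None, "provide a rectified image pair"

# Block-matching disparity; OpenCV returns 16x fixed-point values.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

# Depth from disparity: Z = f * B / d (valid where d > 0).
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = FOCAL_LENGTH_PX * BASELINE_M / disparity[valid]

print("closest valid range: %.2f m" % depth_m[valid].min())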
2.3.2. Navigation Sensors
The robot will navigate in the environment by fusing measurements provided by different sensors.
Outdoors, the robot will be able to use the stereo pair, the lasers, GPS (where available), encoder
odometry and the inertial sensor to estimate its position and orientation. For obstacle avoidance,
mapping and localization it uses the lasers and sonar sensors. The front stereo vision cameras will
also help in navigation by detecting obstacles and people.
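The sketch below shows this kind of fusion in its simplest form: the position estimate is dead-reckoned from odometry and blended with GPS fixes whenever one is available. It is a toy complementary filter with an assumed gain, not the localization method actually running on the platform.

import numpy as np

GPS_BLEND_GAIN = 0.2  # assumed weight given to each GPS fix


def fuse_step(position, odom_delta, gps_fix=None):
    """Advance a 2-D position estimate by one time step."""
    position = position + odom_delta  # predict from wheel odometry / inertial data
    if gps_fix is not None:           # correct when a GPS fix is available
        position = (1.0 - GPS_BLEND_GAIN) * position + GPS_BLEND_GAIN * gps_fix
    return position


pos = np.array([0.0, 0.0])
pos = fuse_step(pos, np.array([0.10, 0.02]))                                 # odometry only
pos = fuse_step(pos, np.array([0.10, 0.02]), gps_fix=np.array([0.25, 0.0]))  # with a GPS fix
print(pos)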
• Inertial Sensor IMU with GPS: Xsens MTI-G
• Function: localization estimation (position and orientation)
• Position on Robot Platform: in the robot’s centre of rotation
• Front 2D laser rangefinder: Hokuyo’s UTM-30LX
• Function: mapping, localization and obstacle avoidance
• Position on Robot Platform: frontal and horizontal
• Front 2D laser rangefinder: Hokuyo’s UTM-30LX
1 http://www.teledynedalsa.com/imaging/products/cameras/area-scan/genie/CR-GM0X-H140X/