Working with VRML Sensors
In this section...
“Add VRML Sensors to Virtual Worlds” on page 3-22
“Interactive Mode” on page 3-23
“Read VRML Sensor Values” on page 3-23
Add VRML Sensors to Virtual Worlds
This section describes how to interface a Simulink block diagram to sensors in a virtual
reality scene, and how to read signals from the virtual world into a simulation model
programmatically.
Virtual reality scenes can contain sensors: nodes that generate events and output values
depending on time, user navigation and actions, and distance changes in the scene. These
nodes add interactivity to the virtual world. You can use Simulink 3D Animation functions
to read sensor field values into simulation models and to control the simulation based on
user interaction with the virtual scene.
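For example, you can read a sensor value from MATLAB by synchronizing one of the sensor's
output fields. The following sketch assumes a scene file named sensor_scene.wrl that
contains a TouchSensor with the DEF name Touch_Sensor; both names are placeholders for
illustration, not a shipped example.

w = vrworld('sensor_scene.wrl');  % associate a vrworld object with the scene file
open(w);                          % open the virtual world
view(w);                          % open a viewer so you can interact with the scene
ts = vrnode(w, 'Touch_Sensor');   % handle to the DEF-named TouchSensor node
sync(ts, 'isActive', 'on');       % keep the isActive output synchronized with MATLAB
pause(10);                        % click the sensed geometry in the viewer meanwhile
active = ts.isActive              % read the current, synchronized value of the field
close(w);
delete(w);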
You can define the following VRML sensors in the scene:
Sensors           Description
CylinderSensor    Maps pointer motion (for example, a mouse or wand) into a
                  rotation on an invisible cylinder that is aligned with the
                  y-axis of the local coordinate system.
PlaneSensor       Maps pointing device motion into two-dimensional translation
                  in a plane parallel to the z=0 plane of the local coordinate
                  system.
ProximitySensor   Generates events when the viewer enters, exits, and moves
                  within a region in space (defined by a box).
SphereSensor      Maps pointing device motion into spherical rotation about the
                  origin of the local coordinate system.
TimeSensor        Generates events as time passes.
TouchSensor       Tracks the location and state of the pointing device and
                  detects when you point at geometry contained by the
                  TouchSensor node's parent group.
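Sensors are typically declared directly in the scene (.wrl) file. As a sketch of an
alternative, the following MATLAB code adds a TouchSensor to an existing grouping node
and lists the fields you can read from it; the file name sensor_scene.wrl and the DEF
names Button_Group and Button_Touch are assumptions for illustration.

w = vrworld('sensor_scene.wrl');
open(w);
grp = vrnode(w, 'Button_Group');                              % existing DEF-named grouping node
ts = vrnode(grp, 'children', 'Button_Touch', 'TouchSensor');  % create a TouchSensor as its child
fields(ts)                                                    % list the sensor fields available to read
save(w, 'sensor_scene_touch.wrl');                            % save the modified scene to a new file
close(w);
delete(w);

Because the new TouchSensor is a child of Button_Group, it senses pointing at the
geometry contained in that grouping node.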