Mutual gravity accelerations of the bodies are computed using Simulink matrix-type data
support.
Plane Takeoff Example (vrtkoff)
The vrtkoff example represents a simplified aircraft taking off from a runway. Several
viewpoints are defined in this model, both static and attached to the plane, allowing you
to see the takeoff from various perspectives.
The model shows the technique of combining several objects imported or obtained from
different sources (CAD packages, general 3-D modelers, and so on) into a virtual reality
scene. You usually need to wrap such imported objects in an additional VRML
Transform node. This wrapper lets you set the scaling, position, and orientation of the
objects appropriately so that they fit into the scene. In this example, the aircraft
model from the Ligos V-Realm Builder Object Library is incorporated into the scene. The
file vrtkoff2.wrl uses the same scene with a different type of aircraft.
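For illustration, a Transform wrapper of this kind might look like the following VRML fragment. This is a minimal sketch, not an excerpt from vrtkoff.wrl; the node name, file name, and field values are hypothetical and would be adjusted to the actual model and scene units.

    # Wrapper Transform around an imported object so that it can be
    # scaled, positioned, and oriented independently of its source file.
    DEF Plane Transform {
      scale       0.1 0.1 0.1      # shrink the imported model to scene units
      translation 0 0 50           # place the aircraft at the start of the runway
      rotation    0 1 0 1.5708     # turn the nose to point along the runway axis
      children [
        Inline { url "aircraft_model.wrl" }  # object imported from a CAD tool or 3-D modeler
      ]
    }

Changing the fields of the wrapper node moves or rescales the whole imported object without editing the imported file itself.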
Plane Take-Off with Trajectory Tracing Example (vrtkoff_trace)
The vrtkoff_trace example is a variant of vrtkoff that illustrates how to trace
the trajectory of a moving object (plane) in a scene. It uses a VR Tracer block. Using a
predefined sample time, this block allows you to place markers at the current position
of an object. When the simulation stops, the markers indicate the trajectory of the
object. This example uses an octahedron as the marker.
Plane Take-Off with HUD Text Example (vrtkoff_hud)
The vrtkoff_hud example illustrates how to display signal values as text in the virtual
world and how to create a simple head-up display (HUD). It is a variant of the vrtkoff example.
The example sends the text to a virtual world using the VR Text Output block. This block
formats the input vector using the format string defined in its mask (see sprintf for
more information) and sends the resulting string to the 'string' field of the associated
VRML text node in the scene.
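As an illustration of the connection between the block and the scene, the virtual world might contain a Text node such as the one sketched below, and the block mask might use a format string such as 'Altitude: %0.2f m'. The node name, field values, and format string are hypothetical, not taken from vrtkoff_hud.wrl.

    Shape {
      geometry DEF Altitude_Text Text {
        string ""                      # overwritten at run time by the VR Text Output block
        fontStyle FontStyle {
          size    2
          justify "MIDDLE"             # center the text on the node origin
        }
      }
    }

At each sample hit, the block applies the format string to its input vector (in the same way sprintf does) and writes the resulting string to the string field of Altitude_Text.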
The example achieves HUD behavior (maintaining constant relative position between
the user and the Text node) by defining a ProximitySensor. This sensor detects the
user's position and orientation as the user navigates through the scene, and this
information is routed to the translation and rotation fields of the HUD object (in this
case, a VRML Transform that contains the Text node).
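In VRML terms, the wiring typically looks like the following sketch. The node names and field values are hypothetical and not necessarily those used in vrtkoff_hud.wrl; the essential points are a ProximitySensor large enough to cover the navigable area and the two ROUTE statements.

    DEF HUD_Sensor ProximitySensor {
      center 0 0 0
      size   1000 1000 1000            # must enclose the region the user can navigate in
    }
    DEF HUD_Transform Transform {      # follows the viewer; its children form the HUD
      children [
        Transform {
          translation 0 0.1 -0.5       # small offset so the text sits in front of the viewer
          children [
            Shape { geometry DEF HUD_Text Text { string "" } }
          ]
        }
      ]
    }
    ROUTE HUD_Sensor.position_changed    TO HUD_Transform.set_translation
    ROUTE HUD_Sensor.orientation_changed TO HUD_Transform.set_rotation

Because the sensor's position_changed and orientation_changed events mirror the viewer's own movement, copying them into the outer Transform keeps the Text node at a constant position and orientation relative to the user.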