User Manual

Introduction to Tracking
Tracking is one of the most useful and essential techniques available to a compositor. It can
roughly be defined as analyzing a specific area of a clip over time to create a motion path.
Fusion provides a variety of tracking nodes that let you analyze different kinds of motion.
Each tracker type has its own chapter in this manual. This chapter covers tracking
techniques using the Camera Tracker node.
What Is 3D Camera Tracking?
Camera tracking is used for match moving, and it’s a vital link between 2D scenes and 3D
scenes, allowing compositors to integrate 3D CGI elements into live-action clips. The Camera
Tracker node calculates the path of a live-action camera and generates a virtual camera in 3D
space. This virtual camera is intended to be identical to the actual camera that shot the scene,
not only in terms of motion but in matching the lens focal length as well. The calculated position
and movement of the virtual camera are central to realistically compositing 3D elements with
live action.
[Figure: An example of 3D elements integrated into a live-action scene.]
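Matching the real lens can be made concrete with the pinhole relationship between focal length, sensor size, and angle of view. The sketch below is purely illustrative and is not part of Fusion; the sensor width and focal length values are assumptions chosen for the example:

```python
import math

def horizontal_fov_degrees(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal angle of view for an ideal pinhole camera.

    The solver's virtual camera must match the real lens; given the
    sensor width and focal length, the field of view follows from
    simple trigonometry: fov = 2 * atan(sensor_width / (2 * focal)).
    """
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

# Example: a Super 35-style sensor (~24.89 mm wide) with a 35 mm lens.
fov = horizontal_fov_degrees(24.89, 35.0)  # roughly 39 degrees
```

This is why supplying sensor size and focal length metadata helps the solver: the two values together fix the camera's angle of view, removing one major unknown from the reconstruction.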
How Camera Tracking Works
Camera tracking begins by tracking the movement of fixed features from one frame to the next.
To put it another way, camera tracking algorithms follow features that are “nailed to the set.”
Objects in the scene that move independently of the camera, such as passing cars or
walking people, produce poor tracks, so masks can be used to exclude those areas from the
analysis and improve the results. Additionally, it is helpful to provide specific
camera metadata, such as the sensor size and the focal length of the lens. This information
guides the scene reconstruction calculation, called a solver, toward generating a more accurate
virtual camera.
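The idea that fixed features trace coherent 2D paths as the camera moves can be sketched with a toy pinhole projection. This is not Fusion's internal algorithm, only an illustration of the image-space "tracks" a solver analyzes; all point and camera values are invented for the example:

```python
def project(point, cam_x, focal=1.0):
    """Project a 3D point (x, y, z) through a pinhole camera located at
    (cam_x, 0, 0) and looking down +z. Returns (u, v) image coordinates."""
    x, y, z = point
    return (focal * (x - cam_x) / z, focal * y / z)

# Two static scene features "nailed to the set" (x, y, z), in arbitrary units.
scene = [(0.0, 0.0, 5.0), (1.0, 0.5, 4.0)]

# The camera translates along x in small steps; each feature's projected
# positions over time form its 2D track.
tracks = {i: [project(p, step * 0.1) for step in range(5)]
          for i, p in enumerate(scene)}
```

Every track here shifts consistently with the camera's motion, which is what the solver exploits. A feature on an independently moving object, such as a passing car, would violate that consistency and degrade the solve, which is why such areas are masked out.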
The Camera Tracker’s purpose is to create a 3D animated camera and point cloud of the scene.
A point cloud is a large group of points generated by the solver that roughly recreates the 3D
positions of the tracked features in a scene. The point cloud can then be used as a guide when
integrating other 2D or 3D elements alongside live-action features.
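How a solver can turn 2D tracks into 3D point-cloud positions can be illustrated with the simplest possible case: one feature observed from two known camera positions. The sketch below uses classic stereo disparity, not Fusion's actual solver, and every value in it is an assumption for the example:

```python
def triangulate_depth(u_first, u_second, baseline, focal=1.0):
    """Depth of a feature seen from two pinhole cameras separated by
    `baseline` along x: z = focal * baseline / disparity, where the
    disparity is the difference between the two image x-coordinates."""
    disparity = u_first - u_second
    return focal * baseline / disparity

# A feature at x=1.0, depth z=4.0, observed from cameras at x=0 and x=0.5:
# its projected x-coordinates are x/z and (x - baseline)/z respectively.
depth = triangulate_depth(1.0 / 4.0, 0.5 / 4.0, baseline=0.5)  # recovers 4.0
```

A real solve repeats this kind of calculation for hundreds of tracked features across many frames, and must estimate the camera positions at the same time, which is why the resulting point cloud only roughly recreates the scene.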
Chapter 77 – 3D Camera Tracking