User Manual

Matching the Live-Action Camera
Once you have completed tracking, the next stage of this workflow uses the controls in the
Camera tab. This is where you describe the actual camera used on set, primarily its film gate
size and focal length. This information should have been logged on set and made available to
post-production. When using camera-original media, you can sometimes find this information
in the file metadata.
To locate camera metadata, do one of the following:
- If you are using DaVinci Resolve, select the MediaIn node with the camera clip, open the Metadata Editor, and view the Camera metadata preset.
- If you are using Fusion Studio, display the metadata subview from the viewer toolbar.
If the actual values are not known, make a best guess. The solver attempts to find a camera
near these parameters, so providing parameters as close as possible to the live-action camera
helps it succeed. The more accurate the information you provide, the more accurate the solver's
calculation. At a minimum, choose the correct camera model from the Film Gate menu. If the
film gate is incorrect, the chances that the Camera Tracker correctly calculates the lens focal
length become very low.
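The film gate and focal length matter because together they determine the camera's field of view, which is what the solver is effectively estimating. The following is a rough illustrative sketch, not part of this manual's workflow; the Super 35 gate width used below is an assumed example value:

```python
import math

def horizontal_fov(film_gate_mm: float, focal_length_mm: float) -> float:
    """Horizontal field of view in degrees from the film gate (sensor) width
    and the lens focal length, using the standard pinhole relation."""
    return math.degrees(2 * math.atan(film_gate_mm / (2 * focal_length_mm)))

# Example: an assumed Super 35 gate (~24.89 mm wide) with a 35 mm lens.
fov = horizontal_fov(24.89, 35.0)
print(f"{fov:.1f} degrees")
```

Note how a wrong gate width skews the focal length the solver would need to reach the same field of view, which is why choosing the correct Film Gate entry matters so much.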
The Camera tab in the Camera Tracker tool.
Unlike the Track and Solve tabs, the Camera tab does not include a button at the top of the
Inspector that executes a process; there is nothing to run once you configure the camera
settings. After you set them to match the live-action camera, move on to the Solve tab.
Running the Solver
The next step in this workflow uses the controls found in the Solve tab. Solving is a
compute-intensive process in which the Camera Tracker analyzes the existing tracks to create
a 3D scene. It generates a virtual camera that matches the live action and a point cloud of 3D
locators that recreate the tracked features in 3D space. The analysis is based on parallax in the
frame: the perception that features closer to the camera move faster than features farther
away, much like looking out the side window of a car and seeing distant objects move more
slowly than items near the roadside.
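Under a simple pinhole-camera model (an illustration of the parallax cue, not the solver's actual algorithm), the image-plane shift of a static feature during a sideways camera move is inversely proportional to the feature's depth:

```python
def image_shift_mm(focal_length_mm: float, camera_move_m: float, depth_m: float) -> float:
    """Lateral image-plane shift of a static point for a sideways camera
    translation, in the pinhole model: shift = focal_length * move / depth."""
    return focal_length_mm * camera_move_m / depth_m

# A 10 cm sideways camera move with a 35 mm lens:
near = image_shift_mm(35.0, 0.1, 2.0)    # feature 2 m from the camera
far = image_shift_mm(35.0, 0.1, 50.0)    # feature 50 m from the camera
print(near, far)  # the near feature shifts about 25x farther on the image plane
```

This depth-dependent difference in apparent motion is exactly the signal the solver uses to place each tracked feature in 3D space.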
The tracks created in the Track phase of this workflow have a great deal to do with the success
or failure of the solver, making it critical to deliver the best possible set of tracking points from
the start. Although the masks you create to occlude objects from being tracked help omit
problematic tracking points, you almost always need to further filter and delete poor-quality
tracks in the Solve tab. That is why, from a user's point of view, solving should be thought of as
an iterative process.
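The iterative filter-and-resolve loop can be sketched as follows; the track list, error values, and threshold are invented for illustration and are not Fusion data structures or defaults:

```python
# Hypothetical track records: (track_id, reprojection_error_in_pixels).
tracks = [("t1", 0.4), ("t2", 3.2), ("t3", 0.9), ("t4", 7.5), ("t5", 1.1)]

def filter_tracks(tracks, max_error_px):
    """Keep only tracks whose reprojection error is at or below the threshold,
    mimicking the delete-poor-tracks-then-resolve loop done in the Solve tab."""
    return [t for t in tracks if t[1] <= max_error_px]

kept = filter_tracks(tracks, max_error_px=1.5)
print([tid for tid, err in kept])  # only the low-error tracks survive
```

In practice you would tighten the threshold, re-run the solve, inspect the new errors, and repeat until the overall solve quality stops improving.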
Chapter 77 - 3D Camera Tracking