Motion Blur
Motion blur is also a serious problem for the reason explained in the previous point. The
Disparity and Optical Flow algorithms are unsure whether to assign a pixel within the motion
blur to the moving object or to the background. Because the algorithms used are global in
nature, not only will the vectors within the motion-blurred region be wrong, but the errors will
also confuse the algorithm in regions close to the motion blur.
Depth of Field
Depth of field causes a problem related to the two above. It occurs when a defocused
foreground object sits over a background that is moving (Optical Flow case) or that shifts
between the left and right eyes (Stereo Disparity case). The blurred edges confuse the
tracking algorithms because they cannot determine that the edges actually belong to two
separate objects.
Where to Calculate Disparity and Optical Flow?
Where you choose to generate optical flow or disparity in your composition can drastically
affect the results.
For example, if you have composited a lens flare into the shot, it is better to compute the
OpticalFlow/Disparity before the flare is added, since the semi-transparent lens flare will
confuse the tracking algorithms.
If you are color correcting the left/right eyes to match each other, or deflickering, it is better to
compute the OpticalFlow/Disparity after those corrections, since the tracking algorithm finds
matches more easily when the colors match between frames (see the sketch below).
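
To see why matching colors first helps, here is a minimal sketch. It uses a brute-force
sum-of-squared-differences (SSD) block matcher as a stand-in for the real tracking algorithm
(this is not Fusion's actual method) on a smooth 1-D gradient, where a brightness mismatch is
most damaging:

    import numpy as np

    # A smooth gradient (like a sky), where brightness mismatches hurt most.
    frame_a = np.linspace(0.0, 1.0, 200)
    frame_b = np.roll(frame_a, 5)      # same content, shifted 5 pixels
    frame_b_bright = frame_b + 0.3     # the same frame, 0.3 brighter

    def best_shift(a, b, max_shift=10, block=40, start=80):
        # Brute-force SSD match of one block of `a` against shifts of `b`.
        ref = a[start:start + block]
        errs = [np.sum((ref - b[start + s:start + s + block]) ** 2)
                for s in range(-max_shift, max_shift + 1)]
        return int(np.argmin(errs)) - max_shift

    # With mismatched brightness the matcher slides to compensate for the
    # brightness difference and reports a wrong shift (-10 here, not 5).
    print(best_shift(frame_a, frame_b_bright))

    # After a crude "color match" (removing each frame's mean), it recovers
    # the true 5-pixel shift.
    print(best_shift(frame_a - frame_a.mean(),
                     frame_b_bright - frame_b_bright.mean()))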
If you are removing lens distortion, think carefully about whether you want to do it before or
after Disparity computation. If you do it after, your Disparity map will also act as a lens distortion
map, combining the two effects as one.
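
One way to see why the two effects fuse into a single map: if the lens distortion moves a pixel
at x by u(x), and the true disparity then moves it by d evaluated at its new position, the
combined map is u(x) + d(x + u(x)) rather than a clean disparity. A toy 1-D composition with
made-up u and d (not real lens or disparity data):

    import numpy as np

    x = np.arange(10, dtype=float)
    u = 0.05 * x                # toy distortion displacement (made up)
    d = 2.0 + 0.2 * x           # toy disparity that varies across the image

    # The distortion lands pixel x at x + u(x); the disparity is then sampled
    # at that new position, so the measured map is a composition of the two.
    combined = u + np.interp(x + u, x, d)   # u(x) + d(x + u(x))

    print(np.round(combined, 3))
    print(np.round(u + d, 3))   # a naive sum differs at interior pixels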
As a general rule of thumb, it is best to use OpticalFlow/Disparity before any compositing
operations except an initial color matching correction and a lens distortion removal.
Cropping the Source
As a general tip, if you are cropping down your input images for any reason, it is probably better
to compute the optical flow or disparity before the crop and then afterward crop the flow/
disparity along with the color.
The reason is that flow/disparity matching works well when there is common pixel data to
match in both frames, but when pixels show up in just one frame (or one eye), the
Disparity/OpticalFlow nodes must guess and fill in the data. The biggest occlusions between
the left and right eyes are usually pixels along the left/right edges of the images that are
pushed outside the frame in the other eye. The same applies to optical flow when the camera
is moving.
Another thing to be aware of is black borders around the edges of your frames, which you
should crop away.
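
As a sketch of the compute-then-crop order, assuming the flow is available as a per-pixel
(dx, dy) array (NumPy arrays stand in for Fusion images here):

    import numpy as np

    h, w = 1080, 1920
    color = np.zeros((h, w, 3), dtype=np.float32)  # full, uncropped frame
    flow = np.zeros((h, w, 2), dtype=np.float32)   # flow computed on full frames

    # Crop window (for example, trimming a black border); use the same
    # slice for the color and the flow so they stay registered.
    top, bottom, left, right = 20, 1060, 30, 1890
    color_cropped = color[top:bottom, left:right]
    flow_cropped = flow[top:bottom, left:right]

    # The vectors themselves need no adjustment: each one is a relative
    # offset from a pixel to its match, and cropping both frames identically
    # leaves those offsets unchanged. Vectors that now point outside the
    # crop window simply become occlusions, like pixels leaving the frame.
    assert color_cropped.shape[:2] == flow_cropped.shape[:2]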
Nodes with Multiple Outputs
Many of the stereo nodes in the Fusion toolset have multiple outputs, which can confuse new
users. One particularly confusing point is that when you drag a stereo node to the view, it
always displays the left output. There is no way to view the right output without connecting
another node, such as a BrightnessContrast (BC), to the right output and viewing that
node instead.
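
For example, from a script you can attach a BrightnessContrast to the right output and view
that node. The sketch below uses Fusion's Python scripting; note that the scripting name of
the right output (written here as OutputRight) is an assumption, so list the node's actual
outputs with GetOutputList() first:

    # Run from Fusion's console or a comp script; `fusion` is provided there.
    comp = fusion.GetCurrentComp()

    disp = comp.FindTool("Disparity1")   # an existing stereo node in the comp
    print(disp.GetOutputList())          # confirm the right output's real name

    # Add a BrightnessContrast at the default position and feed it the right
    # output; "OutputRight" is an ASSUMED name, check the list printed above.
    bc = comp.AddTool("BrightnessContrast", -32768, -32768)
    bc.Input = disp.OutputRight

    # Dragging `bc` to a viewer now displays the right-eye image.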