User manual
Table of Contents
- Table of Contents
- Part I: Getting into the details
- About this manual
- Setting up your system
- VST Connections
- The Project window
- Working with projects
- Creating new projects
- Opening projects
- Closing projects
- Saving projects
- The Archive and Backup functions
- Startup Options
- The Project Setup dialog
- Zoom and view options
- Audio handling
- Auditioning audio parts and events
- Scrubbing audio
- Editing parts and events
- Range editing
- Region operations
- The Edit History dialog
- The Preferences dialog
- Working with tracks and lanes
- Playback and the Transport panel
- Recording
- Quantizing MIDI and audio
- Fades, crossfades and envelopes
- The arranger track
- The transpose functions
- Using markers
- The Mixer
- Control Room (Cubase only)
- Audio effects
- VST instruments and instrument tracks
- Surround sound (Cubase only)
- Automation
- Audio processing and functions
- The Sample Editor
- The Audio Part Editor
- The Pool
- The MediaBay
- Introduction
- Working with the MediaBay
- The Define Locations section
- The Locations section
- The Results list
- Previewing files
- The Filters section
- The Attribute Inspector
- The Loop Browser, Sound Browser, and Mini Browser windows
- Preferences
- Key commands
- Working with MediaBay-related windows
- Working with Volume databases
- Working with track presets
- Track Quick Controls
- Remote controlling Cubase
- MIDI realtime parameters and effects
- Using MIDI devices
- MIDI processing
- The MIDI editors
- Introduction
- Opening a MIDI editor
- The Key Editor – Overview
- Key Editor operations
- The In-Place Editor
- The Drum Editor – Overview
- Drum Editor operations
- Working with drum maps
- Using drum name lists
- The List Editor – Overview
- List Editor operations
- Working with SysEx messages
- Recording SysEx parameter changes
- Editing SysEx messages
- The basic Score Editor – Overview
- Score Editor operations
- Expression maps (Cubase only)
- Note Expression (Cubase only)
- The Logical Editor, Transformer, and Input Transformer
- The Project Logical Editor (Cubase only)
- Editing tempo and signature
- The Project Browser (Cubase only)
- Export Audio Mixdown
- Synchronization
- Video
- ReWire
- File handling
- Customizing
- Key commands
- Part II: Score layout and printing (Cubase only)
- How the Score Editor works
- The basics
- About this chapter
- Preparations
- Opening the Score Editor
- The project cursor
- Playing back and recording
- Page Mode
- Changing the zoom factor
- The active staff
- Making page setup settings
- Designing your work space
- About the Score Editor context menus
- About dialogs in the Score Editor
- Setting clef, key, and time signature
- Transposing instruments
- Printing from the Score Editor
- Exporting pages as image files
- Working order
- Force update
- Transcribing MIDI recordings
- Entering and editing notes
- About this chapter
- Score settings
- Note values and positions
- Adding and editing notes
- Selecting notes
- Moving notes
- Duplicating notes
- Cut, copy, and paste
- Editing pitches of individual notes
- Changing the length of notes
- Splitting a note in two
- Working with the Display Quantize tool
- Split (piano) staves
- Strategies: Multiple staves
- Inserting and editing clefs, keys, or time signatures
- Deleting notes
- Staff settings
- Polyphonic voicing
- About this chapter
- Background: Polyphonic voicing
- Setting up the voices
- Strategies: How many voices do I need?
- Entering notes into voices
- Checking which voice a note belongs to
- Moving notes between voices
- Handling rests
- Voices and Display Quantize
- Creating crossed voicings
- Automatic polyphonic voicing – Merge All Staves
- Converting voices to tracks – Extract Voices
- Additional note and rest formatting
- Working with symbols
- Working with chords
- Working with text
- Working with layouts
- Working with MusicXML
- Designing your score: additional techniques
- Scoring for drums
- Creating tablature
- The score and MIDI playback
- Tips and Tricks
- Index
Synchronization
Background
What is synchronization?
Synchronization is the process of getting two or more devices to play back together at exactly the same speed and position. These devices can range from audio and video tape machines to digital audio workstations, MIDI sequencers, synchronization controllers, and digital video devices.
Synchronization basics
There are three basic components of audio/visual synchronization: position, speed, and phase. If these parameters are known for a particular device (the master), then a second device (the slave) can have its speed and position “resolved” to the first in order to have the two devices play in perfect sync with one another.
Position
The position of a device is represented by samples (audio word clock), video frames (timecode), or musical bars and beats (MIDI clock).
Speed
The speed of a device is measured by the frame rate of the timecode, by the sample rate (audio word clock), or by the tempo of the MIDI clock (bars and beats).
Phase
Phase is the alignment of the position and speed components to each other. In other words, each pulse of the speed component should be aligned with each measurement of the position for the most accuracy. Each frame of timecode should be perfectly lined up with the correct sample of audio. Put simply, phase is the very precise position of a synchronized device relative to the master (sample accuracy).
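To make the relationship between position, speed, and phase concrete, the following minimal sketch (illustrative only, not part of Cubase; it assumes non-drop-frame timecode and constant frame and sample rates) converts a timecode position into a total frame count and then into the audio sample that should coincide with the start of that frame:

# Minimal sketch, not Cubase code: assumes non-drop-frame timecode and
# constant frame and sample rates.

def timecode_to_frames(hours, minutes, seconds, frames, frame_rate):
    # Position component: total frame count since 00:00:00:00.
    return ((hours * 60 + minutes) * 60 + seconds) * frame_rate + frames

def frame_to_sample(frame_count, frame_rate, sample_rate):
    # Phase: the audio sample that should line up with the start of this frame.
    return round(frame_count * sample_rate / frame_rate)

# Example: timecode 01:00:00:12 at 25 fps, audio at 48 kHz.
frames = timecode_to_frames(1, 0, 0, 12, frame_rate=25)              # 90012 frames
sample = frame_to_sample(frames, frame_rate=25, sample_rate=48000)   # 172823040
print(frames, sample)

At 25 fps and 48 kHz every frame spans exactly 1920 samples; keeping each frame boundary on its correct sample is exactly the alignment that the phase component describes.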
Machine control
When two or more devices are synchronized, the question remains: how do we control the entire system? We need to be able to locate to any position, play, record, and even jog and scrub the entire system using one set of controls. Machine control is an integral part of any synchronization setup. In many cases, the device simply called “the master” will control the whole system. However, the term “master” can also refer to the device that is generating the position and speed references. Care must be taken to differentiate between the two.
Master and slave
Calling one device the “master” and another the “slave” can lead to a great deal of confusion. The timecode relationship and the machine control relationship must be differentiated in this regard.
In this document, the following terms are used:
• The “timecode master” is the device generating position information or timecode.
• The “timecode slave” is any device receiving the timecode and synchronizing or “locking” to it.
• The “machine control master” is the device that issues transport commands to the system.
• The “machine control slave” is the device receiving those commands and responding to them.
For example, Cubase could be the machine control master, sending transport commands to an external device which in turn sends timecode and audio clock information back to Cubase. In that case, Cubase would also be the timecode slave, so calling Cubase simply the master is misleading.
In most scenarios, the machine control slave is also the timecode master. Once it receives a play command, that device starts generating timecode for all the timecode slaves to synchronize to.
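To make the distinction concrete, here is a purely illustrative sketch (the device names and fields are hypothetical, not a Cubase API) that models the scenario described above and shows that the two master/slave relationships are independent:

# Illustrative sketch only: the two relationships are separate, so one
# device can be master in one and slave in the other.
from dataclasses import dataclass

@dataclass
class SyncDevice:
    name: str
    timecode_master: bool         # generates position/timecode for the system
    machine_control_master: bool  # issues transport commands to the system

# Cubase issues the transport commands but locks to the timecode coming
# back from the external device.
cubase = SyncDevice("Cubase", timecode_master=False, machine_control_master=True)
external = SyncDevice("External machine", timecode_master=True, machine_control_master=False)

for device in (cubase, external):
    roles = [
        "timecode master" if device.timecode_master else "timecode slave",
        "machine control master" if device.machine_control_master else "machine control slave",
    ]
    print(device.name + ": " + ", ".join(roles))

Running the sketch prints each device’s role in both relationships, which is why describing either device simply as “the master” is ambiguous.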
Timecode (positional references)
The position of any device is most often described using timecode. Timecode represents time using hours, minutes, seconds, and frames to provide a location for each device. Each frame represents a visual film or video frame.
Timecode can be communicated in several ways:
• LTC (Longitudinal Timecode) is an analog signal that can be recorded on tape. It should be used primarily for positional information. It can also be used for speed and phase information as a last resort, if no other clock source is available.