Glossary
AES/EBU – The digital audio standard defined as a joint effort of the Audio Engineering Society and the European Broadcasting Union. AES/EBU, or AES3, describes a serial bitstream that carries two audio channels, thus an AES stream is a stereo pair. The AES/EBU standard covers a wide range of sample rates and quantizations (bit depths). In television systems, these will generally be 48 kHz and either 20 or 24 bits.
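As a quick illustration (ours, not from this manual) of what these numbers imply, the serial rate of an AES3 stream follows directly from its framing; the 32-bit subframe size below is the standard AES3 structure, and the function name is a placeholder:

    def aes3_bit_rate(sample_rate_hz=48_000, subframe_bits=32, channels=2):
        # Each sample period carries one frame: one 32-bit subframe per
        # channel, holding up to 24 audio bits plus preamble, auxiliary,
        # validity, user, channel-status, and parity bits.
        return sample_rate_hz * subframe_bits * channels

    print(aes3_bit_rate())   # 3072000 b/s, i.e. 3.072 Mb/s at 48 kHz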
AFD – Active Format Description is a method to carry
information regarding the aspect ratio of the video content.
The specication of AFD was standardized by SMPTE in
2007 and is now beginning to appear in the marketplace.
AFD can be included in both SD and HD SDI transport
systems. There is no legacy analog implementation. (See
WSS).
ASI – A commonly used transport method for MPEG-2 video
streams, ASI, or Asynchronous Serial Interface, operates
at the same 270 Mb/s data rate as SD SDI. This makes
it easy to carry an ASI stream through existing digital
television infrastructure. Known more formally as DVB-ASI,
this transport mechanism can be used to carry multiple
program channels.
Bandwidth – Strictly speaking, this refers to the range of
frequencies (i.e. the width of the band of frequency)
used by a signal, or carried by a transmission channel.
Generally, wider bandwidth will carry and reproduce a
signal with greater fidelity and accuracy.
Beta – Sony Beta SP video tape machines use an analog
component format that is similar to SMPTE, but differs in the amplitude of the color difference signals. It may also
carry setup on the luminance channel.
Bit – A binary digit, or bit, is the smallest amount of
information that can be stored or transmitted digitally by
electrical, optical, magnetic, or other means. A single bit
can take on one of two states: On/Off, Low/High, Asserted/Deasserted, etc. It is represented numerically by the numerals 1 (one) and 0 (zero). A byte, containing 8 bits, can represent 256 different states. The binary number
11010111, for example, has the value of 215 in our base
10 numbering system. When a value is carried digitally,
each additional bit of resolution will double the number
of dierent states that can be represented. Systems that
operate with a greater number of bits of resolution, or
quantization, will be able to capture a signal with more
detail or delity. Thus, a video digitizer with 12 bits of
resolution will capture 4 times as much detail as one with
10 bits.
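A few lines of Python (ours, added for illustration) confirm the arithmetic in this entry:

    assert int("11010111", 2) == 215    # binary 11010111 is 215 in base 10
    assert 2 ** 8 == 256                # a byte (8 bits) has 256 states
    assert 2 ** 12 // 2 ** 10 == 4      # 12 bits vs. 10 bits: 4x the levels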
Blanking – The Horizontal and Vertical blanking intervals of
a television signal refer to the time periods between lines
and between elds. No picture information is transmitted
during these times, which are required in CRT displays
to allow the electron beam to be repositioned for the
start of the next line or field. They are also used to carry
synchronizing pulses which are used in transmission and
recovery of the image. Although some of these needs are
disappearing, the intervals themselves are retained for
compatibility purposes. They have turned out to be very
useful for the transmission of additional content, such as
teletext and embedded audio.
CAV – Component Analog Video. This is a convenient
shorthand form, but it is subject to confusion. It is
sometimes used to mean ONLY color difference component
formats (SMPTE or Beta), and other times to include RGB
format. In any case, a CAV signal will always require 3
connectors – either Y/R-Y/B-Y, or R/G/B.
Checkeld – A Checkeld signal is a special test signal that
stresses particular aspects of serial digital transmission.
The performance of the Phase Locked-Loops (PLLs) in an
SDI receiver must be able to tolerate long runs of 0’s and
1s. Under normal conditions, only very short runs of these
are produced due to a scrambling algorithm that is used.
The Checkeld, also referred to as the Pathological test
signal, will “undo” the scrambling and cause extremely
long runs to occur. This test signal is very useful for
testing transmission paths.
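To see why scrambling normally prevents long runs, here is a minimal sketch (ours, not from this manual) of the two coding stages used by SD SDI, assuming the standard generator polynomials x^9 + x^4 + 1 (scrambler) and x + 1 (NRZI):

    def sdi_scramble(bits):
        # Self-synchronizing scrambler, generator x^9 + x^4 + 1:
        # each output bit is the input XORed with the scrambled bits
        # transmitted 4 and 9 positions earlier.
        state = [0] * 9          # shift register of previous outputs
        out = []
        for b in bits:
            s = b ^ state[3] ^ state[8]
            out.append(s)
            state = [s] + state[:-1]
        return out

    def nrzi_encode(bits):
        # NRZI stage (x + 1): a 1 toggles the line level, a 0 holds it.
        level, out = 0, []
        for b in bits:
            level ^= b
            out.append(level)
        return out

Random input data leaves these stages with only short runs of identical bits; the pathological words standardized in SMPTE RP 178 instead steer the shift register into states that emit runs of roughly twenty consecutive 0s or 1s, which is exactly what stresses the receiver's equalizer and PLL.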
Chroma – The color or chroma content of a signal, consisting
of the hue and saturation of the image. See also Color
Dierence.
Component – In a component video system, the totality
of the image is carried by three separate but related
components. This method provides the best image fidelity
with the fewest artifacts, but it requires three independent
transmission paths (cables). The commonly used
component formats are Luminance and Color Difference
(Y/Pr/Pb), and RGB. It was far too unwieldy in the early
days of color television to even consider component
transmission.
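As an illustration of the color difference form, here is a small sketch (ours) of the RGB to Y/Pb/Pr conversion, assuming the Rec. 601 standard-definition weighting coefficients, since this glossary does not name a standard:

    def rgb_to_ypbpr(r, g, b):
        # Luminance is a weighted sum of R, G, and B; the two color
        # difference signals are scaled versions of B-Y and R-Y.
        y = 0.299 * r + 0.587 * g + 0.114 * b
        pb = 0.564 * (b - y)     # 0.5 / (1 - 0.114)
        pr = 0.713 * (r - y)     # 0.5 / (1 - 0.299)
        return y, pb, pr

    # White carries no color difference: (1.0, 0.0, 0.0)
    print(rgb_to_ypbpr(1.0, 1.0, 1.0))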
Composite – Composite television dates back to the early
days of color transmission. This scheme encodes the color difference information onto a color subcarrier. The instantaneous phase of the subcarrier is the color's hue, and the amplitude is the color's saturation or intensity.
This subcarrier is then added onto the existing luminance
video signal. This trick works because the subcarrier is
set at a high enough frequency to leave spectrum for the
luminance information. But it is not a seamless matter
to pull the signal apart again at the destination in order
to display it or process it. The resulting dot crawl artifacts (also referred to as chroma crawl) are only the most obvious impairment. Composite television is the most commonly
used format throughout the world, either as PAL or NTSC. It
is also referred to as Encoded video.
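The phase/amplitude encoding described above can be sketched in a few lines (ours, added for illustration; real NTSC and PAL encoders modulate two quadrature color difference axes rather than this simplified polar form, and the NTSC subcarrier frequency is used as the default):

    import math

    def composite_sample(luma, saturation, hue_rad, t, f_sc=3_579_545.0):
        # The subcarrier's phase carries the hue and its amplitude the
        # saturation; the chroma is then simply added to the luminance.
        chroma = saturation * math.sin(2 * math.pi * f_sc * t + hue_rad)
        return luma + chroma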