Specifications

www.ensembledesigns.com tel +1 530.478.1830
point uses three at contact blades in a triangular
arrangement, set in a rectangular connector. The IEC
specication does not dictate line voltage or frequency.
Therefore, the user must take care to verify that a device
either has a universal input (capable of 90 to 230 volts,
either 50 or 60 Hz), or that a line voltage switch, if present,
is set correctly.
Interlace – Human vision can be fooled into seeing motion by
presenting a series of images, each with a small change
relative to the previous image. To eliminate flicker, our
eyes need to see more than 30 images per second. This is
accomplished in television systems by dividing the lines
that make up each video frame (which run at 25 or 30
frames per second) into two fields. All of the odd-numbered
lines are transmitted in the first field, and the
even-numbered lines in the second field. In this way, the
repetition rate is 50 or 60 Hz without using more
bandwidth. This trick has worked well for years, but it
introduces other temporal artifacts. Motion pictures use
a slightly different technique to raise the repetition rate
from the original 24 frames that make up each second of
film: they simply project each frame twice.
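The odd/even line split described above can be sketched in a few lines of Python. This is a toy illustration only; the function name and the 1-based line labels are ours, not part of any television standard.

```python
# Illustrative sketch of interlace: split one frame's lines into
# two fields. Lines are labeled 1-based, as in the glossary text.

def split_into_fields(frame_lines):
    """Return (first_field, second_field) for one interlaced frame."""
    first_field = frame_lines[0::2]   # odd-numbered lines (1, 3, 5, ...)
    second_field = frame_lines[1::2]  # even-numbered lines (2, 4, 6, ...)
    return first_field, second_field

frame = [f"line {n}" for n in range(1, 9)]  # a tiny 8-line "frame"
f1, f2 = split_into_fields(frame)
print(f1)  # ['line 1', 'line 3', 'line 5', 'line 7']
print(f2)  # ['line 2', 'line 4', 'line 6', 'line 8']
```

Transmitting the two fields in sequence doubles the repetition rate (25 or 30 frames per second becomes 50 or 60 fields per second) without sending any extra lines.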
IRE - Video level is measured on the IRE scale, where 0 IRE is
black, and 100 IRE is full white. The actual voltages that
these levels correspond to can vary between formats.
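As a sketch of how IRE maps to voltage in one common case (the scaling is an assumption for illustration, since, as noted above, it varies between formats): in NTSC the 140 IRE span from sync tip (−40 IRE) to peak white (100 IRE) occupies 1 volt, so one IRE is 1000/140 ≈ 7.14 mV.

```python
# Hedged sketch: convert IRE to millivolts, assuming the common
# NTSC scaling of 140 IRE = 1000 mV (sync tip at -40 IRE, peak
# white at +100 IRE). Other formats use different scalings.

MV_PER_IRE_NTSC = 1000 / 140  # about 7.14 mV per IRE

def ire_to_mv(ire: float) -> float:
    """Convert an IRE level to millivolts under NTSC scaling."""
    return ire * MV_PER_IRE_NTSC

print(round(ire_to_mv(100), 1))  # peak white: 714.3 mV
print(ire_to_mv(0))              # black: 0.0 mV
```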
ITU-R 601 - This is the principal standard for standard
definition component digital video. It defines the
luminance and color difference coding system that is also
referred to as 4:2:2. The standard applies to both PAL and
NTSC derived signals. Both result in an image that contains
720 pixels horizontally, with 486 vertical pixels in NTSC
and 576 in PAL. Both systems use a sample clock rate of
27 MHz, and are serialized at 270 Mb/s.
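The 270 Mb/s figure follows directly from the sampling structure. As a quick arithmetic check (the 13.5 MHz luminance rate and 10-bit word size are standard ITU-R 601 figures, not stated in the entry above):

```python
# Sanity check of the ITU-R 601 rates: 4:2:2 interleaves one
# luminance sample with two half-rate color-difference samples.
y_rate = 13.5e6                  # luminance (Y) samples per second
c_rate = 6.75e6                  # each color-difference channel (Cb, Cr)
word_rate = y_rate + 2 * c_rate  # total 4:2:2 word rate
bits_per_word = 10
serial_rate = word_rate * bits_per_word

print(word_rate)    # 27000000.0  -> the 27 MHz sample clock
print(serial_rate)  # 270000000.0 -> the 270 Mb/s serial rate
```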
Jitter - Serial digital signals (either video or audio) are subject
to the effects of jitter. This refers to the instantaneous
error that can occur from one bit to the next in the exact
position of each digital transition. Although the signal may
be at the correct frequency on average, in the interim it
varies. Some bits come slightly early, others come slightly
late. The measurement of this jitter is given either as the
amount of time uncertainty or as the fraction of a bit
width. For 270 Mb/s SD video, the allowable jitter is 740
picoseconds, or 0.2 UI (Unit Interval – one bit width). For
1.485 Gb/s HD, the same 0.2 UI spec corresponds to just
135 picoseconds.
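Working the numbers in the entry above (a quick sanity check, using nothing beyond the quoted bit rates): one Unit Interval is simply the reciprocal of the bit rate.

```python
# Reproduce the glossary's jitter tolerances: 0.2 UI at a given
# serial bit rate, where one UI (Unit Interval) = 1 / bit_rate.

def jitter_ps(bit_rate_hz: float, ui_fraction: float = 0.2) -> float:
    """Jitter tolerance in picoseconds for a given serial bit rate."""
    ui_seconds = 1.0 / bit_rate_hz       # one bit width, in seconds
    return ui_fraction * ui_seconds * 1e12

print(round(jitter_ps(270e6)))    # SD: 741 ps (quoted as 740 ps)
print(round(jitter_ps(1.485e9)))  # HD: 135 ps
```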
Luminance - The “black & white” content of the image.
Human vision has more acuity in luminance, so television
systems generally devote more bandwidth to the
luminance content. In component systems, the luminance
is referred to as Y.
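For standard definition component systems, Y is a weighted sum of the gamma-corrected R', G', and B' signals. The coefficients below are the standard Rec. 601 weights, shown for illustration; the glossary entry itself does not give them.

```python
# Hedged sketch: the SD (Rec. 601) luminance weighting.
# Inputs are gamma-corrected R', G', B' values in [0, 1].

def luma_601(r: float, g: float, b: float) -> float:
    """Luminance (Y) from R', G', B' using Rec. 601 coefficients."""
    return 0.299 * r + 0.587 * g + 0.114 * b

print(round(luma_601(1.0, 1.0, 1.0), 3))  # white -> 1.0
print(luma_601(0.0, 1.0, 0.0))            # green carries most of Y
```

Note how green dominates the sum; this matches the eye's greater sensitivity to the middle of the visible spectrum.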
MPEG – The Moving Picture Experts Group is an industry
group that develops standards for the compression of
moving pictures for television. Their work is an ongoing
effort. The understanding of image processing and
information theory is constantly expanding, and the raw
bandwidth of both the hardware and software used for
this work is ever increasing. Accordingly, the compression
methods available today are far superior to the algorithms
that originally made the real-time compression and
decompression of television possible. Today, there are
many variations of these techniques, and the term MPEG
has to some extent become a broad generic label.
Metadata – This word comes from the Greek: meta means
'beyond' or 'after'. When used as a prefix to 'data', it can
be thought of as 'data about the data'. In other words, the
metadata in a data stream tells you about that data – but
it is not the data itself. In the television industry, this word
is sometimes used correctly when, for example, we label
as metadata the timecode which accompanies a video
signal. That timecode tells you something about the video,
i.e. when it was shot, but the timecode in and of itself is
of no interest. But in our industry's usual slovenly way
in matters linguistic, the term metadata has also come
to be used to describe data that is associated with the
primary video in a datastream. So embedded audio will
(incorrectly) be called metadata when it tells us nothing
at all about the pictures. Oh well.
Multi-mode – Multi-mode fibers have a larger diameter
core than single mode fibers (either 50 or 62.5 microns
compared to 9 microns), and a correspondingly larger
aperture. It is much easier to couple light energy into
a multi-mode fiber, but internal reflections will cause
multiple “modes” of the signal to propagate down the fiber.
This will degrade the ability of the fiber to be used over
long distances. See also Single Mode.
NTSC – The color television encoding system used in North
America was originally defined by the National Television
System Committee. This American standard has also
been adopted by Canada, Mexico, Japan, Korea, and
Taiwan. (This standard is referred to disparagingly as
Never Twice Same Color.)
Optical – An optical interface between two devices carries
data by modulating a light source. This light source is
typically a laser or laser diode (similar to an LED) which is
turned on and off at the bit rate of the datastream.
The light is carried from one device to another through a glass
fiber. The fiber’s core acts as a waveguide or lightpipe to
carry the light energy from one end to another. Optical
transmission has two very significant advantages over
metallic copper cables. First, it does not require that the
two endpoint devices have any electrical connection
to each other. This can be very advantageous in large
facilities where problems with ground loops appear.
Second, and most important, an optical interface can
carry a signal for many kilometers or miles without any
degradation or loss in the recovered signal. Copper is
barely useful at distances of just 1000 feet.
Oversampling – A technique to perform digital sampling
at a multiple of the required sample rate. This has the
advantage of raising the Nyquist Rate (the maximum
frequency that can be reproduced by a given sample rate)