Specifications

The width of the resolution (IF) filter determines the maximum rate at which
the envelope of the IF signal can change. This bandwidth determines how far
apart two input sinusoids can be so that after the mixing process they will
both be within the filter at the same time. Let’s assume a 21.4 MHz final IF
and a 100 kHz bandwidth. Two input signals separated by 100 kHz would
produce mixing products of 21.35 and 21.45 MHz and would meet the
criterion. See Figure 2-16. The detector must be able to follow the changes in
the envelope created by these two signals but not the 21.4 MHz IF signal itself.
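The arithmetic of this criterion can be sketched in a few lines of Python; the frequencies and bandwidth are simply the values from the example above, and the "perfectly rectangular filter" assumption of the footnote applies.

```python
# Two input signals mixed down to a 21.4 MHz final IF with a 100 kHz
# resolution bandwidth (values from the example in the text).
if_center_hz = 21.4e6
rbw_hz = 100e3  # resolution (IF) filter bandwidth

# Input signals 100 kHz apart map to mixing products 100 kHz apart,
# centered on the IF when the analyzer is tuned midway between them.
separation_hz = 100e3
product_lo = if_center_hz - separation_hz / 2  # 21.35 MHz
product_hi = if_center_hz + separation_hz / 2  # 21.45 MHz

# Assuming a perfectly rectangular filter, both products just fit
# within the passband edges at the same time.
edge_lo = if_center_hz - rbw_hz / 2
edge_hi = if_center_hz + rbw_hz / 2
both_inside = edge_lo <= product_lo and product_hi <= edge_hi
print(product_lo, product_hi, both_inside)
```

Widening the separation beyond the 100 kHz bandwidth would push one product outside the passband, and the two signals would no longer be in the filter simultaneously.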
The envelope detector is what makes the spectrum analyzer a voltmeter.
Let’s duplicate the situation above and have two equal-amplitude signals in
the passband of the IF at the same time. A power meter would indicate a
power level 3 dB above either signal, that is, the total power of the two.
Assume that the two signals are close enough so that, with the analyzer
tuned halfway between them, there is negligible attenuation due to the
roll-off of the filter.8 Then the analyzer display will vary between a value
that is twice the voltage of either (6 dB greater) and zero (minus infinity
on the log scale). We must remember that the two signals are sine waves
(vectors) at different frequencies, and so they continually change in phase
with respect to each other. At some time they add exactly in phase; at
another, exactly out of phase.
So the envelope detector follows the changing amplitude values of the peaks
of the signal from the IF chain but not the instantaneous values, resulting
in the loss of phase information. This gives the analyzer its voltmeter
characteristics.
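This beat behavior can be illustrated numerically. The sketch below uses NumPy with illustrative frequencies (not taken from the text): two equal-amplitude 1 V sinusoids 1 kHz apart. The envelope of their sum peaks at twice the voltage of either signal (+6 dB) and dips to zero, while the average power a power meter would read is 3 dB above either signal alone.

```python
import numpy as np

fs = 1e6               # sample rate, Hz (illustrative)
f1, f2 = 100e3, 101e3  # two equal-amplitude signals 1 kHz apart
t = np.arange(0, 5e-3, 1 / fs)  # 5 ms covers five beat cycles

v = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)

# By the sum-to-product identity, the envelope of v beats at the
# difference frequency: |2 cos(pi*(f1 - f2)*t)|.
envelope = np.abs(2 * np.cos(np.pi * (f1 - f2) * t))
print(envelope.max())   # ~2.0 -> 20*log10(2) = +6 dB relative to one signal
print(envelope.min())   # ~0   -> minus infinity on the log scale

# A power meter, by contrast, reads total average power: each 1 V-peak
# signal has average power 0.5, so the sum is 3 dB above either alone.
total_power = np.mean(v ** 2)
print(10 * np.log10(total_power / 0.5))  # ~3 dB
```

The phase information lost by the envelope detector is exactly what makes the displayed value swing between these extremes rather than sitting at the steady 3 dB sum a power meter reports.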
Digitally implemented resolution bandwidths do not have an analog envelope
detector. Instead, the digital processing computes the root sum of the squares
of the I and Q data, which is mathematically equivalent to an envelope
detector. For more information on digital architecture, refer to Chapter 3.
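As a minimal sketch of that equivalence, the code below applies the root-sum-of-squares operation to complex baseband (I/Q) samples of an amplitude-modulated signal. The sample rate, modulation frequency, and carrier offset are illustrative assumptions, not values from the text.

```python
import numpy as np

fs = 1e6                        # sample rate, Hz (illustrative)
t = np.arange(0, 1e-3, 1 / fs)

# An amplitude-modulated signal mixed down to complex baseband (I + jQ):
# a 1 kHz envelope riding on a residual 10 kHz carrier offset.
am_envelope = 1 + 0.5 * np.cos(2 * np.pi * 1e3 * t)
offset_hz = 10e3
iq = am_envelope * np.exp(2j * np.pi * offset_hz * t)

i_data, q_data = iq.real, iq.imag

# Root sum of the squares of I and Q: mathematically an envelope detector.
envelope = np.sqrt(i_data ** 2 + q_data ** 2)

# The result tracks the modulation envelope regardless of the carrier
# offset; the phase information in the I/Q pair is discarded, just as
# with the analog envelope detector.
print(np.allclose(envelope, am_envelope))  # True
```

Note that `np.sqrt(i**2 + q**2)` is simply the magnitude `np.abs(iq)`; the phase, `np.angle(iq)`, plays no part in the displayed amplitude.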
Displays
Up until the mid-1970s, spectrum analyzers were purely analog. The
displayed trace presented a continuous indication of the signal envelope,
and no information was lost. However, analog displays had drawbacks. The
major problem was in handling the long sweep times required for narrow
resolution bandwidths. In the extreme case, the display became a spot
that moved slowly across the cathode ray tube (CRT), with no real trace
on the display. So a meaningful display was not possible with the longer
sweep times.
Agilent Technologies (part of Hewlett-Packard at the time) pioneered a
variable-persistence storage CRT in which we could adjust the fade rate of
the display. When properly adjusted, the old trace would just fade out at
the point where the new trace was updating the display. This display was
continuous, had no flicker, and avoided confusing overwrites. It worked quite
well, but the intensity and the fade rate had to be readjusted for each new
measurement situation. When digital circuitry became affordable in the
mid-1970s, it was quickly put to use in spectrum analyzers. Once a trace had
been digitized and put into memory, it was permanently available for display.
It became an easy matter to update the display at a flicker-free rate without
blooming or fading. The data in memory was updated at the sweep rate, and
since the contents of memory were written to the display at a flicker-free
rate, we could follow the updating as the analyzer swept through its selected
frequency span just as we could with analog systems.
8. For this discussion, we assume that the filter is perfectly rectangular.