Specifications
Because the input attenuator has no effect on the actual noise generated in
the system, some early spectrum analyzers simply left the displayed noise
at the same position on the display regardless of the input attenuator
setting. That is, the IF gain remained constant. This being the case, the input
attenuator affected the location of a true input signal on the display. As input
attenuation was increased, further attenuating the input signal, the location
of the signal on the display went down while the noise remained stationary.
Beginning in the late 1970s, spectrum analyzer designers took a different
approach. In newer analyzers, an internal microprocessor changes the IF
gain to offset changes in the input attenuator. Thus, signals present at the
analyzer’s input remain stationary on the display as we change the input
attenuator, while the displayed noise moves up and down. In this case, the
reference level remains unchanged. This is shown in Figure 5-1. As the
attenuation increases from 5 to 15 to 25 dB, the displayed noise rises
while the –30 dBm signal remains constant. In either case, we get the best
signal-to-noise ratio by selecting minimum input attenuation.
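The behavior described above can be sketched in a few lines of code. This is an illustrative model only, with assumed level values rather than the specifications of any real analyzer: the internally generated noise is unaffected by the input attenuator, but the microprocessor raises the IF gain by the same amount as the attenuation, so the displayed signal stays put while the displayed noise rises dB for dB.

```python
def displayed_levels(signal_dbm, base_noise_dbm, attenuation_db):
    """Return (signal, noise) levels as shown on the display, in dBm.

    The input attenuator reduces the signal by attenuation_db, and the
    microprocessor adds the same amount of IF gain back, so the signal
    is restored to its input level. The analyzer's own noise, generated
    after the attenuator, sees only the added IF gain.
    """
    displayed_signal = (signal_dbm - attenuation_db) + attenuation_db
    displayed_noise = base_noise_dbm + attenuation_db
    return displayed_signal, displayed_noise

# Illustrative values: a -30 dBm input signal and an assumed
# -110 dBm displayed noise floor at 0 dB attenuation.
for atten in (5, 15, 25):
    sig, noise = displayed_levels(-30, -110, atten)
    print(f"{atten:2d} dB attenuation: signal {sig} dBm, noise {noise} dBm")
```

The loop mirrors Figure 5-1: as attenuation steps from 5 to 15 to 25 dB, the signal stays at -30 dBm while the noise rises in 10 dB steps, so minimum attenuation gives the best signal-to-noise ratio.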
Resolution bandwidth also affects signal-to-noise ratio, or sensitivity. The
noise generated in the analyzer is random and has a constant amplitude over
a wide frequency range. Since the resolution, or IF, bandwidth filters come
after the first gain stage, the total noise power that passes through the filters
is determined by the width of the filters. This noise signal is detected and
ultimately reaches the display. The random nature of the noise signal causes
the displayed level to vary as:
10 log (BW2/BW1)

where BW1 = starting resolution bandwidth
      BW2 = ending resolution bandwidth
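A short helper makes the relationship concrete (a sketch only; the function name is our own):

```python
import math

def noise_level_change_db(bw1_hz, bw2_hz):
    """Change in displayed noise level, in dB, when the resolution
    bandwidth changes from bw1_hz to bw2_hz: 10 log10(BW2/BW1)."""
    return 10 * math.log10(bw2_hz / bw1_hz)

# Narrowing the RBW from 100 kHz to 10 kHz lowers the displayed
# noise by 10 dB, improving sensitivity by the same amount.
print(noise_level_change_db(100_000, 10_000))  # → -10.0
```

Because the change goes as the logarithm of the bandwidth ratio, each factor-of-10 reduction in resolution bandwidth buys another 10 dB of sensitivity.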
Figure 5-1. Reference level remains constant when changing input attenuation