Chapter 3: Testing Performance
Time Scale Accuracy (TSA) Test
Connections
Connect the equipment as shown in the following figure.
Procedure
1 Configure the sine wave source to output a 0 dBm (600 mVpp) sine wave into 50 ohms
with a frequency of 10.00002000 MHz.
2 Adjust the source amplitude so that the displayed sine wave is 600 mVpp.
3 Press [Default Setup] on the oscilloscope.
4 Set channel 1's vertical scale to 100 mV/div.
5 Set the oscilloscope sample rate to 100 kSa/s.
6 Set the oscilloscope's horizontal scale to 20 ms/div.
7 Set the measurement thresholds for all waveforms to a fixed voltage level of 0 V with
±20 mV hysteresis. To do this, go to Measure > Thresholds in the top menu, select
Custom: level +/- Hysteresis, then enter 20 mV in the Hysteresis field and 0 V in the
Threshold Level field.
8 Enable a frequency measurement on channel 1.
9 On the oscilloscope, press [Stop].
10 Press [Clear Display].
11 Press [Run], wait until 10 acquisitions have accumulated, and then press [Stop]. (A
remote-programming sketch of steps 3 through 11 follows this procedure.)
12 Convert the average frequency value to time scale error (in ppm) by subtracting 20 Hz
and dividing by 10 Hz/ppm. (A worked sketch of this conversion and the test-limit
calculation follows this procedure.)
13 Record the time since calibration (in years) in the table. The calibration date can be
found in the Calibration menu window.
14 Calculate the test limits using the following formula and record them in the table.
Test Limits = ±(0.100 + 0.100 × Years Since Calibration)
15 Record the results in the Performance Test Record.
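The oscilloscope configuration in steps 3 through 11 can also be driven over the remote
interface. The following is a minimal sketch in Python using PyVISA, assuming a
SCPI-capable oscilloscope; the VISA address and the exact SCPI command strings are
illustrative assumptions and vary by model, so substitute the commands documented in
your oscilloscope's programmer's guide.

    import time
    import pyvisa

    # Assumed VISA address; replace with your oscilloscope's address.
    rm = pyvisa.ResourceManager()
    scope = rm.open_resource("TCPIP0::192.168.1.100::inst0::INSTR")

    scope.write("*RST")                   # step 3: default setup
    scope.write(":CHANnel1:SCALe 0.1")    # step 4: 100 mV/div (command form assumed)
    scope.write(":ACQuire:SRATe 100e3")   # step 5: 100 kSa/s (not directly settable on all models)
    scope.write(":TIMebase:SCALe 0.02")   # step 6: 20 ms/div (command form assumed)
    # Steps 7 and 8 (0 V threshold with +/-20 mV hysteresis, frequency measurement)
    # use model-specific commands; see the programmer's guide for your oscilloscope.
    scope.write(":MEASure:FREQuency CHANnel1")   # step 8 (assumed form)
    scope.write(":STOP")                  # step 9
    scope.write(":CDISplay")              # step 10: clear display (assumed mnemonic)
    scope.write(":RUN")                   # step 11: accumulate acquisitions
    time.sleep(5)                         # ~10 screens at 10 div x 20 ms/div = 200 ms each, plus margin
    scope.write(":STOP")

    # Read back the averaged frequency measurement for use in step 12.
    freq_hz = float(scope.query(":MEASure:FREQuency? CHANnel1"))
    print(f"Measured frequency: {freq_hz:.3f} Hz")
    scope.close()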
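The arithmetic in steps 12 and 14 can be checked with the short Python sketch below. The
averaged frequency reading and the time since calibration are assumed example values,
and the test limits are treated as ppm, consistent with the time scale error computed in
step 12.

    def time_scale_error_ppm(avg_freq_hz):
        # Step 12: the 10.00002000 MHz input aliases to a nominal 20 Hz at
        # 100 kSa/s, and a 1 ppm timebase error shifts the reading by 10 Hz.
        return (avg_freq_hz - 20.0) / 10.0

    def test_limit_ppm(years_since_cal):
        # Step 14: Test Limits = +/-(0.100 + 0.100 x Years Since Calibration)
        return 0.100 + 0.100 * years_since_cal

    # Example with assumed values: 20.6 Hz average reading, 1.5 years since calibration.
    error = time_scale_error_ppm(20.6)   # 0.06 ppm
    limit = test_limit_ppm(1.5)          # 0.25 ppm
    print(f"Time scale error: {error:+.3f} ppm, limits: +/-{limit:.3f} ppm")
    print("PASS" if abs(error) <= limit else "FAIL")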