2638A
Users Manual
Measurement Specifications
Accuracy specifications are generally valid for 6 ½ digit resolution mode (unless otherwise noted), for the front panel
input (Channel 001), after a minimum 1-hour warm-up, and within an ambient temperature range of 18 °C to 28 °C.
24-hour specifications are relative to calibration standards and assume a controlled electromagnetic environment per EN
61326. The confidence level for accuracy specifications is 99 % within 1 year of calibration (unless otherwise noted).
Scan rate (typical, depending on function and range)
Fast ..................................................................... 46 channels per second max (0.02 seconds per channel)
Medium ............................................................... 10 channels per second (0.1 seconds per channel)
Slow .................................................................... 2 channels per second (0.5 seconds per channel)
Display Resolution ................................................ 4 ½ to 6 ½ digits, depending on Sample Rate or NPLC
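The per-channel times above determine how long one pass over a channel list takes. A minimal sketch of that arithmetic, using the typical figures quoted above (actual timing depends on function and range; the function name is illustrative, not part of any instrument API):

```python
# Typical per-channel scan times from the table above.
SECONDS_PER_CHANNEL = {"fast": 0.02, "medium": 0.1, "slow": 0.5}

def scan_time(num_channels: int, rate: str) -> float:
    """Approximate time in seconds for one pass over the channel list."""
    return num_channels * SECONDS_PER_CHANNEL[rate]

# Example: one pass over 20 channels at the Medium rate takes about 2 s.
print(scan_time(20, "medium"))
```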
DC Voltage
Maximum Input ...................................................... 300 V on any range
Common Mode Rejection ..................................... 140 dB at 50 Hz or 60 Hz (1 kΩ unbalance for NPLC of 1 or greater,
±500 V peak maximum in the low lead)
Normal Mode Rejection ........................................ 55 dB for NPLC of 1 or greater and power-line frequency ±0.1 %,
±20 % of range peak maximum
Measurement Method ........................................... Multi-ramp A/D
A/D Linearity .......................................................... 2 ppm of measurement + 1 ppm of range
Input Bias Current ................................................. <30 pA at 25 °C
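A rejection figure in dB translates directly into a residual error on the reading: the interfering voltage is attenuated by a factor of 10^(dB/20). A short sketch of that conversion (illustrative only, not an instrument function):

```python
def residual_error(v_interference: float, rejection_db: float) -> float:
    """Voltage error left after the stated rejection is applied.

    140 dB corresponds to an attenuation factor of 10**(140/20) = 10**7.
    """
    return v_interference / 10 ** (rejection_db / 20)

# Example: 1 V of 50/60 Hz common-mode voltage with 140 dB of rejection
# leaves roughly 0.1 uV of error on the reading.
print(residual_error(1.0, 140))
```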
DC Voltage Input Characteristics

Range     Full Scale      Resolution                            Input Impedance
                          Fast        Medium      Slow
                          4½ Digits   5½ Digits   6½ Digits
100 mV    100.0000 mV     10 µV       1 µV        0.1 µV        10 MΩ or >10 GΩ [1]
1 V       1.000000 V      100 µV      10 µV       1 µV          10 MΩ or >10 GΩ [1]
10 V      10.00000 V      1 mV        100 µV      10 µV         10 MΩ or >10 GΩ [1]
100 V     100.0000 V      10 mV       1 mV        100 µV        10 MΩ ±1 %
300 V     300.000 V       100 mV      10 mV       1 mV          10 MΩ ±1 %

Note:
[1] Input beyond ±12 V is clamped. The clamp current is up to 3 mA. 10 MΩ is the default input impedance.
DC Voltage Accuracy
Accuracy is given as ± (% of measurement + % of range).

Range     24 Hour               90 Days               1 Year                T.C./°C Outside
          (23 ±1 °C)            (23 ±5 °C)            (23 ±5 °C)            18 °C to 28 °C
100 mV    0.0025 % + 0.003 %    0.0025 % + 0.0035 %   0.0037 % + 0.0035 %   0.0005 % + 0.0005 %
1 V       0.0018 % + 0.0006 %   0.0018 % + 0.0007 %   0.0025 % + 0.0007 %   0.0005 % + 0.0001 %
10 V      0.0013 % + 0.0004 %   0.0018 % + 0.0005 %   0.0024 % + 0.0005 %   0.0005 % + 0.0001 %
100 V     0.0018 % + 0.0006 %   0.0027 % + 0.0006 %   0.0038 % + 0.0006 %   0.0005 % + 0.0001 %
300 V     0.0018 % + 0.002 %    0.0031 % + 0.002 %    0.0041 % + 0.002 %    0.0005 % + 0.0003 %
Notes:
For conducted disturbances on mains input >1 V from 10 MHz to 20 MHz, add 0.02 % of range. For disturbances
>3 V, accuracy is unspecified.
For radiated disturbances >1 V/m from 450 MHz to 550 MHz, add 0.02 % of range. For disturbances >3 V/m,
accuracy is unspecified.
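A worked example of the accuracy formula, using the 1-year specification for the 10 V range from the table above, with an optional temperature-coefficient term for operation outside 18 °C to 28 °C (the function and parameter names are illustrative assumptions, not part of the instrument's interface):

```python
def dcv_uncertainty(reading, rng, pct_meas, pct_range,
                    tc_meas=0.0, tc_range=0.0, deg_outside=0.0):
    """Return the +/- uncertainty in volts for a DC voltage reading.

    pct_meas / pct_range: accuracy terms as % of measurement / % of range.
    tc_meas / tc_range:   T.C. terms per degree C, applied for each degree
                          outside the 18 C to 28 C band (deg_outside).
    """
    base = reading * pct_meas / 100 + rng * pct_range / 100
    tc = deg_outside * (reading * tc_meas / 100 + rng * tc_range / 100)
    return base + tc

# A 5 V reading on the 10 V range, 1-year spec, inside 18 C to 28 C:
# +/-(0.0024 % of 5 V + 0.0005 % of 10 V) = +/-(120 uV + 50 uV) = +/-170 uV
print(dcv_uncertainty(5.0, 10.0, 0.0024, 0.0005))
```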