Agilent 34410A/11A 6 ½ Digit Multimeter (includes the L4411A 1U DMM) Service Guide Agilent Technologies
Notices © Agilent Technologies, Inc. 2005–2007. No part of this manual may be reproduced in any form or by any means (including electronic storage and retrieval or translation into a foreign language) without prior agreement and written consent from Agilent Technologies, Inc. as governed by United States and international copyright laws. The material contained in this document is provided “as is,” and is subject to being changed, without notice, in future editions.
Safety Information Do not defeat the power cord safety ground feature. Plug into a grounded (earthed) outlet. Do not use the product in any manner not specified by the manufacturer. Do not install substitute parts or perform any unauthorized modification to the product. Return the product to an Agilent Technologies Sales and Service Office for service and repair to ensure that safety features are maintained.
Protection Limits The Agilent 34410A/11A Digital Multimeter provides protection circuitry to prevent damage to the instrument and to protect against the danger of electric shock, provided the Protection Limits are not exceeded. To ensure safe operation of the instrument, do not exceed the Protection Limits shown on the front and rear panel, and defined below: Input Terminal Protection Limits Protection Limits are defined for the input terminals: Main Input (HI and LO) Terminals.
Additional Notices Waste Electrical and Electronic Equipment (WEEE) Directive 2002/96/EC This product complies with the WEEE Directive (2002/96/EC) marking requirement. The affixed product label (see below) indicates that you must not discard this electrical/electronic product in domestic household waste. Product Category: With reference to the equipment types in the WEEE directive Annex 1, this product is classified as a "Monitoring and Control instrumentation" product.
DECLARATION OF CONFORMITY According to EN ISO/IEC 17050-1:2004
Manufacturer’s Name: Agilent Technologies, Incorporated
Manufacturer’s Address: 900 South Taft Ave, Loveland, CO 80537 USA
Declares under sole responsibility that the product as originally delivered
Product Name: 6½ Digit Multimeter
Model Number: 34410A, 34411A, L4411A
Product Options: This declaration covers all options of the above product(s)
complies with the essential requirements of the following applicable European Directives, and carries the CE marking accordingly.
Agilent 34410A/11A/L4411A at a Glance
The Agilent 34410A, 34411A, or L4411A multimeter provides 6½-digit, high-performance dc and ac measurements.
• Voltage and Current Measurements. DC and AC (true-rms).
• Resistance Measurements. 2-wire and 4-wire.
• Continuity and Diode Testing.
• Frequency and Period Measurements.
• Capacitance Measurements.
• Temperature Measurements. Thermistor and RTD.
• Auto and Manual Ranging.
• Math Features. Null, dB, dBm, limits, and statistics.
• Data Logging.
The Front Panel at a Glance
1 On/Off Switch
2 Measurement Function Keys
3 Configuration Key
4 Second Display Key (Reset)
5 Null Key (Math Functions)
6 Data Logger Key (Utility)
7 Trigger Key (Auto Trig)
8 Exit Key (Auto Range)
9 Shift Key (Local)
10 Menu Navigation Keypad (Range)
11 Front/Rear Switch
12 HI and LO Sense Terminals (4-wire measurements)
13 HI and LO Input Terminals (all functions except current)
14 Current Input Terminal (ac and dc current)
WARNING Front/Rear Switch: Do not change the position of the Front/Rear switch with signals present on the terminals.
The Rear Panel at a Glance
1 Current Input Fuse (front and rear)
2 HI and LO Sense Terminals (4-wire resistance and temperature)
3 HI and LO Input Terminals (voltage, resistance, and other functions)
4 Current Input Terminal (ac current and dc current only)
5 External Trigger Input (BNC)
6 Voltmeter Complete Output (BNC)
7 LAN Interface Connector
8 USB Interface Connector
9 GPIB Interface Connector
10 Chassis Ground
11 Power-Line Voltage Setting
12 Power-Line Fuse-Holder Assembly
WARNING For protection from electric shock, the power cord ground must not be defeated.
The Display at a Glance
Alphanumeric Displays:
1 Primary display line
2 Secondary display line
Annunciators:
3 * (measurement in progress)
4 Hi-Z (high input impedance, Vdc only)
5 OComp (offset compensation)
6 ManRng (manual ranging)
7 Trig (wait-for-trigger state)
8 Hold (reading hold)
9 Remote (remote interface operation)
10 Error (error in queue)
11 Null (null function enabled)
12 Shift (shift key just pressed)
13 Math (dB or dBm function enabled)
14 Stats (statistics functions enabled)
The L4411A at a Glance
L4411A Option 1 - Front Panel Measurement Terminals
1 On/Stand-By button
2 Input measurement terminals - rear panel or front panel (optional)
3 Input current protection fuse (Agilent p/n 2110-0780)
4 External trigger input - BNC
5 Voltmeter (measurement) complete output - BNC
6 GPIB interface connector
7 LAN Reset - resets the L4411A LAN configuration to its factory default settings
8 LAN Interface connector - non Auto-MDIX; may require crossover cable
In This Guide…
1 Specifications
This chapter lists the multimeter’s specifications and describes how to interpret these specifications.
2 Quick Start
This chapter prepares the multimeter for use and helps you get familiar with a few of the front panel features.
3 Calibration Procedures
This chapter provides calibration, verification, and adjustment procedures for the multimeter.
Contents
1 Specifications 17
DC Characteristics 19
AC Characteristics 22
Frequency and Period Characteristics 24
Capacitance Characteristics 26
Temperature Characteristics 26
Additional 34411A/L4411A Specifications 27
Measurement and System Speeds 28
System Speeds 29
Data From Memory 30
General Specifications 30
Dimensions 32
To Calculate Total Measurement Error 33
Interpreting Accuracy Specifications 35
Transfer Accuracy 35
24–Hour Accuracy 35
90–Day and 1–Year Accuracy 35
Temperature Coefficients 35
Configuring for Highest Accuracy Measurements 36
Contents
2 Quick Start 37
Basic Multimeter Operations 38
Preparing the Multimeter for Use 38
Using the Front Panel 39
Front-Panel Keys 39
Front-Panel Display Shortcuts 40
Making Basic Measurements 41
To Measure DC Voltage 42
To Measure AC Voltage 42
To Measure DC Current 43
To Measure AC Current 43
To Make a 2-Wire Resistance Measurement 44
To Make a 4-Wire Resistance Measurement 44
To Measure Frequency 45
To Measure Period 45
To Measure Capacitance 46
To Make a 2-Wire Temperature Measurement 47
To Make a 4-Wire Temperature Measurement 47
Contents
3 Calibration Procedures 57
Agilent Technologies Calibration Services 58
Calibration Interval 58
Adjustment is Recommended 58
Time Required for Calibration 59
Automating Calibration Procedures 59
Recommended Test Equipment 60
Performance Verification Tests 60
Self–Test 61
Quick Performance Check 61
Performance Verification Tests 62
Input Connections 62
Test Considerations 63
Verification Tests 64
Zero Offset Verification 64
Gain Verification 66
Optional AC Voltage Performance Verification Tests
Contents
4 Disassembly and Repair 103
Operating Checklist 104
Types of Service Available 105
Extended Service Contracts 105
Obtaining Repair Service (Worldwide) 105
Repackaging for Shipment 106
Cleaning 106
To Replace the 34410A/11A Power Line Fuse 106
To Replace the Current Input Fuse 107
Self Test Procedures 107
Power–On Self–Test 107
34410A/11A Complete Self–Test 107
L4411A Complete Self–Test 107
Self Test Error Numbers 108
Calibration Errors 109
34410A/11A Display and Keypad Tests 110
Electrostatic
Agilent 34410A/11A/L4411A 6½ Digit Multimeter Service Guide
1 Specifications
DC Characteristics 19
AC Characteristics 22
Frequency and Period Characteristics 24
Capacitance Characteristics 26
Temperature Characteristics 26
Additional 34411A/L4411A Specifications 27
Measurement and System Speeds 28
General Specifications (34410A/11A) 30
Dimensions 32
To Calculate Total Measurement Error 33
Interpreting Accuracy Specifications 35
Configuring for Highest Accuracy Measurements 36
1 Specifications These specifications apply when using the 34410A/11A/L4411A multimeter in an environment that is free of electromagnetic interference and electrostatic charge. When using the multimeter in an environment where electromagnetic interference or significant electrostatic charge is present, measurement accuracy may be reduced.
DC Characteristics
Accuracy Specifications ( % of reading + % of range ) [1]
Function: DC Voltage. Columns: Range [3] | Test Current or Burden Voltage | 24 Hour [2] TCAL ± 1 °C | 90 Day TCAL ± 5 °C | 1 Year TCAL ± 5 °C | Temperature Coefficient 0 °C to (TCAL – 5 °C), (TCAL + 5 °C) to 55 °C

Range            24 Hour [2]        90 Day             1 Year
100.0000 mV      0.0030 + 0.0030    0.0040 + 0.0035    0.0050 + 0.0035
1.000000 V       0.0020 + 0.0006    0.0030 + 0.0007    0.
10.00000 V       0.0015 + 0.0004    0.0020 + 0.0005
100.0000 V       0.0020 + 0.0006    0.0035 + 0.0006
1000.000 V [5]   0.0020 + 0.0006    0.0035 + 0.0006
Performance Versus Integration Time – 60 Hz (50 Hz) power line frequency
RMS Noise Adder columns, % of range [4]: (a) DCV 10 V, 1000 V; (b) DCV 1 V, 100 V and Resistance 1 kΩ, 10 kΩ; (c) DCV 0.1 V, Resistance 100 Ω, DCI 1 A

NPLC        Readings/Second [3]   Resolution ppm of Range [1]   NMR dB [2]   Adder (a)   Adder (b)   Adder (c)
0.001 [6]   50,000                30                            0            0.0060      0.0100      0.1000
0.002 [6]   25,000                15                            0            0.0030      0.0060      0.0600
0.006       10,000                6                             0            0.0012      0.0040      0.0600
0.02        3000                  3                             0            0.0006      0.0030      0.0300
0.06        1000                  1.
(Integration times of 0.2, 1, 2, 10, and 100 NPLC are also available.)
DC Voltage
Measurement Method: Continuously integrating multi–slope IV
Linearity (10 VDC range): 0.0002% of reading + 0.0001% of range
Input Resistance:
  0.1 V, 1 V, 10 V ranges: Selectable 10 MΩ or >10 GΩ (for these ranges, inputs beyond ±17 V are clamped through 100 kΩ typical)
  100 V, 1000 V ranges: 10 MΩ ±1%
Input Bias Current: <50 pA at 25 °C
Input Terminals: Copper alloy
Input Protection: 1000 V
DC CMRR: 140 dB for 1 kΩ unbalance in LO lead.
AC Characteristics
Accuracy Specifications ( % of reading + % of range ) [1]
Columns: Function | Range [3] | Frequency Range | 24 Hour [2] TCAL ± 1 °C | 90 Day TCAL ± 5 °C | 1 Year TCAL ± 5 °C | Temperature Coefficient 0 °C to (TCAL – 5 °C), (TCAL + 5 °C) to 55 °C
Values are listed in frequency-band order for each column:
24 Hour [2]: 0.50 + 0.03, 0.10 + 0.03, 0.05 + 0.03, 0.09 + 0.05, 0.30 + 0.08, 1.20 + 0.50
90 Day: 0.50 + 0.03, 0.10 + 0.03, 0.06 + 0.03, 0.10 + 0.05, 0.40 + 0.08, 1.20 + 0.50
Temperature Coefficient: 0.010 + 0.003, 0.008 + 0.003, 0.005 + 0.003, 0.010 + 0.005, 0.020 + 0.008, 0.120 + 0.
Voltage Transfer Accuracy ( typical )
Frequency: 10 Hz to 300 kHz
Error: (24 hour % of range + % of reading) / 5
Conditions:
- Sinewave input only, using slow filter.
- Within 10 minutes and ±0.5 °C.
- Within ±10% of initial voltage and ±1% of initial frequency.
- Following a 2–hour warm–up.
- Fixed range between 10% and 100% of full scale (and <120 V).
- Measurements are made using accepted metrology practices.
True RMS AC Voltage
Measurement Type: AC–coupled True RMS.
Frequency and Period Characteristics
Accuracy Specifications ( % of reading ) [1, 3]
Function: Frequency, Period. Range: 100 mV to 750 V. Temperature Coefficient applies 0 °C to (TCAL – 5 °C) and (TCAL + 5 °C) to 55 °C.

Frequency Range   24 Hour [2] TCAL ± 1 °C   90 Day TCAL ± 5 °C   1 Year TCAL ± 5 °C   Temperature Coefficient
3 Hz – 5 Hz       0.07                      0.07                 0.07                 0.005
5 Hz – 10 Hz      0.04                      0.04                 0.04                 0.005
10 Hz – 40 Hz     0.02                      0.02                 0.02                 0.001
40 Hz – 300 kHz   0.005                     0.006                0.007                0.
Frequency and Period
Measurement Type: Reciprocal–counting technique. AC–coupled input using the AC voltage measurement function.
Input Impedance: 1 MΩ ±2%, in parallel with <150 pF
Input Protection: 750 V rms, all ranges
Measurement Considerations
All frequency counters are susceptible to error when measuring low–voltage, low–frequency signals. Shielding inputs from external noise pickup is critical for minimizing measurement errors.
Capacitance Characteristics
Accuracy Specifications ( % of reading + % of range ) [1]
Function: Capacitance. Temperature Coefficient applies 0 °C to (TCAL – 5 °C) and (TCAL + 5 °C) to 55 °C.

Range [2]   Test Current   1 Year TCAL ± 5 °C   Temperature Coefficient
1 nF        500 nA         0.50 + 0.50          0.05 + 0.05
10 nF       1 µA           0.40 + 0.10          0.05 + 0.01
100 nF      10 µA          0.40 + 0.10          0.01 + 0.01
1 µF        100 µA         0.40 + 0.10          0.01 + 0.01
10 µF       1 mA           0.40 + 0.10          0.01 + 0.01

[1] Specifications are for 90 minute warm–up using Math Null.
Additional 34411A/L4411A Specifications

Function [1]   4.5 Digits   5.5 Digits   6.5 Digits
DCV            50k          10k          1k
2-wire Ω       50k          10k          1k
DCI            50k          10k          1k
ACV            500          500          150
ACI            500          150          150
Frequency      450          90           10
Period         450          90           10

Resolution: See table on page 20
Overall Bandwidth, DCV and DCI: 15 kHz typical @ 20 µs aperture (–3 dB)
Triggering: Pre or Post, Internal or External, Positive or Negative
Timebase Resolution: 19.9524 µs, 0.
Trigger Jitter:
External Trigger Latency:
Internal Trigger Level Accuracy:
Measurement and System Speeds
DMM Measurement Speeds
Direct I/O Measurements [1] — Single Reading, Measure and I/O Time

DCV (10 V Range):
NPLC        Measure and I/O Time, Sec            Max rdg/s
0.001 [2]   0.0026   0.0029   0.0046   0.0032    50000
0.006       0.0026   0.0029   0.0046   0.0032    10000
0.06        0.0031   0.0032   0.0047   0.0040    1000
1           0.0190   0.0190   0.0200   0.0190    60
ACV (10 V Range), 2–Wire Ω (10 kΩ Range), 4–Wire Ω (10 kΩ Range), Frequency (1 kHz, 10 V Range), Fast Filter, USB 2.0 Sec, Slow Filter: 0.0100 0.
Direct I/O Measurements [1] (any remote interface)
Sustained maximum reading rate to I/O, 32–bit BINARY data ("SAMP:COUN 50000;:R?")

Function   Resolution (NPLC)   rdgs/Sec
DCV        0.001               50000 (34411A/L4411A only)
           0.006               10000
ACV        Fast Filter         500
2–Wire Ω   0.001               50000 (34411A/L4411A only)
           0.006               10000
4–Wire Ω   0.001               0.
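The 32-bit binary transfers above ("SAMP:COUN 50000;:R?") return readings packed in an IEEE 488.2 definite-length block. As an illustrative sketch (not from this guide), a host program might decode such a block as follows; the big-endian byte order is an assumption and should be checked against the instrument's FORMat settings:

```python
import struct

def parse_definite_length_block(raw: bytes, big_endian: bool = True):
    """Decode an IEEE 488.2 definite-length block of 4-byte floats.

    Block layout: b'#' + <n: count of length digits> + <length digits> + payload.
    Byte order is an assumption here -- verify against the instrument's
    byte-order setting before use.
    """
    if raw[0:1] != b'#':
        raise ValueError("not a definite-length block")
    ndigits = int(raw[1:2])              # how many digits describe the payload length
    nbytes = int(raw[2:2 + ndigits])     # payload length in bytes
    payload = raw[2 + ndigits:2 + ndigits + nbytes]
    fmt = ('>' if big_endian else '<') + str(nbytes // 4) + 'f'
    return list(struct.unpack(fmt, payload))

# Synthetic two-reading block (not captured from an instrument):
block = b'#18' + struct.pack('>2f', 1.5, -0.25)
print(parse_definite_length_block(block))   # [1.5, -0.25]
```

The same decoder handles the 8-byte (64-bit) format by swapping the `'f'` conversion for `'d'` and dividing by 8.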
Data From Memory
Maximum reading rate out of memory (Sample count 50000, trigger count 1, "FETC?" or "R?")

Readings        GPIB rdg/Sec   USB 2.0 rdg/Sec   LAN (VXI-11) rdg/Sec   LAN (Sockets) rdg/Sec
ASCII           4000           8500              7000                   8500
4–byte Binary   89,000         265,000           110,000                270,000
8–byte Binary   47,000         154,000           60,000                 160,000

General Specifications (34410A/11A)
Power Supply: 100 V / 120 V / 220 V / 240 V ± 10%
Power Line Frequency: 50–60 Hz ± 10%, 400 Hz ± 10%.
Remote Interfaces: GPIB IEEE–488, 10/100 Mbit LAN, USB 2.0 Standard
Language: SCPI – 1994.0, IEEE–488.2
LXI Compliance: LXI Class C, Version 1.0
Warm–up Time: 90 minutes

General Specifications (L4411A)
Power Supply: Universal 100 V to 240 V ± 10%, automatically sensed
Power Line Frequency: 45 Hz to 440 Hz ± 10%, automatically sensed at power–on; 400 Hz defaults to 50 Hz.
Power Consumption:
Operating Environment:
Storage Temperature:
Dimensions (H x W x L):
Weight:
Display:
Safety:
EMC:
Warranty:
1 Specifications Dimensions L4411A 212.3 40.9 363.
Specifications 1 To Calculate Total Measurement Error The multimeter's accuracy specifications are expressed in the form: ( % of reading + % of range ). In addition to the reading error and range error, you may need to add additional errors for certain operating conditions. Check the list below to make sure you include all measurement errors for a given function. Also, make sure you apply the conditions as described in the footnotes on the specification pages.
Understanding the " % of range " Error
The range error compensates for inaccuracies that result from the function and range you select. The range error contributes a constant error, expressed as a percent of range, independent of the input signal level. The following table shows the range error applied to the multimeter's 24–hour dc voltage specification.

Range    Input Level   Range Error (% of range)   Range Error (Voltage)
10 VDC   10 VDC        0.0004                     ±40 µV
10 VDC   1 VDC         0.
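The combination of reading error and range error described above reduces to simple arithmetic. The sketch below applies the 90-day 10 V dc specification from the DC Characteristics table (0.0020% of reading + 0.0005% of range) to a 5 V input; the values are illustrative, and the current specification sheet should always take precedence:

```python
def total_error(reading, rng, pct_reading, pct_range):
    """Total measurement error = (% of reading) + (% of range), per the spec format."""
    return reading * pct_reading / 100 + rng * pct_range / 100

# 5 V measured on the 10 V range against the 90-day spec (0.0020 + 0.0005):
# reading error = 5 V * 0.0020% = 100 uV; range error = 10 V * 0.0005% = 50 uV
err = total_error(5.0, 10.0, 0.0020, 0.0005)
print(f"±{err * 1e6:.0f} µV")   # ±150 µV
```

Note that the range error term is fixed for a given range, so it dominates when the input is small relative to full scale, which is why measuring near full scale gives the best relative accuracy.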
Specifications 1 Interpreting Accuracy Specifications Transfer Accuracy Transfer accuracy refers to the error introduced by the multimeter due to noise and short–term drift. This error becomes apparent when comparing two nearly–equal signals for the purpose of "transferring" the known accuracy of one device to the other. 24–Hour Accuracy The 24–hour accuracy specification indicates the multimeter's relative accuracy over its full measurement range for short time intervals and within a stable environment.
Configuring for Highest Accuracy Measurements
The measurement configurations shown below assume that the multimeter is in its power–on or reset state. It is also assumed that auto–ranging is enabled to ensure proper full-scale range selection.
DC Voltage, DC Current, and Resistance Measurements:
• Select NPLC, and set INTEGRATION to 100 (NPLCs).
• Set INPUT Z to HI–Z (for the 100 mV, 1 V, and 10 V ranges) for the best dc voltage accuracy.
Agilent 34410A/11A/L4411A 6½ Digit Multimeter Service Guide
2 Quick Start
This chapter gives you a quick overview of the 34410A/11A multimeter’s front panel and basic features. The examples will help you become familiar with your meter, its measuring functions, and basic operation.
Basic Multimeter Operations
This section introduces the basics of the 34410A/11A multimeter, and how to use it.
Preparing the Multimeter for Use
To verify that your 34410A or 34411A multimeter is ready for use:
1 Check the list of supplied items. Verify that you have received the following items with your multimeter. If anything is missing, contact your nearest Agilent Sales Office.
• Test Lead Set (34410A/11A only).
• Power Cord.
• USB 2.0 Cable.
Quick Start 2 Using the Front Panel This section introduces the 34410A/11A multimeter front panel. Front-Panel Keys The front panel provides keys to select various functions and operations. Pressing a measurement function key (e.g. ) selects that function. Press to enter the configuration menu for the selected measurement function. Most keys have a shifted function printed in blue above the key. To perform a shifted function, press , and then press the key that has the desired label above it.
2 Quick Start Front- Panel Display Shortcuts Direct front–panel shortcuts are provided for three commonly used display functions: ranging, digit masking, and integration time. Ranging. The multimeter’s manual range can be set directly from the navigation keypad. To manually change the current multimeter range, press or . The ManRng annunciator will light, and the selected range (e.g. 100mV RANGE) will be briefly displayed on the second line. Digit Masking.
Quick Start 2 Making Basic Measurements This section introduces the many types of measurements that you can make with your 34410A/11A multimeter, and how to make connections for each measurement. Most basic measurements can be taken using the factory default settings. A more complete description of all multimeter functions, measurement parameter configuration and remote interface operation is provided in Chapter 2. For each measurement, connect the test leads as shown.
2 Quick Start To Measure DC Voltage Press to select the dc voltage function. • Ranges: 100 mV, 1 V, 10 V, 100 V, 1000 V • Configurable parameters: INTEGRATION, RANGE, INPUT Z (input impedance), AUTO ZERO, NULL, and NULL VALUE Connect test leads as shown: DC Voltage To Measure AC Voltage Press to select the ac voltage function.
To Measure DC Current
Press to select the dc current function.
• Ranges: 100 µA, 1 mA, 10 mA, 100 mA, 1 A, 3 A
• Configurable parameters: INTEGRATION, RANGE, AUTO ZERO, NULL, and NULL VALUE
Connect test leads as shown: DC Current
To Measure AC Current
Press to select the ac current function.
To Make a 2-Wire Resistance Measurement
Press to select the 2-wire resistance function.
• Ranges: 100 Ω, 1 kΩ, 10 kΩ, 100 kΩ, 1 MΩ, 10 MΩ, 100 MΩ, 1 GΩ
• Configurable parameters: INTEGRATION, RANGE, OFFSET COMP, AUTO ZERO, NULL, and NULL VALUE
Connect test leads as shown: Resistance
To null–out the test lead resistance:
1 Connect one end of the test leads at the meter, and short the probe ends together.
2 Press null.
To Measure Frequency
Press to select the frequency function.
• Measurement band: 3 Hz to 300 kHz
• Input signal range: 100 mVAC to 750 VAC
• Technique: reciprocal counting
• Configurable parameters: GATE TIME, RANGE, AC FILTER, NULL and NULL VALUE
Connect test leads as shown: AC Signal
To Measure Period
Press to select the frequency function. Then press and select PERIOD from the menu.
• Measurement band: 0.33 s to 3.
To Measure Capacitance
Press to select the capacitance function.
• Ranges: 1 nF, 10 nF, 100 nF, 1 µF, 10 µF
• Configurable parameters: RANGE, NULL, and NULL VALUE
Connect test leads as shown: Capacitance
To null–out the test lead capacitance:
1 Disconnect the + lead’s probe end from the test circuit, and leave open.
2 Press null.
3 Reconnect the + lead’s probe end to the test circuit, and measure the corrected capacitance value.
To Make a 2-Wire Temperature Measurement
Press to select the temperature function. Then press and select RTD-2W or THERMISTOR-2W from the menu.
• Probe types: 2.2 kΩ, 5 kΩ, 10 kΩ thermistors; 0.00385%/°C RTD
• Configurable parameters: PROBE TYPE, THERMISTOR or RTD value, AUTO ZERO, OFFSET COMP (RTD probes only), INTEGRATION, UNITS, NULL, and NULL VALUE
Connect test leads as shown: Thermistor or RTD
To Make a 4-Wire Temperature Measurement
Press to select the temperature function.
To Test Continuity
Press to select the continuity function.
• Test current source: 1 mA
• Beeper threshold: beeps below 10 Ω
Connect test leads as shown: Open or Closed Circuit
To Check Diodes
Press to select the diode test function.
• Test current source: 1 mA
• Beeper threshold: 0.3 V ≤ measured voltage ≤ 0.
Quick Start 2 Other Basics of Operation This section covers basic troubleshooting and general use. If the Multimeter Does Not Turn On Use the following steps to help solve problems you might encounter when turning on the multimeter. If you need more help, see the Service Guide for instructions on returning the multimeter to Agilent for service. 1 Verify that there is ac power to the multimeter. First, verify that the multimeter’s Power switch is in the “On” position.
To Replace the Power-Line Fuse (34410A/11A)
Remove the power cord first. Then follow these steps:
1 Depress tab (1) and pull fuse holder (2) from rear panel.
2 Remove line-voltage selector from fuse holder assembly.
3 Rotate line-voltage selector and reinstall so correct voltage appears in fuse holder window.
4 Replace fuse holder assembly in rear panel.
Replacement fuse: Agilent Part Number 2110-0817 (250 mAT, 250 V, slow-blow, 5x20 mm)
Verify that the correct line voltage is selected and the power-line fuse is good.
Quick Start 2 To Adjust the Carrying Handle (34410A/11A) To adjust the position, grasp the handle by the sides and pull outward. Then, rotate the handle to the desired position. Bench-Top Viewing Positions Carrying Position To Rack Mount the 34410A/11A Multimeter You can mount the 34410A/11A in a standard 19–inch rack cabinet using the available rack–mount kits. Instructions and mounting hardware are included with each kit. Any Agilent System II (half- width, 2U height) instrument of either the 272.
2 Quick Start You must remove the carrying handle, and the front and rear rubber bumpers, before rack mounting an instrument. To remove each bumper, stretch a corner and slide it off. To remove the handle, rotate it to the vertical position and pull the ends outward.
Quick Start 2 Calibration Operation (34410A/11A) From the front panel you can: • Read the calibration count • Read and set the calibration message. • Secure and unsecure the instrument for calibration. To Read the Calibration Count You can query the instrument to determine how many calibrations have been performed. Note that your instrument was calibrated before it left the factory. When you receive your instrument, read the count to determine its initial value.
2 Quick Start To Read the Calibration Message The instrument allows you to store a message in calibration memory. For example, you can store such information as the date when the last calibration was performed, the date when the next calibration is due, the instrument's serial number, or even the name and phone number of the person to contact for a new calibration. You can record a calibration message only when the instrument is unsecured.
Quick Start 2 To Secure for Calibration This feature allows you to enter a security code to prevent accidental or unauthorized adjustments of the instrument. When you first receive your instrument, it is secured. Before you can adjust the instrument, you must unsecure it by entering the correct security code. The security code is set to AT34410 (AT34411 for the Agilent 34411A) when the instrument is shipped from the factory.
2 Quick Start To Unsecure for Calibration Before you can adjust the instrument, you must unsecure it by entering the correct security code. The security code is set to AT34410 when the instrument is shipped from the factory. The security code is stored in non–volatile memory, and does not change when power has been off, after a Factory Reset (*RST command), or after an Instrument Preset (SYSTem:PRESet command). The security code may contain up to 12 alphanumeric characters.
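A calibration program that prompts for a new security code may want to reject malformed codes before sending them to the instrument. The sketch below enforces the 12-character alphanumeric limit stated above; the additional rule that the first character must be a letter is an assumption carried over from similar Agilent DMMs, so verify it against the programmer's reference:

```python
import re

def is_valid_security_code(code: str) -> bool:
    """Check a candidate calibration security code.

    Rule from the guide: up to 12 alphanumeric characters.
    First-character-must-be-a-letter is an ASSUMED rule (common on
    similar Agilent DMMs) -- confirm before relying on it.
    """
    return bool(re.fullmatch(r'[A-Za-z][A-Za-z0-9]{0,11}', code))

print(is_valid_security_code("AT34410"))        # True (the 34410A factory default code)
print(is_valid_security_code("1234"))           # False: starts with a digit (assumed rule)
print(is_valid_security_code("ABCDEFGHIJKLM"))  # False: 13 characters, over the limit
```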
Agilent 34410A/11A/L4411A 6½ Digit Multimeter Service Guide
3 Calibration Procedures
Agilent Technologies Calibration Services 58
Calibration Interval 58
Adjustment is Recommended 58
Time Required for Calibration 59
Automating Calibration Procedures 59
Recommended Test Equipment 60
Performance Verification Tests 60
Input Connections 62
Test Considerations 63
Verification Tests 64
Calibration Security 75
Calibration Message 77
Calibration Count 77
Calibration Process 78
Aborting a Calibration in Progress 79
3 Calibration Procedures Agilent Technologies Calibration Services Agilent Technologies offers calibration services at competitive prices. When your instrument is due for calibration, contact your local Agilent Service Center for recalibration. See “Types of Service Available” on page 105 for information on contacting Agilent. Calibration Interval The instrument should be calibrated on a regular interval determined by the measurement accuracy requirements of your application.
Calibration Procedures 3 Time Required for Calibration The instrument can be automatically calibrated under computer control. With computer control you can perform the complete calibration procedure and performance verification tests in less than 30 minutes once the instrument is warmed–up (see Test Considerations on page 63). Automating Calibration Procedures The adjustment procedures provided in this Service Guide demonstrate front panel adjustment.
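As a sketch of the automation idea, a host program typically drives the instrument with a fixed SCPI sequence: unsecure, act, re-secure. The short-form command names below are assumptions based on the SCPI conventions this instrument family uses (CAL:SEC:STAT, CAL:STR?, CAL:COUN?); confirm each against the programmer's reference before driving a real instrument:

```python
def build_unsecure_and_verify_sequence(code: str):
    """Assemble a SCPI command sequence a calibration program might send.

    Command mnemonics are ASSUMED from SCPI conventions for this family --
    verify against the programmer's reference before use.
    """
    return [
        f"CAL:SEC:STAT OFF,{code}",  # unsecure with the security code
        "CAL:STR?",                  # read the stored calibration message
        "CAL:COUN?",                 # read the calibration count
        f"CAL:SEC:STAT ON,{code}",   # re-secure when finished
    ]

for cmd in build_unsecure_and_verify_sequence("AT34410"):
    print(cmd)
```

In a real program each string would be written to the instrument over GPIB, USB, or LAN, reading back a response after each query (the commands ending in "?").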
3 Calibration Procedures Recommended Test Equipment The test equipment recommended for the performance verification and adjustment procedures is listed below. If the exact instrument is not available, substitute calibration standards of equivalent accuracy. A suggested alternate method would be to use the Agilent 3458A 8½–digit Digital Multimeter to measure less accurate yet stable sources. The output value measured from the source can be entered into the instrument as the target calibration value.
Calibration Procedures 3 Self–Test A brief power–on self–test occurs automatically whenever you turn on the instrument. This limited test assures that the instrument is capable of operation. • During the self–test all display segments and annunciators are lit. • 34410A/11A If the self–test fails, the ERROR annunciator turns on. Read any errors using the front panel Utility menu (select SCPI ERRORS), or use the SYSTem:ERRor? command query from the remote interface.
3 Calibration Procedures If the instrument fails the quick performance check, adjustment or repair is required. Performance Verification Tests The performance verification tests are recommended as acceptance tests when you first receive the instrument. The acceptance test results should be compared against the 90 day test limits. You should use the 24–hour test limits only for verification within 24 hours after performing the adjustment procedure.
Calibration Procedures 3 Test Considerations Errors may be induced by ac signals present on the input leads during a self–test. Long test leads can also act as an antenna causing pick–up of ac signals. For optimum performance, all procedures should comply with the following recommendations: • Assure that the calibration ambient temperature (Tcal) is stable and between 18 °C and 28 °C. Ideally the calibration should be performed at 23 °C ±2 °C. • Assure ambient relative humidity is less than 80%.
3 Calibration Procedures Verification Tests Zero Offset Verification This procedure is used to check the zero offset performance of the instrument. Verification checks are only performed for those functions and ranges with unique offset calibration constants. Measurements are checked for each function and range as described in the procedure on the next page. Zero Offset Verification Procedure (34410A/11A) 1 Make sure you have read “Test Considerations” on page 63.
Zero Offset Verification Limits
Columns: Input | Function [1] | Range | Quick Check | Error from Nominal: 24 hour, 90 day, 1 year

Input   Function     Range    24 hour      90 day       1 year
Open    DC Current   100 µA   ± 0.02 µA    ± 0.025 µA   ± 0.025 µA
Open                 1 mA     ± 0.060 µA   ± 0.060 µA   ± 0.060 µA
Open                 10 mA    ± 2 µA       ± 2 µA       ± 2 µA
Open                 100 mA   ± 4 µA       ± 5 µA       ± 5 µA
Open                 1 A      ± 60 µA      ± 100 µA     ± 100 µA
Open                 3 A      ± 600 µA     ± 600 µA     ± 600 µA
                              ± 3 µV       ± 3.5 µV     ± 3.
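Each verification step reduces to the same pass/fail comparison: the returned reading must fall within the tabulated error-from-nominal limit. A minimal sketch of that check, using the 1 mA-range 1-year zero-offset limit of ±0.060 µA from the table above as the worked example:

```python
def passes_verification(measured, nominal, limit):
    """True when a verification reading falls within ±limit of the nominal value."""
    return abs(measured - nominal) <= limit

# Zero-offset check, 1 mA range, 1-year limit of ±0.060 µA (input open, nominal 0 A):
print(passes_verification(0.045e-6, 0.0, 0.060e-6))  # True  (0.045 µA offset is in limit)
print(passes_verification(0.075e-6, 0.0, 0.060e-6))  # False (0.075 µA offset is out of limit)
```

For gain verification the same function applies with a nonzero nominal (the calibrator output) and a limit computed from the (% of reading + % of range) accuracy specification.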
3 Calibration Procedures Gain Verification This procedure is used to check the “full scale” reading accuracy of the instrument. Verification checks are performed only for those functions and ranges with unique gain calibration constants. DC Volts Gain Verification Test 1 Make sure you have read “Test Considerations” on page 63. 2 Connect the calibrator to the input terminals. a For the 34410A/11A use the front panel input terminals and select the Front input terminals with the Front/Rear switch.
Calibration Procedures 3 DC Current Gain Verification Test 1 Make sure you have read “Test Considerations” on page 63 2 Connect the calibrator to the input terminals. a For the 34410A/11A use the front panel input terminals and select the Front input terminals with the Front/Rear switch. 3 Select each function and range in the order shown below. Provide the input shown in the table below. 4 Make a measurement and return the result.
3 Calibration Procedures Ohms Gain Verification Test Configuration: 4–Wire Ohms (CONFigure:FRESistance) 1 Make sure you have read “Test Considerations” on page 63. 2 Set the 4- Wire Ohms function. 3 Connect the calibrator to the input terminals. a For the 34410A/11A use the front panel input terminals and select the Front input terminals with the Front/Rear switch. 4 Select each range in the order shown below. Provide the resistance value indicated.
Calibration Procedures 3 Frequency Gain Verification Test Configuration: Frequency (CONFigure:FREQuency DEF, MIN) 1 Make sure you have read “Test Considerations” on page 63. 2 Select the Frequency function, default range, and minimum resolution (1 second aperture). 3 Connect the Agilent 33220A to the input terminals. a For the 34410A/11A use the front panel input terminals and select the Front input terminals with the Front/Rear switch. 4 Select each range in the order shown below.
AC Volts Verification Test
Configuration: AC Volts (CONFigure[:VOLTage]:AC)
LF 3 HZ:SLOW ([SENSe:]VOLTage:AC:BANDwidth 3)
1 Make sure you have read “Test Considerations” on page 63.
2 Connect the calibrator to the input terminals.
a For the 34410A/11A use the front panel input terminals and select the Front input terminals with the Front/Rear switch.
3 Set the AC Volts function and the 3 Hz input filter. With the slow filter selected, each measurement takes 2.
Calibration Procedures 3 AC Current Verification Test Configuration: AC Current (CONFigure:CURRent:AC) LF 3 HZ:SLOW ([SENSe:]CURRent:AC:BANDwidth 3) 1 Make sure you have read “Test Considerations” on page 63. 2 Connect the calibrator to the input terminals. a For the 34410A/11A use the front panel input terminals and select the Front input terminals with the Front/Rear switch. 3 Set the AC Current function and the 3 Hz input filter. With the slow filter selected, each measurement takes 1.
3 Calibration Procedures Additional AC Voltage Performance Verification Tests Configuration: AC Volts (CONFigure[:VOLTage]:AC) LF 3 HZ:SLOW ([SENSe:]VOLTage:AC:BANDwidth 3) 1 Make sure you have read “Test Considerations” on page 63. 2 Connect the calibrator to the input terminals. a For the 34410A/11A use the front panel input terminals and select the Front input terminals with the Front/Rear switch. 3 Set the AC Volts function and the 3 Hz input filter.
Additional AC Current Performance Verification Tests Configuration: AC Current (CONFigure:CURRent:AC) LF 3 HZ:SLOW ([SENSe:]CURRent:AC:BANDwidth 3) 1 Make sure you have read “Test Considerations” on page 63. 2 Connect the calibrator to the input terminals. a For the 34410A/11A use the front panel input terminals and select the Front input terminals with the Front/Rear switch. 3 Set the AC Current function and the 3 Hz input filter.
Additional Capacitance Performance Verification Tests Configuration: Capacitance CONFigure:CAPacitance 1 Make sure you have read “Test Considerations” on page 63. 2 Set the Capacitance function. 3 Connect the calibrator to the input terminals. a For the 34410A/11A use the front panel input terminals and select the Front input terminals with the Front/Rear switch. 4 Select each range in the order shown below. Provide the indicated input.
Calibration Security This feature allows you to enter a security code to prevent accidental or unauthorized adjustments of the instrument. When you first receive your instrument, it is secured. Before you can adjust the instrument, you must unsecure it by entering the correct security code. See “To Unsecure for Calibration” on page 56 for a procedure to enter the security code from the front panel.
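For remote unsecuring, instruments in this family accept a calibration-security command. The exact CALibration:SECure:STATe form sketched here is an assumption based on the family's SCPI convention, not a command quoted from this guide; verify it against the Programmer's Reference before use.

```python
# Hedged sketch: the CALibration:SECure:STATe command form is assumed
# from the 34410A family SCPI convention and must be checked against
# the Programmer's Reference. This only builds the command string.
def unsecure_command(security_code):
    """SCPI string that would unsecure the instrument for calibration."""
    return f'CALibration:SECure:STATe OFF,"{security_code}"'

# "AT34410" below is just an example string, not the instrument's code.
cmd = unsecure_command("AT34410")
```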
1 Disconnect the power cord and all input connections. 2 Disassemble the instrument using the “General Disassembly” on page 111. 3 Solder a temporary short between the two exposed metal pads on the main PC board assembly. The general location is shown in the figure below. On the 34410A/11A PC board, the pads are marked JM101. On the L4411A the pads are marked UNSEC. (Figure: shorting pad locations on the L4411A and 34410A/11A boards.) 4 Apply power and turn on the instrument.
Calibration Message The instrument allows you to store a message in calibration memory. For example, you can store such information as the date when the last calibration was performed, the date when the next calibration is due, the instrument’s serial number, or even the name and phone number of the person to contact for a new calibration. The calibration message may contain up to 40 characters. You can record a calibration message only when the instrument is unsecured.
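The 40-character limit described above can be enforced before the message is sent. The CALibration:STRing command name is an assumption from the family's SCPI convention; confirm it in the Programmer's Reference.

```python
# Enforce the 40-character calibration-message limit before sending.
# CALibration:STRing is assumed from the family SCPI convention; check
# the Programmer's Reference. This only builds the command string.
def calibration_message_command(message):
    """SCPI string that would store `message` in calibration memory."""
    if len(message) > 40:
        raise ValueError("calibration message is limited to 40 characters")
    return f'CALibration:STRing "{message}"'

# Example message; the date and serial number below are placeholders.
cmd = calibration_message_command("Last cal 2007-06-01 S/N MY12345678")
```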
Calibration Process The following general procedure is the recommended method to complete a full instrument calibration. 1 Read “Test Considerations” on page 63. 2 Perform the verification tests to characterize the instrument (incoming data). 3 Unsecure the instrument for calibration (“Calibration Security” on page 75). 4 Perform the adjustment procedures (“Adjustments” on page 80). 5 Secure the instrument against calibration.
Using the Remote Interface for Adjustments All adjustments can be made using the remote interface. You must use the remote interface for the L4411A. Commands used to perform the adjustments are listed in the CALibration subsystem of the 34410A/11A/L4411A Programmer’s Reference. Selecting the Adjustment Mode Use the CALibration:ADC? query to begin the ADC calibration. The response to this query indicates a successful adjustment (0) or a failure (1).
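The CALibration:ADC? response described above can be interpreted with a small helper; the pyvisa transport is shown only in comments, and the instrument address there is a placeholder.

```python
# Interpret the CALibration:ADC? reply: 0 indicates a successful
# adjustment, 1 a failure, as described above.
def parse_adc_result(response):
    """Return True when the ADC adjustment reported success (0)."""
    return int(response.strip()) == 0

# With a real instrument this might be driven as (address is a placeholder):
#   import pyvisa
#   dmm = pyvisa.ResourceManager().open_resource("TCPIP0::dmm::inst0::INSTR")
#   ok = parse_adc_result(dmm.query("CALibration:ADC?"))  # takes ~2 minutes
ok = parse_adc_result("+0\n")
```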
Adjustments You will need a test input cable and connectors set and a low thermal input short, Agilent 34172B (two are recommended for the 34410A/11A), to adjust the instrument (see “Input Connections” on page 62). ADC and Zero Adjustment Each time you perform a zero adjustment, the instrument stores a new set of offset correction constants for every measurement function and range.
Adjust the ADC 3 34410A/11A: Select the front panel input terminals. If using a single shorting block, install the block on the front panel input terminals. L4411A: Install the shorting block on the input terminals. 4 Select the ADC adjustment mode (see “Selecting the Adjustment Mode” on page 78 or, for the remote interface, page 79). 5 The display will show the ADC calibration steps as they progress. The ADC adjustment requires approximately 2 minutes to complete.
Gain Adjustments The instrument calculates and stores gain corrections for each input value. The gain constant is computed from the calibration value entered for the calibration command and from measurements made automatically during the adjustment procedure. Most measuring functions and ranges have gain adjustment procedures. The 100 MΩ and 1 GΩ ranges do not have gain calibration procedures. Adjustments for each function should be performed ONLY in the order shown.
Valid Gain and Flatness Adjustment Input Values. Gain adjustment can be accomplished using the following input values.

Function              Range             Valid Amplitude Input Values
DC Volts              100 mV to 100 V   0.9 to 1.1 x Full Scale
                      1000 V            450 V to 550 V
DC Current            100 µA to 1 A     0.9 to 1.1 x Full Scale
                      3 A               1.8 A to 2.2 A
Ohms 2W, Ohms 4W      100 Ω to 10 MΩ    0.9 to 1.1 x Full Scale
Frequency             Any               Input > 100 mV rms, 990 Hz to 110 kHz
AC Current (rms) [1]  100 µA to 1 A     0.9 to 1.1 x Full Scale
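The 0.9 to 1.1 x Full Scale rule from the table can be checked before a value is applied. Note that the 1000 V (450 V to 550 V) and 3 A (1.8 A to 2.2 A) ranges use the absolute limits listed above and are not covered by this ratio rule.

```python
# Check the "0.9 to 1.1 x Full Scale" rule from the table above. The
# 1000 V and 3 A ranges use the absolute limits listed in the table and
# are not covered by this ratio-based helper.
def input_value_ok(value, full_scale):
    """True if |value| is within 0.9 to 1.1 times the range full scale."""
    return 0.9 * full_scale <= abs(value) <= 1.1 * full_scale

# e.g. the 10 V range accepts inputs from 9 V to 11 V, either polarity
```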
DC Voltage Gain Calibration Procedure Review the “Test Considerations” on page 63 and “Gain Adjustment Considerations” on page 82 sections before beginning this procedure. Configuration: DC Voltage 1 Configure each function and range shown in the adjustment table below. 2 Apply the input signal shown in the “Input” column of the table. NOTE Always complete tests in the specified order as shown in the appropriate table.
Input Voltage   Instrument Settings
                Function    Range
100 mV          DC Volts    100 mV
–100 mV         DC Volts    100 mV
1 V             DC Volts    1 V
10 V            DC Volts    10 V
–10 V           DC Volts    10 V
100 V           DC Volts    100 V
500 V           DC Volts    1000 V

34410A/11A/L4411A Service Guide   85
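One adjustment point from a table like the one above could be driven as a short command sequence. The CALibration:VALue and CALibration? commands are assumed from the instrument's CALibration subsystem (see the Programmer's Reference); this is a sketch of the flow, not the guide's literal procedure.

```python
# Hedged sketch of one gain-adjustment point. CALibration:VALue and
# CALibration? are assumed from the instrument's CALibration subsystem
# (see the Programmer's Reference). No instrument I/O happens here.
def gain_adjust_commands(configure_cmd, range_val, applied_value):
    """Build the SCPI sequence for a single gain adjustment point."""
    return [
        f"{configure_cmd} {range_val}",        # select function and range
        f"CALibration:VALue {applied_value}",  # declare the applied input
        "CALibration?",                        # perform adjustment; 0 = pass
    ]

# e.g. the 10 V DC point with a 10 V input applied from the calibrator:
cmds = gain_adjust_commands("CONFigure:VOLTage:DC", 10, 10.0)
```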
DC Current Gain Calibration Procedure Review the “Test Considerations” on page 63 and “Gain Adjustment Considerations” on page 82 sections before beginning this procedure. Configuration: DC Current 1 Configure each function and range shown in the adjustment table below. 2 Apply the input signal shown in the “Input” column of the table. NOTE Always complete tests in the specified order as shown in the appropriate table.
Input Current   Instrument Settings
                Function      Range
100 µA          DC Current    100 µA
1 mA            DC Current    1 mA
10 mA           DC Current    10 mA
100 mA          DC Current    100 mA
1 A             DC Current    1 A
2 A             DC Current    3 A

AC Voltage Gain Calibration Procedure Review the “Test Considerations” on page 63 and “Gain Adjustment Considerations” on page 82 sections before beginning this procedure. Configuration: AC Voltage 1 Configure each function and range shown in the adjustment table below. 2 Apply the input signal shown in the “Input” column of the table.
6 Verify the AC Voltage Gain adjustments using the verification procedures beginning on page 70. NOTE Each range in the gain adjustment procedure takes less than 6 seconds to complete.
adjustment value to correct the problem and repeat the adjustment step. 4 Repeat steps 1 through 3 for each gain adjustment point shown in the table. 5 Store the new calibration constants (“Storing the Calibration Constants” on page 78). 6 Verify the AC Current Gain adjustments using the verification procedures beginning on page 71. NOTE Each range in the gain adjustment procedure takes less than 7 seconds to complete.
Ohms Gain Calibration Procedure Review the “Test Considerations” on page 63 and “Gain Adjustment Considerations” on page 82 sections before beginning this procedure. Configuration: 4–Wire Ohms This procedure adjusts the gain for both the 4–wire and 2–wire Ohms functions, and the offset compensated Ohms function. The 100 MΩ and 1 GΩ range gains are derived from the 10 MΩ range and do not have separate adjustment points.
Input Resistance   Instrument Settings
                   Function       Range
100 Ω              4–Wire Ohms    100 Ω
1 kΩ               4–Wire Ohms    1 kΩ
10 kΩ              4–Wire Ohms    10 kΩ
100 kΩ             4–Wire Ohms    100 kΩ
1 MΩ               4–Wire Ohms    1 MΩ
10 MΩ              4–Wire Ohms    10 MΩ
Frequency Gain Calibration Procedure Review the “Test Considerations” on page 63 and “Gain Adjustment Considerations” on page 82 sections before beginning this procedure. Configuration: Frequency 10 V range The frequency accuracy of the Fluke 5720A is insufficient to calibrate the DMM. Its frequency output needs to be calibrated against a more accurate reference. The Agilent 33220A is recommended for this adjustment.
Flatness Adjustments The instrument stores new flatness correction constants each time this procedure is followed. Flatness constants adjust the DMM for AC Volts and AC current measurements across the usable input frequency band. The flatness constant is computed from the calibration value entered for the calibration command and from measurements made automatically during the adjustment procedure.
AC Voltage Low Frequency Flatness Calibration Procedure Review the “Test Considerations” on page 63 and “Flatness Adjustment Considerations” on page 93 sections before beginning this procedure. Configuration: AC Voltage — 10 V range 1 Configure each function and range shown in the adjustment table below. 2 Apply the input signal shown in the “Input” column of the table. NOTE Always complete tests in the specified order as shown in the appropriate table.
AC Voltage Flatness Calibration Procedure Review the “Test Considerations” on page 63 and “Flatness Adjustment Considerations” on page 93 sections before beginning this procedure. Configuration: AC Voltage The 100 V AC range is adjusted with a 50 Vac input. All AC adjustments use the 3 Hz bandwidth measurement filter. 1 Configure each function and range shown in the adjustment table below. 2 Apply the input signal shown in the “Input” column of the table.
Apply each input at the following frequencies, in the order shown: 1 kHz, 5 kHz, 10 kHz, 20 kHz, 35 kHz, 50 kHz, 75 kHz, 100 kHz, 200 kHz, 300 kHz, 390 kHz, 400 kHz, and 220 Hz.

Input Vrms   Instrument Range
100 mV       100 mV
1 V          1 V
10 V         10 V
50 V         100 V
AC Current Flatness Calibration Procedure Review the “Test Considerations” on page 63 and “Flatness Adjustment Considerations” on page 93 sections before beginning this procedure. Configuration: AC Current All AC adjustments use the 3 Hz bandwidth measurement filter. 1 Configure each function and range shown in the adjustment table below. 2 Apply the input signal shown in the “Input” column of the table.
Apply each input at the following frequencies, in the order shown: 1 kHz, 5 kHz, 7.5 kHz, 9.7 kHz, 10 kHz, and 220 Hz.

Input Current, rms   Range
100 µA               100 µA
1 mA                 1 mA
10 mA                10 mA
100 mA               100 mA
1 A                  1 A

Finishing Adjustments 1 Remove all shorting blocks and connections from the instrument. 2 Reset the Calibration Message (see page 77). 3 Reset the Calibration Security (see page 75). 4 Record the new Calibration Count (see page 77).
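The flatness tables repeat one frequency list per range, so the adjustment points can be generated in table order rather than typed out. A sketch; calibrator control is outside its scope, and the names are illustrative.

```python
# The AC current flatness tables apply the same frequency list to every
# range, so the (range, frequency) adjustment points can be generated in
# table order. Calibrator control is outside this sketch's scope.
FLATNESS_FREQS_HZ = [1000, 5000, 7500, 9700, 10000, 220]

def ac_current_flatness_points(ranges_amps):
    """Return (range, frequency) pairs in the order the tables list them."""
    return [(rng, freq) for rng in ranges_amps for freq in FLATNESS_FREQS_HZ]

points = ac_current_flatness_points([100e-6, 1e-3, 10e-3, 100e-3, 1.0])
```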
Agilent 34410A/11A/L4411A 6½ Digit Multimeter Service Guide 4 Disassembly and Repair Operating Checklist 104 Types of Service Available 105 Repackaging for Shipment 106 Cleaning 106 To Replace the 34410A/11A Power Line Fuse 106 To Replace the Current Input Fuse 107 Self Test Procedures 107 Calibration Errors 109 34410A/11A Display and Keypad Tests 110 Electrostatic Discharge (ESD) Precautions 110 34410A/11A Mechanical Disassembly 111 L4411A Mechanical Disassembly 117 Replaceable Parts 122 This chapter wil
Operating Checklist Before returning your multimeter to Agilent for service or repair, check the following items: Is the multimeter inoperative? • Verify that the power cord is connected to the multimeter and to ac line power. • Verify the front panel power switch is depressed. • 34410A/11A: Verify the power line fuse is installed. Use a 250 V 250 mAT fuse. • 34410A/11A: Verify the power line voltage setting.
Types of Service Available If your instrument fails during the warranty period, Agilent Technologies will repair or replace it under the terms of your warranty. After your warranty expires, Agilent offers repair services at competitive prices. Extended Service Contracts Many Agilent products are available with optional service contracts that extend the covered period after the standard warranty expires.
Repackaging for Shipment If the unit is to be shipped to Agilent for service or repair, be sure to: • Attach a tag to the unit identifying the owner and indicating the required service or repair. Include the model number and full serial number. • Place the unit in its original container with appropriate packaging material for shipping. • Secure the container with strong tape or metal bands.
To Replace the Current Input Fuse The front and rear current input terminals are protected by a fuse. This fuse is located on the rear panel (see page 9 or page 11). The supplied fuse is a 3AT, 250V, slow–blow, 5x20mm fuse, Agilent part number 2110–0780. If you determine that the fuse is faulty, replace it with one of the same size and rating. Self Test Procedures Power–On Self–Test Each time the instrument is powered on, a subset of the self–tests is performed.
Self Test Error Numbers NOTE On the remote interface, a self–test failure will generate SCPI error –330 and a supplemental message indicating one of the test numbers shown below. On the front panel, only the failing test is shown.
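Remotely, the standard *TST? self-test query plus the SCPI error queue yields the same information as the note above (error –330 with a message naming the failing test). A sketch of interpreting those reads; the error queue is represented as (code, message) pairs assumed to have been drained via SYSTem:ERRor?.

```python
# Interpret a remote self-test, per the note above: a failure produces
# SCPI error -330 with a message naming the failing test. *TST? is the
# standard IEEE-488.2 self-test query (0 = pass); error_queue holds
# (code, message) pairs already read back via SYSTem:ERRor?.
def parse_selftest(tst_response, error_queue):
    """Return (passed, failing_test_messages)."""
    passed = int(tst_response.strip()) == 0
    failing = [msg for code, msg in error_queue if code == -330]
    return passed, failing

result = parse_selftest("+0", [])
```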
Calibration Errors The following errors indicate failures that may occur during a calibration.
34410A/11A Display and Keypad Tests You can test the keypad and display. Hold down the key as you turn on the instrument. Hold the key for a little over 5 seconds, until you hear a relay click. When you release the key, the instrument begins the keypad test. The second display line shows the names of the keys. Press each key in turn, as shown. When all the keys have been pressed, the display test is available.
34410A/11A Mechanical Disassembly For procedures in this manual, the following tools are required for disassembly: • T20 Torx driver (most disassembly) • T15 Torx driver (fan removal) • Flat–blade screwdriver The following tools may also be needed if further disassembly is required. • 9/32” nut driver (rear–panel GPIB connector) WARNING SHOCK HAZARD. Only service–trained personnel who are aware of the hazards involved should remove the instrument covers.
3 Remove the instrument bumpers. Pull from a corner and stretch the bumpers off the instrument. 4 Remove the rear bezel. Loosen the two captive screws in the rear bezel and remove the rear bezel.
5 Remove the cover. Remove the Torx drive screw in the bottom of the cover and slide the cover off the instrument. Front Panel Removal 6 Remove push rod and disconnect display cable. a Gently move the power switch push rod toward the front of the instrument to disengage it from the switch. Be careful not to twist or bend the push rod. Remove the front/rear push rod in the same manner.
7 Remove front panel. a Using a small bladed screwdriver, gently pry the black terminal latch from the red terminal housing. Rotate the terminal latch up and remove it from the instrument. b Remove the Torx screw holding the front panel assembly.
c There is now enough play to allow the side of the front panel to be pried from the chassis and removed as an assembly.
Front Panel Disassembly 1 Remove the keypad and display assembly. a Using a flat–blade screwdriver, gently pry up on the circuit board tab (shown below) and slide the board to disengage from the tabs. Lift the keypad and display assembly from the plastic housing. b The rubber keypad can now be pulled from the plastic housing.
L4411A Mechanical Disassembly For procedures in this manual, the following tools are required for disassembly: • T10 Torx driver (most disassembly) • T20 Torx driver (power supply removal) • Flat–blade screwdriver The following tools may also be needed if further disassembly is required. • 9/32” nut driver (rear–panel GPIB connector) WARNING SHOCK HAZARD. Only service–trained personnel who are aware of the hazards involved should remove the instrument covers.
3 Remove the power switch push rod. Gently move the power switch push rod toward the front of the instrument to disengage it from the switch. Be careful not to twist or bend the push rod. You will need to rotate the push rod to guide it out through the front panel. 4 Remove the Display Cable from the main circuit board. Release the cable connector key from the main circuit board.
5 Remove the power supply safety shield. Remove the Torx screw holding the safety shield and lift out the shield. The display cable can be flexed out of the way without removing the cable from the display assembly. 6 Remove the power supply input and output cables.
7 Remove the Power Supply Assembly. Remove the four Torx screws holding the power supply assembly in place and lift out the power supply.
8 Remove the Display Assembly. Remove the two Torx screws holding the display bracket to the front panel and lift the display assembly and bracket up and out of the instrument.
Replaceable Parts This section contains information for ordering replacement parts for your instrument. The parts lists are divided into the following sections. Parts are listed in alphanumeric order according to their reference designators. The parts lists include a brief description of each part with applicable Agilent part number. To Order Replaceable Parts You can order replaceable parts from Agilent using the Agilent part number.
Parts List 34410A/11A

Agilent Part Number   Description
2110-0817             Line Fuse
2110-0780             Current Fuse
33220-88304           Bezel Rear
34401-45012           Latch-Terminal
34401-45021           Handle
34401-86013           Safety-Cover
34401-86020           Kit Bumper
34410-00602           Shield-Bottom
34410-00603           Shield-Top
34410-00611           Shield-ESD, VFD
34410-40201           Panel, Front
34410-43711           Pushrod-Power
34410-43712           Pushrod-Rear Terminals
34410-49321           Window 34410A
34411-49321           Window 34411A
Parts List L4411A

Agilent Part Number   Description
2110-0780             Current Fuse
L4411-61601           Cable, Display
L4411-04104           Power Supply Cover
L4411-04103           Front Panel
L4411-43701           Pushrod
E5810-00001           Display - LCD 36mm
E5810-00007           Bracket, Display
L4411-04102           Cover
Agilent 34410A/11A/L4411A 6½ Digit Multimeter Service Guide 5 Backdating This chapter contains information necessary to adapt this manual to instruments not directly covered by the current content. At this printing, however, the manual applies to all instruments. Therefore, no information is included in this chapter.