Chapter 8 Specifications
Interpreting Internal DMM Specifications
Resolution
Resolution is the numeric ratio of the maximum displayed value to the
minimum displayed value on a selected range. Resolution is often
expressed in percent, parts-per-million (ppm), counts, or bits. For
example, a 6½-digit multimeter with 20% overrange capability can
display a measurement with up to 1,200,000 counts of resolution. This
corresponds to about 0.0001% (1 ppm) of full scale, or 21 bits including
the sign bit. All four ways of expressing resolution are equivalent.
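As a quick check of these equivalences, the short Python sketch below converts the 1,200,000-count figure from the example into ppm, percent of full scale, and bits. The variable names are illustrative only and are not part of any instrument interface.

```python
import math

# Convert the 1,200,000-count example above into the other resolution units.
counts = 1_200_000

ppm_of_full_scale = 1e6 / counts        # ~0.83 ppm, i.e. "about 1 ppm"
percent_of_full_scale = 100.0 / counts  # ~0.0001 % of full scale
bits = math.log2(counts)                # ~20.2, i.e. roughly 21 bits

print(f"{ppm_of_full_scale:.2f} ppm, "
      f"{percent_of_full_scale:.6f} %, "
      f"{bits:.1f} bits")
```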
Accuracy
Accuracy is a measure of the “exactness” to which the internal DMM’s
measurement uncertainty can be determined relative to the calibration
reference used. Absolute accuracy includes the internal DMM’s relative
accuracy specification plus the known error of the calibration reference
relative to national standards (such as the U.S. National Institute of
Standards and Technology). To be meaningful, accuracy specifications
must be accompanied by the conditions under which they are valid.
These conditions should include temperature, humidity, and time.
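For example, the sketch below adds a hypothetical relative accuracy specification to a hypothetical calibration reference uncertainty to estimate absolute accuracy. The figures are made up for illustration and are not 34970A/34972A specifications.

```python
# Hypothetical figures, for illustration only (not 34970A/34972A specs).
# Absolute accuracy = DMM relative accuracy + calibration reference
# uncertainty relative to national standards, here both in ppm of reading.
relative_accuracy_ppm = 35.0       # assumed DMM relative accuracy spec
reference_uncertainty_ppm = 5.0    # assumed calibration reference uncertainty

absolute_accuracy_ppm = relative_accuracy_ppm + reference_uncertainty_ppm

reading_volts = 10.0               # example DC voltage reading
uncertainty_volts = reading_volts * absolute_accuracy_ppm / 1e6
print(f"±{absolute_accuracy_ppm:.0f} ppm of reading "
      f"= ±{uncertainty_volts * 1e6:.0f} µV at {reading_volts} V")
```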
There is no standard convention among instrument manufacturers for
the confidence limits at which specifications are set. The table below
shows the probability of non-conformance for each specification with the
given assumptions.
Variations in performance from reading to reading, and from instrument
to instrument, decrease as the number of sigma used to set a given
specification increases. In other words, the more conservative the
confidence limit, the greater the actual measurement precision you can
expect for the same published accuracy number. The 34970A/34972A is
designed and tested to perform better than mean ±3 sigma of its
published accuracy specifications.
Specification Criteria      Probability of Failure
Mean ±2 sigma               4.5%
Mean ±3 sigma               0.3%
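The failure probabilities in the table follow directly from the normal distribution. The sketch below reproduces them with Python's standard library, assuming normally distributed errors.

```python
import math

# Probability that a normally distributed error falls outside ±k sigma:
# P = 1 - erf(k / sqrt(2))
for k in (2, 3):
    p_outside = 1 - math.erf(k / math.sqrt(2))
    print(f"mean ±{k} sigma: {p_outside * 100:.2f} % probability of failure")
# Prints ~4.55 % and ~0.27 %, consistent with the 4.5 % and 0.3 % in the table.
```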