User's Guide
Remarks
The value of MIN is measurement dependent. It depends on the integration
time (resolution), whether autozero is on, whether autorange is on, and
the measurement range. MIN is automatically determined so that the
sample interval is always greater than the sampling time (see the
illustration above). However, the MIN value is not enforced; you can
specify an <interval> less than MIN (even zero). Execute SAMPle:TIMer?
MIN to determine the current value of MIN. See the additional comments
under "Return Format" below.
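As a sketch of how MIN might be queried programmatically (the pyvisa-style `query` callable is an assumption, not from this guide; the response is plain ASCII and parses with `float()`):

```python
def read_min_sample_interval(query):
    """Return the current MIN sample interval in seconds.

    `query` is any callable that sends a SCPI query string and returns
    the instrument's ASCII response, e.g. a pyvisa instrument's .query
    method. The value of MIN changes with resolution, autozero,
    autorange, and range, as noted above.
    """
    # The instrument answers with a number in scientific notation.
    return float(query("SAMPle:TIMer? MIN"))
```

With pyvisa this would be called as `read_min_sample_interval(inst.query)`.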
Due to internal quantization, the actual interval that you set may be
slightly different from the value you specified. The increment is approximately
20 µs. Use the query command to determine the exact interval that is set.
For example, if you send "SAMP:TIM 500 ms", and then send the query
"SAMP:TIM?" the actual interval is returned (for example:
"+5.00000753E-01").
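Using the numbers quoted above, the returned string parses directly with `float()`, and the difference from the requested 500 ms stays well within one ~20 µs quantization step:

```python
requested = 0.5                    # sent as "SAMP:TIM 500 ms"
actual = float("+5.00000753E-01")  # response to "SAMP:TIM?"
delta = abs(actual - requested)
# delta is about 0.75 us, well under the ~20 us quantization increment.
```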
After setting the sample count, source, and delay time, you must place the
meter in the "wait-for-trigger" state using the INITiate or READ?
command. A trigger will not be accepted from the selected trigger source
(see TRIGger:SOURce command) until the instrument is in the "wait-for-
trigger" state.
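The arming sequence described above might be scripted as follows. This is a sketch: the helper name and configuration values are illustrative, and the SAMPle:COUNt and TRIGger:DELay spellings are assumed from the "sample count ... and delay time" wording; only SAMPle:TIMer, TRIGger:SOURce, and INITiate are named in this guide.

```python
def arm_sequence(interval_s, count, source="EXT", delay_s=0):
    """Build the SCPI commands that configure sampling and arm the trigger.

    The instrument accepts a trigger from `source` only after INITiate
    places it in the wait-for-trigger state.
    """
    return [
        f"SAMPle:TIMer {interval_s}",  # sample interval in seconds
        f"SAMPle:COUNt {count}",       # samples per trigger (assumed spelling)
        f"TRIGger:SOURce {source}",    # selected trigger source
        f"TRIGger:DELay {delay_s}",    # trigger delay (assumed spelling)
        "INITiate",                    # enter the wait-for-trigger state
    ]
```

Each string would be sent with something like `inst.write(cmd)`; READ? could replace INITiate when the readings are to be fetched immediately.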
The interval may be set to any value from MIN to 3600 seconds. However,
the value will be rounded to the nearest step. For dc measurements, the
step size is approximately 20 µs; for ac measurements, the step size
depends on the ac bandwidth.
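The limits and rounding above can be sketched numerically. The round-to-nearest-step behavior and the clamping to the MIN-to-3600 s range are assumptions about how the limits combine, and the ~20 µs step applies only to dc measurements:

```python
def quantized_interval(requested_s, min_s, step_s=20e-6):
    """Approximate the interval the instrument will actually use.

    `min_s` is the measurement-dependent MIN value (query SAMP:TIM? MIN);
    `step_s` is the dc step size of roughly 20 us.
    """
    clamped = min(max(requested_s, min_s), 3600.0)  # MIN..3600 s limits
    return round(clamped / step_s) * step_s         # nearest step
```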
The instrument sets the sample timer to 1 second after a Factory Reset
(*RST command) or an Instrument Preset (SYSTem:PRESet command).
It is recommended that all triggered measurements be made using an
appropriate fixed manual range. That is, turn autorange off
(SENSe:<function>:RANGe:AUTO OFF), or set a fixed range using the
SENSe:<function>:RANGe, CONFigure, or MEASure command.
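A fixed-range setup following this recommendation might look like the sketch below; the dc-voltage function and the 10 V range are illustrative choices, and only the SENSe:<function>:RANGe command forms named above are assumed.

```python
def fixed_range_setup(function="VOLTage:DC", range_value=10):
    """SCPI commands that disable autorange and pin a fixed manual range."""
    return [
        f"SENSe:{function}:RANGe:AUTO OFF",      # turn autorange off
        f"SENSe:{function}:RANGe {range_value}",  # fixed manual range
    ]
```

Alternatively, a single CONFigure or MEASure command with an explicit range parameter accomplishes the same thing, as the text notes.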