User guide

Focusing on a Single Device
For a Series 3100 device with a 10 MHz clock, the expected duration of a
millisecond timer is:

E = 0.8192 * floor((D/0.82) + 1)

where D is the specified duration for the timer. For example, for a timeout
of 100 ms, E equals 99.94 ms.
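As a sanity check, the formula above can be evaluated directly. This is a minimal Python sketch; the function name is illustrative and not part of any Neuron firmware API:

```python
import math

def expected_duration_ms(d):
    """Expected duration E of a millisecond timer on a Series 3100
    device with a 10 MHz clock, for a specified duration d in ms."""
    return 0.8192 * math.floor((d / 0.82) + 1)

# For a timeout of 100 ms, E is about 99.94 ms, as in the text.
print(round(expected_duration_ms(100), 2))  # → 99.94
```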
For a Series 3100 device with a 10 MHz clock, the low duration is:

L = E – 12 ms

and the high duration is:

H = E + 12 ms
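Putting the expected duration and the worst-case bounds together, a hedged sketch (names are illustrative) for a Series 3100 device with a 10 MHz clock:

```python
import math

def timer_bounds_ms(d):
    """Low and high worst-case durations (L, H) for a millisecond timer
    on a Series 3100 device with a 10 MHz clock, for duration d in ms."""
    e = 0.8192 * math.floor((d / 0.82) + 1)  # expected duration E
    return e - 12, e + 12                    # L and H per the text

low, high = timer_bounds_ms(100)
print(round(low, 2), round(high, 2))  # → 87.94 111.94
```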
With Other Clock Speeds
The following formulas allow you to calculate accuracy for millisecond timers
when other input clock rates are selected. In these formulas, S depends on the
input clock rate or the system clock rate, as shown in Table 5.
Table 5. Determining S

S        Input Clock Rate    System Clock Rate
         (Series 3100)       (Series 5000)
0.063    —                   80 MHz
0.125    —                   40 MHz
0.25     40 MHz              20 MHz
0.5      20 MHz              10 MHz
1        10 MHz              5 MHz
1.5259   6.5536 MHz          —
2        5 MHz               —
4        2.5 MHz             —
8        1.25 MHz            —
16       625 kHz             —
E = 0.8192 * floor((floor(D/S) * S)/0.82 + 1)
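The general formula can be checked against the 10 MHz case (where S = 1) and against a slower clock. A minimal Python sketch, with S taken from Table 5 and an illustrative function name:

```python
import math

def expected_duration_ms(d, s):
    """Expected duration E of a millisecond timer of specified duration
    d (in ms), where s is the scaling factor from Table 5."""
    return 0.8192 * math.floor((math.floor(d / s) * s) / 0.82 + 1)

# With S = 1 (10 MHz input clock on a Series 3100 device), the result
# matches the earlier single-device formula: about 99.94 ms for d = 100.
print(round(expected_duration_ms(100, 1), 2))   # → 99.94

# At S = 16 (625 kHz input clock), the granularity is coarser: d is
# first truncated to a multiple of 16 ms, so the timer expires early.
print(round(expected_duration_ms(100, 16), 2))  # → 96.67
```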
Two factors determine E. The first is that the slower the input clock speed,
the less granular the input clock. For example, at 1/16 speed, the millisecond
granularity is 16 milliseconds (one clock tick every 16 milliseconds). The
second factor is that the hardware generates 819.2 microsecond ticks that the
software