I am using an LPC1850 running at 180 MHz, clocked (via the PLL) from the internal oscillator. According to the datasheet, the internal oscillator has a frequency tolerance of 1.5%.
I have the system tick timer configured to interrupt at 1 ms intervals, and I sometimes use those interrupts to measure elapsed time. Because of the oscillator tolerance, the absolute error of those measurements grows with the duration being measured.
I also have a 32 kHz crystal that drives the internal RTC. The crystal has a frequency tolerance of 20 ppm, which is far tighter than the internal oscillator's. I realize that for longer periods I could simply use the RTC to measure elapsed time. But what about using the 32 kHz crystal to make the system tick timer itself more accurate?
Here's what I'm thinking:
- The MCU has a Downcounter register that counts down at a 1024 Hz rate, driven by the 32 kHz oscillator.
- The MCU also has the SysTick counter, which counts down at the system clock rate, 180 MHz in my case.
- In a 250 ms period, I know that the Downcounter should decrement by 256 ticks, while the SysTick counter should decrement by 45,000,000 ticks.
- With some careful coding to maximize accuracy, I can capture the starting SysTick value, wait for the Downcounter to drop by 256, and then capture the ending SysTick value.
- Now I know how many SysTick counts actually occurred during the 250 ms period, and I can compare that against the expected 45,000,000.
- Using the difference, I can adjust the SysTick reload value so the timer interrupts at more accurate 1 ms intervals.
Aside from being an interesting exercise, is this ever done in practice? Is it a terrible idea?