Content originally posted in LPCWare by MarcVonWindscooting on Wed Jan 08 16:04:11 MST 2014
Hi Theodore,
I understand your calculation. Yet you are stuck with the theoretical (?) ±40% WDOSC (in)accuracy.
You don't have to put up with that. Assume the 12 MHz IRC instead:
Maximum system clock divider SYSAHBCLKDIV = 255 => 12 MHz / 255 = 47.059 kHz
Maximum 16-bit prescaler PR = 65535 => the TC ticks every PR+1 = 65536 cycles => 47.059 kHz / 65536 ≈ 0.718 Hz
1 day × 0.718 Hz ≈ 62040.4 ticks (<= 65535, so it still fits a 16-bit match register)
Rounding that to 62040 ticks gives a calculation error < 9 ppm.
That means you may use the IRC (accuracy ±1.5%) and still be able to use the 16-bit timer for a precise 1-day
interrupt by selecting SYSAHBCLKDIV=255, PR=65535, MR0=62040-1.
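For illustration, here is a minimal sketch of that configuration. It assumes an LPC11xx part with the register and IRQ names from the CMSIS LPC11xx.h header and its standard startup file, and that the core is already running from the 12 MHz IRC (the reset default) - adapt to your actual part:

#include "LPC11xx.h"                       // CMSIS device header (assumed part)

volatile uint32_t days;                    // incremented once per day

void start_one_day_timer(void) {
    LPC_SYSCON->SYSAHBCLKDIV   = 255;      // system clock = 12 MHz IRC / 255 = 47.059 kHz
    LPC_SYSCON->SYSAHBCLKCTRL |= (1 << 7); // gate the clock to CT16B0
    LPC_TMR16B0->PR  = 65535;              // TC ticks every PR+1 = 65536 cycles ~ 0.718 Hz
    LPC_TMR16B0->MR0 = 62040 - 1;          // period = 62040 ticks ~ 86399.4 s (< 9 ppm off)
    LPC_TMR16B0->MCR = 3;                  // interrupt + reset TC on MR0 match
    LPC_TMR16B0->TCR = 1;                  // start counting
    NVIC_EnableIRQ(TIMER_16_0_IRQn);
}

void TIMER16_0_IRQHandler(void) {
    LPC_TMR16B0->IR = 1;                   // acknowledge the MR0 match
    days++;                                // one day (give or take ~7 ppm) has elapsed
}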
However, now I can see there is a problem with using the 16-bit timer: it limits the system clock to something around 50 kHz at most. That may be advantageous in terms of power consumption, but on the other hand it limits your peak computing power. The clock is easily increased by setting SYSAHBCLKDIV=1, but that in turn corrupts the 16-bit timer's day count within a few seconds unless the timer is stopped first.
If you used a 32-bit timer instead, you would be free to choose the system clock anywhere between 47 kHz and full speed.
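For comparison, a sketch of that 32-bit variant (same assumed LPC11xx.h names as above): with a 32-bit prescaler you can divide 12 MHz down to exactly 1 Hz, so one day becomes an exact match value and SYSAHBCLKDIV can stay at 1:

    LPC_SYSCON->SYSAHBCLKCTRL |= (1 << 9); // gate the clock to CT32B0
    LPC_TMR32B0->PR  = 12000000UL - 1;     // TC ticks every 12e6 cycles = exactly 1 Hz
    LPC_TMR32B0->MR0 = 86400UL - 1;        // match after 86400 s = 1 day, no rounding error
    LPC_TMR32B0->MCR = 3;                  // interrupt + reset TC on MR0 match
    LPC_TMR32B0->TCR = 1;                  // start counting
    NVIC_EnableIRQ(TIMER_32_0_IRQn);

The rounding error disappears completely; only the IRC tolerance itself remains.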
But honestly, alarms in the range of hours don't need a pure hardware solution, do they? A 32-bit software counter, incremented by an ISR every second, would provide more than enough range and precision. It allows you to run the core at full or reduced speed; still, you would prefer to sleep between the interrupts, lowering power consumption considerably - let's say - by 50%. And your peripherals are still active!
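A minimal sketch of that software variant, using SysTick for the 1 s beat (any timer would do; at a 12 MHz core clock the value 12,000,000 still fits SysTick's 24-bit reload):

#include "LPC11xx.h"                       // CMSIS header (assumed); provides SysTick_Config/__WFI

volatile uint32_t uptime_s;                // 32-bit seconds counter: range of ~136 years

void SysTick_Handler(void) {
    uptime_s++;                            // one interrupt per second
}

int main(void) {
    SysTick_Config(12000000UL);            // 12e6 core cycles = 1 s at 12 MHz
    uint32_t alarm = uptime_s + 86400UL;   // first alarm one day from now

    for (;;) {
        __WFI();                           // sleep until the next interrupt
        if ((int32_t)(uptime_s - alarm) >= 0) {
            // alarm due: do the work, then re-arm for the next day
            alarm += 86400UL;
        }
    }
}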
EDIT: I just calculated what ±1.5% means: 0.015 × 86400 s = 1296 s ≈ 21.6 minutes per day. Pretty much |(
Such a system would not be usable as an alarm clock at all for waking the programmer up in the morning to go to work. What a pity!