Content originally posted in LPCWare by member_lpc11xx on Tue Mar 03 09:00:42 MST 2015
Hello everybody,
I'm working with an LPC1112FD20, using the internal RC oscillator and clocking the ARM core at 12 MHz. My program only has to configure the board and, in the main task, read an ADC channel and then set an output.
In previous projects I have always configured the system timer (SysTick) to run at 1 kHz, i.e. an interrupt every 1 ms. Now I need to set its frequency to 1 MHz, and here comes the problem: I'm unable to make it work at that rate. The application does everything correctly until it reaches the ARM-defined routine "SysTick_Config()": at that point my program stops running.
My first thought was that the ARM core clock is too slow, so I increased it to 48 MHz (keeping the IRC as clock source and enabling the PLL), near the maximum rate for LPC11xx microcontrollers (50 MHz). Result: no change, same problem; when calling "SysTick_Config()" the application stops.
So I tried another test, on the LPC-Link board with an LPC1114FBD48 mounted on it: same core (Cortex-M0) but more resources. Nothing changed.
I should specify that the only thing I do in the SysTick interrupt routine is toggle a pin, to generate an output square wave and measure its frequency with the scope.
The maximum SysTick frequency I can configure while keeping the application running is 400 kHz (an interrupt every 2.5 µs).
Now, my suspicion is that the core clock isn't fast enough to serve and execute an ISR every microsecond. Reading the article "A Beginner's Guide on Interrupt Latency - and Interrupt Latency of the ARM® Cortex®-M processors", I understood that entering and exiting an interrupt takes 32 clock cycles (16 + 16): so 32 / 48 000 000 ≈ 667 ns just to manage the ISR entry and exit.
Has anyone experienced the same issue, and/or can you confirm whether my reasoning about this "problem" is correct?
Thanks to anyone who helps me
regards
Andrea