
Timing difference between BDM debug and non-debug

Discussion created by Will Shaw on Dec 16, 2009
Latest reply on Dec 17, 2009 by David Payne

Hi

 

I have a function that holds a line low or high for a specific amount of time; call it holdLow(int t).  Its timing is based on an interrupt that fires every millisecond.  When the count reaches the specified time, the function releases the line and returns.  Pretty simple.
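
Roughly, the structure looks like this (a simplified sketch, not the actual code; the tick variable and the pin macros are just placeholders):

#include <stdint.h>

/* Hardware-specific GPIO writes -- placeholders only; the real code
   writes the appropriate ColdFire port register. */
#define SET_PIN_LOW()   ((void)0)
#define SET_PIN_HIGH()  ((void)0)

static volatile uint32_t msTicks;   /* incremented once per 1 ms timer interrupt */

/* 1 ms periodic timer interrupt handler */
void timer_isr(void)
{
    /* ...clear/acknowledge the timer interrupt flag here... */
    msTicks++;
}

/* Drive the line low for t milliseconds, then release it and return. */
void holdLow(int t)
{
    uint32_t start = msTicks;

    SET_PIN_LOW();
    while ((msTicks - start) < (uint32_t)t)
    {
        /* busy-wait until the requested number of 1 ms ticks has elapsed */
    }
    SET_PIN_HIGH();
}

While the loop spins, the 1 ms interrupt keeps firing and advancing the counter; the unsigned subtraction keeps the comparison correct even if the counter wraps around.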

 

I'm running an MC52xx processor with a ColdFire USB Multilink BDM.  When I am in BDM/debug mode, with no breakpoints or anything, the program runs fine.  If I tell it to set the pin low for 50 ms, it does so consistently.

 

However, when I run the program without debug, the timing is not consistent at all.  If I set the pin low for 50 ms, the actual duration can be off by up to +/- 3 ms: sometimes 49 ms, sometimes 53 ms, etc.

 

There is no code optimization in either build.  The non-debug build is loaded into flash from an S19 file.

 

Any ideas?
