Timing difference between BDM debug vs non-debug


wxd
Contributor I

Hi

 

I have a function that holds a line low or high for a specific amount of time, call it holdLow(int t).  The function's timing is based on an interrupt that occurs every millisecond; when the count reaches the specified time, it releases the line and returns.  Pretty simple.
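
Something along these lines, as a minimal sketch assuming a free-running millisecond counter maintained by the timer interrupt (msTicks, setPinLow() and setPinHigh() are placeholder names, not my actual code):

#include <stdint.h>

/* Free-running millisecond count, incremented by the 1 ms timer interrupt.
   Declared volatile because it changes outside normal program flow. */
static volatile uint32_t msTicks;

/* Called from the 1 ms timer interrupt handler (handler installation and
   interrupt-flag acknowledgement are toolchain/part specific and omitted). */
void timerTick(void)
{
    msTicks++;
}

/* Placeholder pin helpers - the real code writes a GPIO data register. */
static void setPinLow(void)  { /* drive the line low */ }
static void setPinHigh(void) { /* release the line   */ }

/* Hold the line low for t milliseconds, then release it and return. */
void holdLow(int t)
{
    uint32_t start = msTicks;

    setPinLow();
    while ((msTicks - start) < (uint32_t)t)
    {
        /* busy-wait until t milliseconds have elapsed */
    }
    setPinHigh();
}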

 

I'm running an MCF52xx processor with a ColdFire USB Multilink BDM.  When I am in BDM/debug mode, with no breakpoints or anything, the program runs fine.  If I tell it to set the pin low for 50 ms, it does so consistently.

 

However, when I run the program without debug, the timing is not consistent at all.  If I set it low for 50 ms, the pulse can be off by up to +/- 3 ms: sometimes 49 ms, sometimes 53 ms, and so on.

 

There is no code optimization in either build.  The non-debug version is loaded into flash from an S19 file.

 

Any ideas?


peg
Senior Contributor IV

Hello and welcome to the fora, wxd.

 

Problems like this are often caused by things like the COP (watchdog) being left unhandled in your code. In debug mode the COP is usually disabled. A programme that can get through its cycle within the timeout period can appear to run correctly; it is simply "interrupted" for a while each time the watchdog resets it, which would explain the timing jitter you are seeing.
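
If the COP does turn out to be the cause, the usual fixes are to disable it at start-up or to service it regularly from the main loop. On many MCF52xx derivatives the core watchdog lives in the System Control Module as CWCR/CWSR and is serviced by writing 0x55 then 0xAA to CWSR, but do check the reference manual for your exact part. A rough sketch only; the MCF_SCM_CWCR / MCF_SCM_CWSR names follow the usual CodeWarrior header naming and are assumptions here:

#include <stdint.h>

/* Watchdog registers - the real definitions come from the derivative's
   support header (names and addresses assumed, check your part). */
extern volatile uint8_t MCF_SCM_CWCR;   /* Core Watchdog Control Register */
extern volatile uint8_t MCF_SCM_CWSR;   /* Core Watchdog Service Register */

/* Option 1: turn the core watchdog off early in the start-up code. */
void copDisable(void)
{
    MCF_SCM_CWCR = 0x00;        /* enable bit cleared -> watchdog off */
}

/* Option 2: leave it enabled and service it from the main loop, more
   often than the configured timeout period. */
void copService(void)
{
    MCF_SCM_CWSR = 0x55;        /* first half of the service sequence */
    MCF_SCM_CWSR = 0xAA;        /* second half restarts the watchdog  */
}

Servicing is generally the better habit, since the COP is there to catch a hung program.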

 


J2MEJediMaster
Specialist I

For the debug version, is the program built to execute out of RAM or flash? Can you make an S-record file out of it and try loading it the same way as the non-debug version?

 

---Tom
