Timing difference between BDM debug vs non-debug


1,605 Views
wxd
Contributor I

Hi

 

I have a function that holds a line low or high for a specific amount of time; call it holdLow(int t).  Its timing is based on an interrupt that fires every millisecond.  When the elapsed time reaches the specified value, it releases the line and returns.  Pretty simple.
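
Roughly, the structure looks like this (a minimal sketch with illustrative names, not the actual code):

#include <stdint.h>

static volatile uint32_t g_msTicks;   /* incremented once per millisecond */

/* 1 ms periodic timer ISR (timer setup and flag acknowledge live elsewhere). */
void TimerTick_ISR(void)
{
    g_msTicks++;
}

/* Illustrative pin helpers, defined elsewhere. */
extern void pin_drive_low(void);
extern void pin_release(void);

/* Drive the line low for t milliseconds, then release it. */
void holdLow(int t)
{
    uint32_t start = g_msTicks;

    pin_drive_low();
    while ((uint32_t)(g_msTicks - start) < (uint32_t)t)
    {
        /* busy-wait; the ISR advances g_msTicks */
    }
    pin_release();
}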

 

I'm running an MC52xx processor with a ColdFire USB Multilink BDM.  When I'm in BDM/debug mode, with no breakpoints or anything, the program runs fine.  If I tell it to set the pin low for 50 ms, it does so consistently.

 

However, when I run the program without the debugger, the timing is not consistent at all.  If I set the pin low for 50 ms, the actual time is off by up to +/- 3 ms: sometimes 49 ms, sometimes 53 ms, etc.

 

There is no code optimization in either build.  The non-debug version is loaded into flash from an S19 file.

 

Any ideas?

2 Replies

866 Views
peg
Senior Contributor IV

Hello and welcome to the fora, wxd.

 

Problems like this are often caused by things like the COP (the watchdog) being left unhandled in your code. In debug mode the COP is usually disabled. Programmes that can fully cycle within the timeout period can still appear to run correctly; they are simply "interrupted" for a while during each reset.
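
For example, on many MCF52xx parts the core watchdog lives in the System Control Module. A rough sketch, assuming the MCF_SCM_CWCR / MCF_SCM_CWSR macro names from a typical Freescale header (the exact names, bits and service sequence vary by derivative, so check your reference manual): either disable the COP once at startup, or service it regularly from the main loop.

#include "derivative.h"   /* assumed: whatever header defines the MCF_SCM_* macros for your part */

/* Option 1: disable the core watchdog once at startup
   (on some derivatives the control register is write-once after reset). */
void cop_disable(void)
{
    MCF_SCM_CWCR = 0;             /* clear the watchdog enable bit */
}

/* Option 2: leave it enabled and service it regularly,
   well inside the timeout period. */
void cop_service(void)
{
    MCF_SCM_CWSR = 0x55;          /* first write of the service sequence */
    MCF_SCM_CWSR = 0xAA;          /* second write; restarts the timeout  */
}

Either way, the behaviour then no longer depends on whether the BDM pod happens to disable the COP for you.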

 


866 Views
J2MEJediMaster
Specialist I

For the debug version, is the program built to execute out of RAM or flash? Can you make an S-record file out of it and try loading it the same way as the non-debug version?

 

---Tom
