We have code that monitors whether our data is being processed fast enough, and asserts if it is not. Without breakpoints, the code runs fine. If I add a breakpoint in critical areas, the code asserts, even if it's a conditional breakpoint that never gets hit. This seems to indicate that adding breakpoints slows down execution of the code. Is this the case?
Hi Tim,
The behavior you are seeing with conditional breakpoints (the slowdown) is correct: even if the breakpoint is never hit (the condition always evaluates to FALSE), the condition still has to be evaluated every time the PC reaches that location. To do this, the debugger halts the core, checks the condition, and, if it is FALSE, resumes execution.
In this mode the IDE is not notified about the debugger's activity, so to the user it looks as if everything is running continuously.
If you want to confirm this, you can open the CCS console and enable logging (log v). You will see the traffic between the DSP and CW in which the conditional breakpoints are checked.
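Because each evaluation costs a full halt/resume round trip over the debug connection, a common workaround is to compile the condition into the code yourself and set an ordinary, unconditional breakpoint inside the if-body, so the core only stops when the condition is actually true. Here is a minimal C sketch of the idea; the names samples_pending, WATERMARK and bp_anchor are made up for the example and stand in for whatever expression you would otherwise type into the conditional-breakpoint dialog:

#include <stdint.h>

/* Hypothetical placeholders for the breakpoint condition. */
volatile uint32_t samples_pending;
#define WATERMARK 1024u

volatile uint32_t bp_anchor; /* a line to put the breakpoint on */

static void process_block(void)
{
    /* ... time-critical data processing ... */
}

int main(void)
{
    for (;;) {
        if (samples_pending > WATERMARK) {
            /* Set a plain (unconditional) breakpoint on the next
             * line. The debugger now halts the core only when the
             * condition is really true, instead of on every pass
             * just to evaluate it. */
            bp_anchor++;
        }
        process_block();
    }
}

The trade-off is that you have to rebuild to change the condition, but the timing impact while the breakpoint is not hit drops to a single compare-and-branch instead of a halt/resume cycle through the debugger.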