
Delay on SysTick fails at any optimization level

Question asked by Brian Smith on Jun 18, 2019
Latest reply on Jun 18, 2019 by Mark Butcher

Hello,

 

I've created a simple delay for my project using SysTick, since it isn't used for anything else, but once I start raising the optimization level it stops working.

 

void delay_us(uint32_t us)
{
        unsigned long actualSysTick;
        unsigned long desiredSysTick;
        unsigned long us_delay = us;

        (void)SYSTICK_CSR;                            /* reading CSR clears COUNTFLAG */

        desiredSysTick = SYSTICK_CURRENT - CORE_1US;  /* target one microsecond ahead */

        do
        {
                /* SysTick counts down, so wait until the counter passes the target */
                while ((actualSysTick = SYSTICK_CURRENT) > desiredSysTick)
                {
                        if (SYSTICK_CSR & SYSTICK_COUNTFLAG)  /* counter wrapped past 0 */
                        {
                                (void)SYSTICK_CSR;    /* discard; COUNTFLAG clears on read */
                                break;
                        }
                }

                desiredSysTick = actualSysTick - CORE_1US;    /* next microsecond target */

        } while (--us_delay);
}
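
For context, SysTick is configured once at startup as a free-running down-counter clocked from the core. Roughly like this (a sketch; SYSTICK_RELOAD is my name for the reload register at 0xE000E014, and CORE_1US is the number of core clocks per microsecond in my project):

void delay_init(void)
{
        /* SYSTICK_RELOAD: assumed macro for SYST_RVR (0xE000E014) */
        SYSTICK_RELOAD  = 0x00FFFFFF;   /* maximum 24-bit reload value  */
        SYSTICK_CURRENT = 0;            /* any write clears the counter */
        SYSTICK_CSR     = (1UL << 2)    /* CLKSOURCE = processor clock  */
                        | (1UL << 0);   /* ENABLE = start the counter   */
}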

 

I noticed while debugging that if I set O1 optimization or higher, my program gets stuck in the do-while: when us_delay hits 0, instead of leaving the loop it jumps back to the us value and just sits there forever. I once heard that it's best to program at the desired optimization level from the start to avoid future problems, but it seems like the compiler is doing some extreme optimization here and I can't even spot what's wrong.
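
Could the problem be that my register macros are not volatile-qualified? From what I understand, without volatile the optimizer is allowed to read SYSTICK_CURRENT once and spin on the stale copy, and it can drop the (void)SYSTICK_CSR reads entirely. A sketch of what I mean, using the standard Cortex-M SysTick addresses (the macro names match my code above):

#include <stdint.h>

/* volatile forces a fresh hardware read on every access, so the
   optimizer cannot hoist the load out of the polling loops */
#define SYSTICK_CSR        (*(volatile uint32_t *)0xE000E010)  /* SYST_CSR */
#define SYSTICK_RELOAD     (*(volatile uint32_t *)0xE000E014)  /* SYST_RVR */
#define SYSTICK_CURRENT    (*(volatile uint32_t *)0xE000E018)  /* SYST_CVR */
#define SYSTICK_COUNTFLAG  (1UL << 16)                         /* CSR bit: counted to 0 */

Reading SYST_CSR also clears COUNTFLAG in hardware, so the (void)SYSTICK_CSR reads only actually do anything if the access is volatile.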
