I am using the evkmimxrt1064_gpt_timer example project. The counter settings are as follows:
GPT_SetClockDivider(EXAMPLE_GPT, 6); // PER clock, so 75 MHz / 6 = 12.5 MHz
GPT_SetOutputCompareValue(EXAMPLE_GPT, kGPT_OutputCompare_Channel1, 125); // 125 counts, so a 100 kHz interrupt rate (10 us period)
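For completeness, the rest of the setup is the example project's default initialization. Roughly, it looks like this (a sketch based on the SDK's fsl_gpt driver; EXAMPLE_GPT_IRQn stands in for whichever IRQ number the project maps to EXAMPLE_GPT):

#include "fsl_gpt.h"

static void ConfigureGpt(void)
{
    gpt_config_t gptConfig;

    GPT_GetDefaultConfig(&gptConfig); /* peripheral (PER) clock source */
    GPT_Init(EXAMPLE_GPT, &gptConfig);

    GPT_SetClockDivider(EXAMPLE_GPT, 6);  /* 75 MHz / 6 = 12.5 MHz */
    GPT_SetOutputCompareValue(EXAMPLE_GPT, kGPT_OutputCompare_Channel1, 125); /* 125 / 12.5 MHz = 10 us */

    GPT_EnableInterrupts(EXAMPLE_GPT, kGPT_OutputCompare1InterruptEnable);
    EnableIRQ(EXAMPLE_GPT_IRQn);
    GPT_StartTimer(EXAMPLE_GPT);
}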
In the interrupt routine I toggle a GPIO pin and try to measure the time between interrupts using the files in the attachment, which I found here on the NXP forum. The interrupt routine looks like this:
void EXAMPLE_GPT_IRQHandler(void)
{
    /* Measure the time since the previous interrupt. */
    result = us_count() - start_time;
    start_time = us_count();

    /* Clear the interrupt flag. */
    GPT_ClearStatusFlags(EXAMPLE_GPT, kGPT_OutputCompare1Flag);

    /* Toggle the pin so the interval can also be checked on a scope. */
    GPIO_PortToggle(EXAMPLE_LED_GPIO, 1u << EXAMPLE_LED_GPIO_PIN);
    gptIsrFlag = true;

    /* Barrier used by the SDK examples on Cortex-M4/M7 exception return. */
#if defined __CORTEX_M && (__CORTEX_M == 4U || __CORTEX_M == 7U)
    __DSB();
#endif
}
The issue is that result fluctuates between 5 and 30 microseconds when I inspect it at a breakpoint on GPIO_PortToggle. However, looking at the actual hardware signal on a scope, I can clearly see the pin being toggled every 10 microseconds.
My question is: what causes the incorrect values to appear in the result variable in software?
P.S. I realized that checking this with an edge trigger is not exhaustive, so I also checked it in pulse trigger mode, and it was always within the range of 9.6 to 10.5 microseconds.
Best regards,
CarrotMan
Hello
Hope you are well.
This issue might be caused by the code inside the IRQ handler. I suggest you leave only the code that clears/sets the flags inside the handler.
To calculate the time, you can use a callback function so that the result is processed outside the handler, along the lines of the sketch below.
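For example, something like this minimal split (a sketch only; us_count(), gptIsrFlag, result and start_time follow the names in your post, while isrTimestamp and ProcessTiming() are illustrative):

#include <stdbool.h>
#include <stdint.h>
#include "fsl_gpt.h"

extern uint32_t us_count(void); /* from the attached timing files */

volatile uint32_t isrTimestamp;
volatile bool gptIsrFlag = false;
uint32_t result, start_time;

void EXAMPLE_GPT_IRQHandler(void)
{
    /* Keep the handler minimal: clear the flag, capture a raw timestamp. */
    GPT_ClearStatusFlags(EXAMPLE_GPT, kGPT_OutputCompare1Flag);
    isrTimestamp = us_count();
    gptIsrFlag = true;
#if defined __CORTEX_M && (__CORTEX_M == 4U || __CORTEX_M == 7U)
    __DSB();
#endif
}

/* Call this from the main loop, e.g. while (1) { ProcessTiming(); }
 * The subtraction and any logging happen here, outside interrupt context. */
void ProcessTiming(void)
{
    if (gptIsrFlag)
    {
        gptIsrFlag = false;
        result = isrTimestamp - start_time;
        start_time = isrTimestamp;
    }
}

Note that the main loop has to keep up with the 10 us interrupt rate for every interval to be captured; if it occasionally falls behind, a computed result will span more than one period.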
If you have more questions, do not hesitate to ask me.
Best regards,
Omar