GPT microseconds counter 3x

1,130 Views
neo2
Contributor III

Dear NXP,

I have found that to get the correct time in microseconds (us), I need to multiply the return value of GPT_GetCurrentTimerCount() by 3.

Is this expected?

Example: evkmimxrt1160_gpt_timer_cm7

The example creates a 1-second interrupt.

If I measure the difference between each interrupt:

uint32_t us_count1(void)
{
    return GPT_GetCurrentTimerCount(GPT1);
}

gpt1_previous = gpt1_current;
gpt1_current = 3 * us_count1(); /* <---- HERE multiply by 3 */
gpt1_difference = gpt1_current - gpt1_previous;

PRINTF("GPT1 %u\r\n", gpt1_difference);

Then I get
GPT1 981396

981,396 us ≈ 0.98 seconds

Just wanted to know if this is documented somewhere?

Regards

1 Solution
1,070 Views
neo2
Contributor III

Hi @kerryzhou, I solved this issue and want to share what I found.

Original code: GPT_SetClockSource(GPT1, kGPT_ClockSource_Osc);

It was not obvious what speed the OSC clock was running at, which is why the 3x factor confused me. It seems this was a red herring: I had made the wrong assumption about the OSC clock frequency.


Here is a more robust solution:

I changed the clock source to GPT_SetClockSource(GPT1, kGPT_ClockSource_Periph);
Some documents say this clock is 150MHz, but I found something different.

At least on the RT1160 board I have, this clock source appears to actually be 24MHz.

So I divided by 24:
GPT_SetClockDivider(GPT1, 24);
which gives 1MHz, i.e. a 1us tick.

I was able to confirm this because I set up GPT2 to generate an interrupt at 1-second intervals.
I could see the log output on the terminal correlated to 1 second.

Then I used the GPT1 counter (GPT_GetCurrentTimerCount(GPT1)) to measure the elapsed time between the previous interrupt and the current one.

The number of GPT1 ticks at 1us each was, as expected, approximately 1,000,000 = 1,000,000us = 1 second.

 

2 Replies
1,113 Views
kerryzhou
NXP TechSupport

Hi @neo2 ,

   3X is not a fixed factor; it is determined by your clock divider.

   To see why you need the factor of 3, please check the example code:

/* Divide GPT clock source frequency by 3 inside GPT module */
GPT_SetClockDivider(EXAMPLE_GPT, 3);

/* Get GPT clock frequency */
gptFreq = EXAMPLE_GPT_CLK_FREQ;

/* GPT frequency is divided by 3 inside module */
gptFreq /= 3;

[attached screenshot: kerryzhou_0-1636362864034.png]

 

So, since your register divides the clock by 3, your clock frequency needs to be divided by 3 as well.

Hope it helps you!

Best Regards,

Kerry
