Hi @kerryzhou, I solved this issue and want to share what I found.
Original code: GPT_SetClockSource(GPT1, kGPT_ClockSource_Osc);
It was not obvious what speed the OSC clock was actually running at, which is why the 3x discrepancy confused me - it seems that was a red herring, or I made a wrong assumption about the OSC clock frequency.
Here is a more robust solution:
Then I changed this to GPT_SetClockSource(GPT1, kGPT_ClockSource_Periph);
Some documents say this clock is 150 MHz, but I found something different: at least on the RT1160 board I have, this clock source actually seems to run at 24 MHz.
So I divided by 24:
GPT_SetClockDivider(GPT1, 24);
This gives a 1 MHz count rate, i.e. 1 us per tick.
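For reference, here is a minimal sketch of that GPT1 setup. It assumes the MCUXpresso SDK fsl_gpt driver and assumes the peripheral clock feeding GPT1 really is 24 MHz on this board (my observation, I did not find this clearly confirmed in the docs):

```c
#include "fsl_gpt.h"

/* Configure GPT1 as a free-running 1 us tick counter.
 * Assumes the peripheral clock source is 24 MHz (observed on my RT1160 board). */
static void GPT1_Init1usTick(void)
{
    gpt_config_t config;

    GPT_GetDefaultConfig(&config);
    config.clockSource   = kGPT_ClockSource_Periph; /* peripheral clock instead of OSC */
    config.divider       = 24U;                     /* 24 MHz / 24 = 1 MHz -> 1 us per tick */
    config.enableFreeRun = true;                    /* free-running, so the counter can be used for timestamps */

    GPT_Init(GPT1, &config);
    GPT_StartTimer(GPT1);
}
```

Setting clockSource and divider through the config struct here is equivalent to calling GPT_SetClockSource() and GPT_SetClockDivider() after GPT_Init().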
I was able to confirm this by setting up GPT2 to generate an interrupt at 1 second intervals.
I could see that the log output on the terminal correlated to 1 second.
Then I used the GPT1 counter, e.g. GPT_GetCurrentTimerCount(GPT1), to measure the elapsed time between the previous interrupt and the current one.
Since GPT1 ticks at 1 us, the count was, as expected, approximately 1,000,000 ticks = 1,000,000 us = 1 second.
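In case it helps anyone else, here is roughly how that cross-check can be wired up, again assuming the SDK fsl_gpt driver. The GPT2 clock/divider values and the function names are just an illustration of the approach, not copied from my exact project:

```c
#include "fsl_gpt.h"
#include "fsl_debug_console.h"

static volatile uint32_t s_lastGpt1Count = 0U;

/* GPT2 fires once per second (restart mode on output compare channel 1).
 * Each time it fires, print how many GPT1 ticks elapsed since the last
 * interrupt; at 1 us per tick this should be roughly 1,000,000. */
void GPT2_IRQHandler(void)
{
    GPT_ClearStatusFlags(GPT2, kGPT_OutputCompare1Flag);

    uint32_t now = GPT_GetCurrentTimerCount(GPT1);
    PRINTF("GPT1 ticks since last interrupt: %u\r\n", (unsigned)(now - s_lastGpt1Count));
    s_lastGpt1Count = now;

    SDK_ISR_EXIT_BARRIER;
}

static void GPT2_Init1sInterrupt(void)
{
    gpt_config_t config;

    GPT_GetDefaultConfig(&config); /* default enableFreeRun = false -> counter restarts on compare 1 match */
    config.clockSource = kGPT_ClockSource_Periph; /* same 24 MHz source assumed as for GPT1 */
    config.divider     = 24U;                     /* 1 MHz -> 1 us per tick */

    GPT_Init(GPT2, &config);
    GPT_SetOutputCompareValue(GPT2, kGPT_OutputCompare_Channel1, 1000000U); /* 1,000,000 us = 1 s */
    GPT_EnableInterrupts(GPT2, kGPT_OutputCompare1InterruptEnable);
    EnableIRQ(GPT2_IRQn);
    GPT_StartTimer(GPT2);
}
```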