Hi all,
I want to use each tick of the GPT timer as 1 millisecond and also as 1 microsecond. As far as I know, if the GPT timer frequency is 1 MHz, each timer tick (GPT_GetCurrentTimerCount) is 1 µs. How can I set each increment of the timer count (from n to n+1) to 1 second or 1 millisecond? Following that logic, I would have to clock the GPT timer at 1 Hz (for a 1 second increment) or 1 kHz (for a 1 millisecond increment). But the prescaler does not allow PER_CLK_ROOT (62500000 Hz) to be divided down to 1 Hz or 1 kHz.
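For reference, here is a minimal sketch of the arithmetic involved (plain C, not an NXP API; the 4096 prescaler limit is the figure discussed later in this thread):

#include <stdio.h>
#include <stdint.h>

/* Figures taken from this thread: PER_CLK_ROOT = 62.5 MHz, maximum GPT
   prescaler value 4096. Purely illustrative arithmetic. */
#define PER_CLK_ROOT_HZ   62500000UL
#define GPT_PRESCALER_MAX 4096UL

/* Divider needed so that the counter increments at tick_hz. */
static uint32_t required_divider(uint32_t tick_hz)
{
    return PER_CLK_ROOT_HZ / tick_hz;
}

int main(void)
{
    const uint32_t rates[] = { 1000UL /* 1 ms tick */, 1UL /* 1 s tick */ };

    for (unsigned i = 0; i < sizeof(rates) / sizeof(rates[0]); i++)
    {
        uint32_t div = required_divider(rates[i]);
        printf("%lu Hz tick needs divider %lu -> %s the prescaler range\n",
               (unsigned long)rates[i], (unsigned long)div,
               (div <= GPT_PRESCALER_MAX) ? "within" : "exceeds");
    }
    return 0;
}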
Could you help me for the scenario please ?
Thanks and Regards.
Hi,
Maybe you can consider other timer modules to achieve your goal.
TIC
Hi,
Thank you for your interest in NXP Semiconductor products and for the opportunity to serve you.
To implement your design, the key point is to select an appropriate clock to feed the GPT timer. You can do this via the clock source selection and the prescaler, just as the figure below shows.
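For illustration, a minimal sketch of that clock selection with the MCUXpresso SDK fsl_gpt driver (the GPT2 instance, the divider value, and the 32.768 kHz figure are my assumptions, please verify them against your board's clock tree):

#include "fsl_gpt.h"

/* Sketch only: select the low-frequency clock source and a prescaler value
   through the gpt_config_t structure, then start the timer. */
void example_gpt_clock_setup(void)
{
    gpt_config_t config;

    GPT_GetDefaultConfig(&config);
    config.clockSource = kGPT_ClockSource_LowFreq; /* ipg_clk_32k instead of PER_CLK_ROOT */
    config.divider     = 32U;                      /* e.g. 32768 Hz / 32 = 1024 counts per second */

    GPT_Init(GPT2, &config);
    GPT_StartTimer(GPT2);

    /* The raw value is then read with GPT_GetCurrentTimerCount(GPT2). */
}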
Have a great day,
TIC
Hi @jeremyzhou,
Thank you for your support. I tried this method by using ipg_clk_32k via the kGPT_ClockSource_LowFreq enum, changing the line below in the default config:
config->clockSource = kGPT_ClockSource_LowFreq;//kGPT_ClockSource_Periph;
void GPT_GetDefaultConfig(gpt_config_t *config)
{
    assert(NULL != config);

    /* Initializes the configure structure to zero. */
    (void)memset(config, 0, sizeof(*config));

    config->clockSource     = kGPT_ClockSource_LowFreq; /* default was kGPT_ClockSource_Periph */
    config->divider         = 1U;
    config->enableRunInStop = true;
    config->enableRunInWait = true;
    config->enableRunInDoze = false;
    config->enableRunInDbg  = false;
    config->enableFreeRun   = false;
    config->enableMode      = true;
}
But that is not exactly what I want, because the lowest clock frequency that can be used is 32 kHz as far as I can see. When I divide this frequency by 4096 (the maximum prescaler value), the result is about 8 Hz, so that is the minimum counting frequency I can reach. But I need 1 Hz so that every count (read with the GPT_GetCurrentTimerCount function) equals 1 second, or 1 kHz so that every count equals 1 ms. Could you help me please?
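If a software conversion is acceptable, the raw count can also be scaled to milliseconds or seconds instead of forcing the hardware tick itself to 1 ms or 1 s. A minimal sketch, assuming the GPT runs from a 32768 Hz source with divider 1 (assumptions; the helper names are hypothetical):

#include <stdint.h>
#include "fsl_gpt.h"

/* Assumption: GPT clocked from the 32.768 kHz low-frequency source, divider 1. */
#define GPT_TICK_HZ 32768UL

/* Hypothetical helpers, not part of the SDK. */
static uint32_t gpt_count_to_ms(uint32_t count)
{
    return (uint32_t)(((uint64_t)count * 1000ULL) / GPT_TICK_HZ);
}

static uint32_t gpt_count_to_s(uint32_t count)
{
    return count / GPT_TICK_HZ;
}

/* Usage sketch (GPT2 is an assumption):
   uint32_t ms_now = gpt_count_to_ms(GPT_GetCurrentTimerCount(GPT2)); */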
Thanks and Regards.
Hi,
In my opinion, you could consider other timer modules to make each count of the timer equal 1 second.
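As an illustration of that suggestion (generic C, independent of which timer module is picked; the ISR name is hypothetical): let any timer raise a periodic 1 ms interrupt and keep time in software.

#include <stdint.h>

static volatile uint32_t g_ms_ticks; /* incremented once per millisecond */

/* Hypothetical handler: hook it to the 1 ms periodic interrupt of whichever
   timer module you choose. */
void Timer1msHandler(void)
{
    g_ms_ticks++;
}

uint32_t millis(void)
{
    return g_ms_ticks;
}

uint32_t seconds(void)
{
    return g_ms_ticks / 1000U;
}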
TIC
Hi @jeremyzhou,
This message is just a reminder of my question above. I know you have many issues to handle.
Thanks and Regards.