Hi All,
Does anybody have an example of a simple delay/sleep function in milliseconds or microseconds using the GPT timer?
I don't know how to calculate the time using the timer. Could you help me, please?
Just like below:
Delay(1000); // 1-second delay
Sleep(1000); // 1-second sleep
Thanks and Regards.
I apologize for my delayed reply.
1. No, the delay time is based on the period of the GPT timer. If the GPT clock is 1 MHz, then each tick equals 1 µs.
2. The CNT register contains the current value of the main counter, and the output compare value determines when a compare event is generated on the corresponding output compare channel. When the CNT value equals the compare value, a compare event is generated. The timer counts based on the clock source of the module.
3. You should use the clock source frequency of the module.
If you have more questions do not hesitate to ask me.
Best regards,
Omar
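For illustration, a minimal sketch of a busy-wait delay built directly on the GPT counter, following the points above. Assumptions not stated in the thread: the NXP SDK fsl_gpt driver is used, the GPT is already initialized and running in free-run mode, and it is clocked at 1 MHz so that one tick equals 1 µs.
#include <stdint.h>
#include "fsl_gpt.h"   /* NXP SDK GPT driver: provides GPT_Type, so base->CNT is available */
#define GPT_CLOCK_HZ 1000000U   /* assumed GPT clock: 1 MHz, so 1 tick = 1 us */
/* Busy-wait for the given number of microseconds by polling the main counter. */
static void Delay_us(GPT_Type *base, uint32_t us)
{
    uint32_t start = base->CNT;                         /* counter value at entry */
    uint32_t ticks = us * (GPT_CLOCK_HZ / 1000000U);    /* ticks = delay_time * clock frequency */
    while ((uint32_t)(base->CNT - start) < ticks)       /* unsigned math tolerates a counter wrap */
    {
        /* busy-wait */
    }
}
/* Millisecond wrapper, so Delay_ms(GPT1, 1000) gives roughly a 1-second delay. */
static void Delay_ms(GPT_Type *base, uint32_t ms)
{
    Delay_us(base, ms * 1000U);
}
In free-run mode the counter wraps at 0xFFFFFFFF, so the unsigned subtraction above stays correct across a wrap; in restart mode the counter resets on a channel 1 compare event and this simple subtraction would not hold.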
Hello
Hope you are well.
We don't have a specific example, but I have some suggestions for achieving this. Based on the frequency of the timer, you can calculate when the interrupt should occur.
If you want a 1-second delay and the GPT frequency is 10 MHz, you should set the output compare value to 10,000,000 ticks.
This equation might help: output compare value (ticks) = delay_time (s) × GPT clock frequency (Hz).
You can use a polling loop that exits when the GPT compare interrupt occurs. The SDK has an example that can be used as a starting point for this.
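As a sketch of that polling approach (assuming the NXP SDK fsl_gpt driver, output compare channel 1, and a GPT that is already initialized, started, and left in free-run mode; none of this setup is shown in the thread):
#include <stdint.h>
#include "fsl_gpt.h"   /* NXP SDK GPT driver */
/* Block for 'ticks' GPT ticks, where one tick = 1 / (GPT clock frequency). */
static void GPT_DelayTicks(GPT_Type *base, uint32_t ticks)
{
    /* Arm output compare channel 1 relative to the current counter value. */
    uint32_t target = GPT_GetCurrentTimerCount(base) + ticks;
    GPT_ClearStatusFlags(base, kGPT_OutputCompare1Flag);
    GPT_SetOutputCompareValue(base, kGPT_OutputCompare_Channel1, target);
    /* Poll until the compare event flag is set, then clear it. */
    while (0U == GPT_GetStatusFlags(base, kGPT_OutputCompare1Flag))
    {
        /* busy-wait */
    }
    GPT_ClearStatusFlags(base, kGPT_OutputCompare1Flag);
}
With the 10 MHz example above, a 1-second delay would be GPT_DelayTicks(GPT1, 10000000U), where GPT1 is the assumed timer instance.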
Let me know if this is helpful, if you have more questions do not hesitate to ask me.
Best regards,
Omar
Hi Dear @Omar_Anguiano,
That is exactly what I am trying to do. I watched some tutorials on STM, but I did not know that the unit of the output compare value is ticks. I was just looking for the interrupt method that the system calls each time a tick occurs. I will try your suggestions.
EDIT:
Dear @Omar_Anguiano Hi again,
I have three more questions.
Q1: Is delay_time in the equation from your answer in seconds?
Q2: What is the relationship between the base->CNT increment and the output compare value? In other words, how much time passes when base->CNT goes from n to n+1, for example from 0 to 1?
Q3: When should I use the base->CNT value if I can obtain the delay with your equation?
Thanks and Regards.