Hello NXP Team,
I’m working with an NXP MCU based on the Cortex-M33 core (e.g., i.MX93 mcimx9352), and I noticed that when I call SDK_DelayAtLeastUs(10U, SDK_DEVICE_MAXIMUM_CPU_CLOCK_FREQUENCY); to generate a 10 µs trigger pulse, the actual pulse width measured on the oscilloscope is around 16 µs.
For the second argument I’m using the default SDK definition of SDK_DEVICE_MAXIMUM_CPU_CLOCK_FREQUENCY, which, according to the datasheet, is the maximum clock frequency supported by the M33 core.
Is SDK_DEVICE_MAXIMUM_CPU_CLOCK_FREQUENCY supposed to match the actual core frequency at runtime, or is it just a max-safe default?
Is there a recommended way to dynamically query the actual core clock and pass it to SDK_DelayAtLeastUs() instead of relying on the macro?
Any clarification or best practices would be appreciated. Thanks in advance!
Best regards,
Manjunath B
Hi,
SDK_DEVICE_MAXIMUM_CPU_CLOCK_FREQUENCY is not the real runtime frequency; it is only the maximum value the core supports. SDK_DelayAtLeastUs() converts the requested microseconds into a busy-wait cycle count using the frequency you pass in, so passing a value higher than the actual core clock stretches the delay proportionally, which is why your 10 µs request measures around 16 µs.
You can use CLOCK_GetFreq() to get the real core frequency at runtime, then pass this value to SDK_DelayAtLeastUs().
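For example, a minimal sketch (the generic kCLOCK_CoreSysClk identifier is an assumption here; please check the clock_name_t enumerators in your device's fsl_clock.h for the exact name of the M33 core clock):

    #include "fsl_clock.h"
    #include "fsl_common.h"

    /* Delay helper that scales the busy-wait by the core clock measured
     * at runtime instead of by the compile-time maximum. */
    static void DelayAtLeastUsRuntime(uint32_t delayUs)
    {
        /* kCLOCK_CoreSysClk is the generic core-clock name used by many
         * SDK devices; verify the i.MX93 equivalent in fsl_clock.h. */
        uint32_t coreClockHz = CLOCK_GetFreq(kCLOCK_CoreSysClk);

        SDK_DelayAtLeastUs(delayUs, coreClockHz);
    }

Calling DelayAtLeastUsRuntime(10U) around your trigger pin toggle should produce a pulse much closer to 10 µs, because the cycle count then matches the clock the core is actually running at.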
Best Regards,
Zhiming