Hello NXP Team,
I’m working with an NXP MCU based on the Arm Cortex-M33 core (the i.MX93, mcimx9352), and I noticed that when I use:
SDK_DelayAtLeastUs(10, SDK_DEVICE_MAXIMUM_CPU_CLOCK_FREQUENCY);
to generate a 10 µs trigger pulse, the actual pulse width measured on the oscilloscope is around 16 µs. Since SDK_DelayAtLeastUs() derives its busy-wait loop count from the frequency argument, the ~1.6× stretch would be consistent with the core actually running near 156 MHz (250 MHz × 10/16) rather than at the 250 MHz maximum.
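For context, here is a minimal sketch of how I generate the pulse. The RGPIO instance, pin number, and function name are placeholders for my actual trigger output; I'm using the SDK's RGPIO driver for the pin:

#include "fsl_common.h"  /* SDK_DelayAtLeastUs() */
#include "fsl_rgpio.h"   /* RGPIO_PinWrite(); GPIO driver on i.MX93 */

/* Placeholders for the actual trigger output. */
#define TRIGGER_GPIO RGPIO1
#define TRIGGER_PIN  4U

void fire_trigger_pulse(void)
{
    RGPIO_PinWrite(TRIGGER_GPIO, TRIGGER_PIN, 1U);                   /* drive trigger high */
    SDK_DelayAtLeastUs(10U, SDK_DEVICE_MAXIMUM_CPU_CLOCK_FREQUENCY); /* intended 10 us */
    RGPIO_PinWrite(TRIGGER_GPIO, TRIGGER_PIN, 0U);                   /* measured ~16 us later */
}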
I’m using the default SDK definition:
#define SDK_DEVICE_MAXIMUM_CPU_CLOCK_FREQUENCY (250000000UL)
which, according to the datasheet, is the maximum clock frequency supported by the M33 core.
Is SDK_DEVICE_MAXIMUM_CPU_CLOCK_FREQUENCY supposed to match the actual core frequency at runtime, or is it just a max-safe default?
Is there a recommended way to dynamically query the actual core clock and pass it to SDK_DelayAtLeastUs() instead of relying on the macro?
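For example, would something along these lines be the recommended pattern? This is just a sketch using the generic CMSIS SystemCoreClock / SystemCoreClockUpdate() mechanism; I'm not sure whether SystemCoreClockUpdate() fully tracks the M33 clock tree on the i.MX93:

#include "fsl_common.h"  /* SDK_DelayAtLeastUs(); device header provides SystemCoreClock */

void trigger_delay_10us(void)
{
    /* Refresh the CMSIS core-clock variable from the current clock-tree
       configuration, then calibrate the delay loop with the live value
       instead of the 250 MHz maximum. */
    SystemCoreClockUpdate();
    SDK_DelayAtLeastUs(10U, SystemCoreClock);
}

Or is there a clock-manager API in the i.MX93 SDK that should be preferred over the CMSIS variable for this?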
Any clarification or best practices would be appreciated. Thanks in advance!
Best regards,
Manjunath B