Hello,
I would presume that the real requirement here is to generate a delay period with microsecond resolution, rather than a delay period of one microsecond. It would appear that the OP was considering a delay based on timer ticks of 1 us, with the implication that there would be one interrupt per tick.
An alternative way of achieving the required resolution, with a single interrupt occurring only at the expiry of the delay period, is to utilize output compare mode for a TPM channel. However, for software output compare mode, there will always be some timing inaccuracy due to setup and activation of the delay, and to the latency of interrupt processing at the conclusion of the delay. A correction can sometimes be applied to eliminate some of this inaccuracy, but there will be a minimum allowable delay period.
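By way of illustration only, a minimal sketch of the interrupt-driven approach follows. It assumes an HCS08-style TPM with the prescaler already set for a 1 MHz timer clock (one count per microsecond) and a free-running 16-bit count; the register names (TPM1CNT, TPM1C0V, TPM1C0SC), the bit positions, and the CodeWarrior-style ISR declaration are assumptions that would need checking against the header file for the particular derivative.

/* Sketch: one-shot delay using TPM channel 0 in software output compare
   mode.  Assumes a 1 MHz timer clock (1 count per us), a free-running
   16-bit counter (modulo register = 0), and HCS08-style register names. */

#include <hidef.h>              /* EnableInterrupts / DisableInterrupts */
#include "derivative.h"         /* device-specific register declarations */

#define CH0F_MASK   0x80u       /* channel flag                     */
#define CH0IE_MASK  0x40u       /* channel interrupt enable         */
#define MS0A_MASK   0x10u       /* software output compare mode     */

static volatile unsigned char delay_done;

/* Start a delay of 'us' microseconds.  The value must exceed the
   minimum allowable period, to cover setup and interrupt latency. */
void start_delay_us(unsigned int us)
{
    delay_done = 0;
    TPM1C0V  = (unsigned int)(TPM1CNT + us);   /* compare = now + delay */
    TPM1C0SC = MS0A_MASK | CH0IE_MASK;         /* compare mode, int on  */
}

/* TPM1 channel 0 ISR -- fires once, at expiry of the delay period. */
interrupt VectorNumber_Vtpm1ch0 void TPM1CH0_ISR(void)
{
    (void)TPM1C0SC;             /* read while CH0F is set ...            */
    TPM1C0SC = MS0A_MASK;       /* ... then write CH0F = 0 to clear it,  */
                                /* and disable further channel interrupts */
    delay_done = 1;
}

The flag in delay_done could then be tested, or the follow-on action taken directly within the ISR.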
Polling the channel flag, rather than using interrupts, may reduce the timing uncertainty, but here interrupts would need to be globally disabled during the delay period (probably not an issue for a delay of a few microseconds). Best accuracy, usually with few latency issues, will be achieved by using the hardware output pin for the TPM channel, but this may not be appropriate for many applications.
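A polled equivalent, continuing the same sketch (same includes, mask definitions, and assumed 1 us tick as above), might look something like this; DisableInterrupts and EnableInterrupts are the usual CodeWarrior macros from hidef.h.

/* Sketch: short blocking delay by polling the channel flag, with
   interrupts globally disabled for the duration.  Assumes the channel
   flag was left cleared by any previous use. */
void delay_us_polled(unsigned int us)
{
    DisableInterrupts;                         /* avoid latency from other ISRs  */
    TPM1C0V  = (unsigned int)(TPM1CNT + us);   /* compare = now + delay          */
    TPM1C0SC = MS0A_MASK;                      /* software compare, no interrupt */
    while ((TPM1C0SC & CH0F_MASK) == 0) {      /* wait for the channel flag      */
        /* spin */
    }
    TPM1C0SC = MS0A_MASK;                      /* flag was read as set above;    */
                                               /* write CH0F = 0 to clear it     */
    EnableInterrupts;
}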
What timing precision is actually required? The original indication was that the delays were required for an LCD function. For alphanumeric LCD modules with an integral 8-bit or 4-bit parallel interface, the delay periods are usually not too critical, provided the minimum delay requirements are met.
Regards,
Mac