I am using a KV10Z32; FlexTimer 0, channel 3 triggers an ADC1 analog-to-digital conversion.
When the launch of the conversion is handled by interrupt, I observe a latency of 2.8 µs with the default 21 MHz system clock. That is consistent with the number of instructions the software (assembly) needs before it signals the event on a GPIO.
But with eDMA processing (a single transfer of one word to ADC1_SC1A), the latency is much larger and depends on DMA_TCDn_CSR[BWC]:
BWC=00: 14 µs (reference)
BWC=01: 14 µs
BWC=10: 26 µs (+12 µs)
BWC=11: 38 µs (+24 µs)
The datasheet says "BWC=10 eDMA engine stalls for 4 cycles after each R/W." and "BWC=11 eDMA engine stalls for 8 cycles after each R/W."
If those stall counts explain the extra delay, then one "cycle" corresponds to 3 µs at 21 MHz (e.g. 12 µs / 4 cycles for BWC=10). What cycle are we talking about? Surely not the system clock, whose period is only about 48 ns.
It also appears that this delay occurs before, not after, each transfer.
Is it possible to get a latency with eDMA processing close to the interrupt-based one?