I am wondering whether anyone else has experience here and can help confirm my understanding, or improve it.
The story: I have a device receiving frames at 250 kbaud on a UART. The frames vary in size from, say, 50 bytes to 1 KB, and the DMA controller is set up to do the transfer (with a buffer slightly longer than the maximum frame size).
The UART is set up to interrupt on detection of a break condition; the interrupt terminates the active DMA reception, wakes a receiver task, and reconfigures the DMA controller so that it can receive the next frame.
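To make the flow concrete, here is a minimal sketch of what my break handler does. All the DMA and RTOS hook names (dma_stop(), dma_remaining(), wake_receiver_task(), and so on) are placeholders for whatever your particular MCU and kernel provide, not a real API:

```c
#include <stdint.h>
#include <stddef.h>

#define RX_BUF_SIZE 1100u              /* slightly longer than the max frame */

static uint8_t  rx_buf[RX_BUF_SIZE];
static volatile size_t rx_len;         /* length of the frame just received  */

/* Placeholder DMA/RTOS hooks - substitute the real calls for your part. */
extern void   dma_stop(void);                       /* halt active transfer  */
extern size_t dma_remaining(void);                  /* bytes left to move    */
extern void   dma_start(uint8_t *dst, size_t len);  /* re-arm for next frame */
extern void   wake_receiver_task(void);             /* semaphore/event post  */

/* Called from the break-detection interrupt (see point 3 below for how
   that interrupt itself is acknowledged and filtered). */
void uart_frame_end(void)
{
    /* Terminate the in-progress DMA reception. */
    dma_stop();

    /* Work out how much of the frame actually arrived. */
    rx_len = RX_BUF_SIZE - dma_remaining();

    /* The DMA also stores the break itself as a 0x00 (point 4 below),
       so drop the trailing byte it left behind. */
    if (rx_len > 0u && rx_buf[rx_len - 1u] == 0x00u)
        rx_len--;

    /* Wake the receiver task, then immediately re-arm the DMA controller
       (double-buffering would avoid overwriting data still being read). */
    wake_receiver_task();
    dma_start(rx_buf, RX_BUF_SIZE);
}
```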
It is working, but I did have a couple of difficulties. Previously I was doing the same thing using character interrupts, and the method had to be changed quite a lot; this is what is of interest.
1. When working in interrupt mode, a break produced a break interrupt and a character reception at the same time. I believe the break is placed in the input buffer as a 0x00, so basically only the RX buffer interrupt was necessary, together with a check of the break status bit in USR_UCSR (the first sketch after this list shows this).
2. In DMA mode I had to change over to the break detection interrupt (since RX characters were no longer causing interrupts). However, the break detection bit is no longer set in USR_UCSR (it seems only to be valid when the RX ready bit is also set, which doesn't happen in DMA mode).
3. The break detection interrupt seems rather different. There are in fact two interrupts - one for the start of the break and one for the end of it (at least I get two and have to filter the second one out). It is also necessary to reset the break change condition with a RESET BREAK command (twice - once for each change). The second sketch after this list shows how I handle this.
4. In DMA mode it seems that at least one extra character (the 0x00 from the break, I assume) is also transferred by the DMA controller to the RX buffer. This corresponds to the experience in interrupt mode.
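For comparison, the first sketch below is roughly how point 1 worked in the old interrupt-driven version: a single receiver-ready handler reads each character and checks the received-break bit in the status register. The bit masks are the 68681-style encodings that the USR_UCSR naming suggests, so verify them against your part's manual:

```c
#include <stdint.h>

/* 68681-style status bits - verify against the data sheet. */
#define USR_RXRDY  0x01u            /* receiver ready (character waiting) */
#define USR_RB     0x80u            /* received break                     */

extern volatile uint8_t USR_UCSR;   /* UART status register (read)        */
extern volatile uint8_t URB_UTB;    /* UART receive buffer (read)         */

extern void store_byte(uint8_t ch); /* placeholder: append to frame       */
extern void end_of_frame(void);     /* placeholder: hand frame to task    */

void uart_rx_isr(void)
{
    while (USR_UCSR & USR_RXRDY) {
        uint8_t status = USR_UCSR;  /* status applies to the next char... */
        uint8_t ch     = URB_UTB;   /* ...which is popped by this read    */

        if (status & USR_RB)
            end_of_frame();         /* the 0x00 is the break, not payload */
        else
            store_byte(ch);
    }
}
```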
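And for points 2 and 3, the second sketch shows how I ended up handling the break-change interrupt in DMA mode. The 0x50 "reset break-change interrupt" command value is again the 68681-style encoding (check your manual), and the toggle flag is just my way of filtering out the end-of-break edge; I haven't found a status bit that reliably distinguishes the two edges in DMA mode:

```c
#include <stdint.h>
#include <stdbool.h>

/* 68681-style command encoding - verify against the data sheet. */
#define UCR_RESET_BREAK_CHANGE  0x50u  /* reset break-change interrupt      */

extern volatile uint8_t UCR;           /* UART command register (write)     */

extern void uart_frame_end(void);      /* frame termination, sketched above */

void uart_break_change_isr(void)
{
    static bool in_break = false;      /* toggles on each break edge        */

    /* Each change must be acknowledged individually, hence the RESET
       BREAK command on both the start and the end interrupt (point 3). */
    UCR = UCR_RESET_BREAK_CHANGE;

    in_break = !in_break;
    if (in_break)
        uart_frame_end();              /* start of break: frame complete    */
    /* else: end-of-break edge, filtered out                                */
}
```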
These are my experiences. Can anyone confirm having seen the same behaviour?