On my IMX23 EVK board, I can run a test (basic serial port stuff) to talk from the IMX23 EVK to a "remote" system in my development environment. It's a simple serial comm layer with a basic packet format. Nothing special.
The problem is that each data byte sent via the IMX23 EVK debug UART is delayed from the previous byte by 20ms (as measured with a logic analyzer). This, of course, is killing the transmit speed.
0037727789 (19702 us): TXd 0x5A
0037747818 (20029 us): TXd 0x5A
0037767874 (20056 us): TXd 0x5A
0037787780 (19906 us): TXd 0x5A
0037807775 (19995 us): TXd 0x02
0037827776 (20001 us): TXd 0x08
0037847804 (20028 us): TXd 0x64
When I run this exact same test program to talk over an attached FTDI USB cable instead (using the serial port interface), I see much more reasonable inter-byte delays of about 3.5ms.
0006399679 (3590 us): TXd 0x5A
0006403261 (3582 us): TXd 0x5A
0006407229 (3968 us): TXd 0x5A
0006410877 (3648 us): TXd 0x5A
0006414275 (3398 us): TXd 0x06
0006417928 (3653 us): TXd 0x02
0006421259 (3331 us): TXd 0x00
Due to design limitations in the "remote" system, the normal inter-byte delay should be about 2ms. The code is specifically written to delay for this time. So I can live with the 3.5ms, but 20ms is far too long and I can't figure out why the debug UART is so slow.
I had a similar problem when I originally tried to use the application UART, but that was resolved by disabling DMA mode in that driver. I don't know why that improved things, but it absolutely did. The debug UART driver, of course, is not using DMA mode; it isn't even an option. It uses basic Rx/Tx interrupts and, as far as I can tell, it *should* be working fine. And yet I'm seeing these 20ms delays between transmitted bytes.
What am I missing? Is there some additional coalescing delay somewhere that I haven't found?
Hi all,
I have the same issue when porting a UART application to the i.MX6 and the Raspberry Pi 2 Model B. The inter-byte time is the delay between each data byte sent via the UART and the previous one. In my application the inter-byte time is > 3ms, which is too long and not acceptable. More detail:
Can you suggest a solution to reduce the delay in case 1, where the inter-byte time is too long (> 3ms)?
Looking forward to your response!
Many thanks!
The 20ms inter-byte delay I was seeing on the debug UART was caused by the duart_tx_empty method in mxs-duart.c. As far as I can tell, it's returning an inappropriate status. It's supposed to indicate that the TX FIFO is empty; instead it was returning an indication of something like !BUSY & !FULL. This was causing the port->timeout in serial_core to kick in, which was set to basically pad with 20ms (see uart_update_timeout to read about the "0.02 seconds of slop" added to the timeout).
I've attached a patch in case any of the crickets in this forum want to look at it. :) Just note that I've done zero testing of this patch for anything other than my specific purpose. There may well be perfectly valid reasons why the code was as it was originally. I'm just not seeing it. And, for what it's worth, the way this patch returns tx_empty is the same way the application UART returns tx_empty when DMA is disabled. So we do at least have some indication that this is how Freescale had intended the function to work.
Oh...one thing that just dawned on me that might apply here. The Rx and Tx lines are tied together in this case. So every byte that's written out to the port by the IMX23 driver is also immediately read back in as a received byte. I wonder if that's kicking off something goofy.
OK, I've looked at the code side-by-side (debug UART vs. application UART). I do not have any explanation for why the application UART seems to have the same 20ms interbyte delay when using DMA (CONFIG_MXS_AUART1_DMA_ENABLE) as the debug UART does without DMA enabled (not even an option).
If I disable DMA on the application UART, I do not see this 20ms delay at all. Everything flies through with minimal delay.
I'm missing something and I don't think it has anything to do with the IMX23 itself. I think it probably has to do with something more general, like something related to Linux device driver call sequences. Unfortunately, I don't know where to ask or read up on this to find out. Hopefully *someone* on here will pick up on what I'm missing and point me in the right direction.
I went back to confirm that I could recreate the slowness on the application UART by enabling DMA mode and I could. This time, however, I measured the inter-byte delay and got the same 20ms as was seen with the debug UART. So now I'm a little confused.
I'll go back and read the source and reference manual on the debug UART again. Maybe it's using DMA as well and I just misread something. If so, then maybe the whole problem is something DMA related that I just don't understand yet.