Jim Picariello

i.MX287 I2C Slave Issue

Discussion created by Jim Picariello Employee on Sep 14, 2011
Latest reply on Jan 22, 2013 by Zaheer Ahmad

I am working with a customer debugging the i.MX287 I2C configured for slave mode. Their diagnostics engineer has I2C0 set up as a slave and I2C1 set up as a master. The two ports are connected together and he's trying to get the master to read data from the slave. In his first attempt he set up the I2C0 (slave) DMA to transmit 512 bytes of data and the I2C1 (master) DMA to receive 512 bytes of data. This did not work, as the master never issued a START cycle. He then tried chaining 2 separate DMA transactions together on both the master and the slave side. On the master side he programmed the first transaction to transmit 1 byte and the second to read 512 bytes. On the slave side he programmed the first transaction to read 1 byte and the second to send 512 bytes. In this experiment the master sends the START cycle and writes the 1 byte, and the slave ACKs it. The master then does 512 read cycles and the slave outputs nothing. Is there some trick they're missing here, or something they're doing wrong?

Here's an updated version of the memo they sent out with all the details.

subject: mx287 I2C controller anomaly
date:    2 sep 2011
from:    Guy Viviers

    I am writing test code that passes test frames back and forth between
2 I2C controllers, one being configured as the master and the other being
configured as the slave. There are no other devices on the bus and I have
total control of both I2C controllers involved in the transfers so I am
not bothering with traditional I2C addressing. I am only interested in
proving that the master and the slave may reliably exchange data.

   Two of the I2C controllers under test are the mx287's built-in controllers
and the rest are bit-banged (BB) GPIO I2C controllers. I have 4 scenarios
that I want to test. In scenarios 1 and 2 the mx287 I2C0 channel is the
slave and one of the bit-banged I2C channels is the master. In scenarios
3 and 4 the mx287 I2C0 channel is the slave and the mx287 I2C1 channel
is the master.

scenario 1) this scenario works

    ~ configure BB I2C as a master
    ~ configure mx287 I2C0 as a slave
    ~ program the mx287 I2C0 dma to receive 512 bytes of data
    ~ BB issues I2C START
    ~ BB sends 512 bytes and checks the ACK flag after each byte
    ~ BB issues I2C STOP
    ~ check the I2C0 dma status and compare the data

scenario 2) this scenario works

    ~ configure BB I2C as a master
    ~ configure mx287 I2C0 as a slave
    ~ program the mx287 I2C0 dma to send 512 bytes of data
    ~ BB issues I2C START
    ~ BB reads 512 bytes
    ~ BB issues I2C STOP
    ~ check the I2C0 dma status and compare the data

scenario 3) this scenario works

    ~ configure mx287 I2C1 as a master
    ~ configure mx287 I2C0 as a slave
    ~ program the mx287 I2C0 dma to receive 512 bytes of data
    ~ program the mx287 I2C1 dma to transmit 512 bytes of data
    ~ check the I2C0 and I2C1 dma status and compare the data

scenario 4) this scenario doesn't work

    ~ configure mx287 I2C1 as a master
    ~ configure mx287 I2C0 as a slave
    ~ program the mx287 I2C0 dma to transmit 512 bytes of data
    ~ program the mx287 I2C1 dma to receive 512 bytes of data

In this scenario I2C1 is a master and is programmed to do read
cycles, but it never issues a START cycle. I can only guess that
there is something inside the state machine that does not think
this is a reasonable thing to do and makes the decision to ignore
the request.

I reasoned that all I needed to do to get this particular scenario
to work was to do things as they would be done in a normal I2C
environment, specifically have the master I2C1 channel issue an I2C
ADDRESS cycle prior to asking it to perform some I2C READ cycles.
So I added code to the I2C0 slave interface that performs address
recognition and acknowledgement and I added code to the I2C1 master
that performs a DMA ADDR write cycle prior to doing READ cycles.

Et voila, it worked! Well sort of ... I can see on the logic analyzer
that the I2C1 master now issues an ADDR write cycle, and that the
I2C0 slave ACKs the ADDR, and that the I2C1 master now issues the
READ cycles that I wanted. The only thing that's wrong is that the
I2C0 slave DMA never outputs data when the I2C1 master issues the
READ clock cycles.

I searched the entire planet and could not find a single bit of
code or a single app note that describes how to use an mx287 I2C
channel in slave mode. I looked through the reference manuals of
every single Freescale device and found that the mx287 is the only
part that has this type of I2C controller. Every other Freescale
part uses the same SIMPLE I2C controller, which bears no resemblance
to the one inflicted upon the mx287.

After a day or so of trial and error I decided to ignore the "doc"
that said that all you need to do after a successful address cycle
was to program the slave's DMA for the appropriate DMA read or
write cycle.

I decided that I was going to treat the address recognition cycle
and the subsequent DMA read or write cycle as 2 totally separate
transactions. Because I already knew how to set up the slave DMA
to respond to I2C READ cycles I concentrated on finishing the
address recognition phase by manually manipulating the slave's
register bits.

I reasoned that all I needed to do was to clear one bit called
CLOCK_HELD then clear the channel's RUN bit and the ADDR cycle on
the slave would be finished. Clearing the CLOCK_HELD bit does 2
things: it asserts the DATA line LOW which the master interprets
as an ADDR ACK, then it releases the CLOCK line which it had been
holding low so as to freeze the bus while the slave software
decides whether or not it is going to respond.

I found that if I cleared the CLOCK_HELD bit in one instruction
and cleared the RUN bit in the next instruction, I had the
same result as before: the master sees the ACK but the slave
never responds to its subsequent read clocks. So I threw in an
arbitrary delay of 5us between the 2 instructions and the slave
DMA started working.

I also found that if I changed the delay to 10us that the slave
DMA stopped working. I took a look at these cycles on the logic
analyzer and I could not believe my eyes, so I called Dave
Eiselen and asked him to verify what I was seeing.

If the delay between the 2 instructions was too short the master
would never see the ADDR ACK and would abort the entire I2C
transaction. If the delay between the 2 instructions was longer
than 5us the slave DMA would not work. If the delay was exactly
5us the master would see the ACK and the slave DMA would work.

A look at the logic analyzer shows that when the delay between
the 2 instructions is just right, the DATA line transitions
from LOW to HIGH just before the CLOCK line goes LOW. This
is the exact definition of an I2C STOP cycle, and I'm guessing
that the low-level logic in the slave DMA machine sees this
cycle and terminates the ADDR cycle, which allows the DMA to
continue on and respond appropriately to the next incoming
I2C READ cycles. Of course I could be wrong ...

So while I have gotten the mx287 I2C channel to work in slave
mode, I do not feel that my method is what was intended by the
chip manufacturer. There are 2 possibilities here. One is that
there really is a problem using the I2C channel in slave mode
and I have stumbled onto a way to get it to work; the other,
more likely, case is that I have missed something.

I believe that we should get to the bottom of this mystery
because the customer intends to use their mx287 I2C channels
in slave mode.

 

Any comments, suggestions, sample code on how to resolve this issue would be greatly appreciated.

Thanks,

Jim

Jim Picariello

Sr. Field Applications Engineer

Freescale Semiconductor

300 Unicorn Park Drive

Woburn, MA  01801

Tel:  781-932-6045

Cell: 978-987-1744

Fax: 781-932-9100

Email: jim.picariello@freescale.com

 
