
K22 I2C SDK Driver behaving strangely? (As in failing...)

Question asked by Donald Bosley on Nov 4, 2014
Latest reply on Nov 6, 2014 by Donald Bosley

EDIT (The Short Story): IICIE does not get set after I2C_HAL_SetIntCmd(baseAddr, true); in I2C_DRV_MasterSendDataBlocking, but it does get set and runs its proper course when the same call is used in I2C_DRV_MasterReceiveDataBlocking. So I get 0x04 timeouts on sends and 0x00 successful receives, even though both paths use the exact same subfunction to enable the interrupt. That makes no sense unless there is some other bit combination it isn't compatible with. If I manually set IICIE after the I2C_HAL_SetIntCmd(baseAddr, true); call in I2C_DRV_MasterSendDataBlocking, the transmission succeeds, so I know the bit somehow needs to be set manually. Will update on that.

 

The Long Story:

Currently trying to program the registers of a MAX98090 audio CODEC. It works perfectly with another device, but I'm running into some static with my K22 FRDM board using the master driver.

This version of the MAX98090 addresses at 0x20/0x21 for write/read, so my 7-bit address variable is set to 0x10 (0010000). The hardware seems to be initialized properly and, based on scope timings, is operating "properly."

I had the I2C comm demo working and have tried to copy as many of my settings as possible from that.

 

Variations I have tried:

I have tried using I2C1 on different pins, and noticed it helped square off my waveforms, likely because it isn't sharing any leads/traces with another device, as I2C0 on PB2/3 is (it's wired to the accel/mag).

No pull-ups, 1k, 2.2k, and 10k: no major differences. Tried 50, 100, and 400 kHz; 50 kHz times out, but 100 and 400 work.

 

Using Master Send Data results in the correct address and an ACK from the CODEC as slave, but nothing beyond that; it returns a timeout every time. If I stop the transfer with a breakpoint and manually set the IE bit before the address is sent, it works; otherwise, no dice. (The interrupts work for master receive.)

NoACK.png

 

Master Receive is successful, but works strangely compared to my previous experience with I2C and TWI. In order to read one byte, it goes through the following sequence:

1) Start, send address (the write address, not read?)

2) Sends the register value I want to read

3) Sends a read address

4) Reads the actual value (correctly)

Write.png

Why start out with a write instead of starting with a read address directly, followed by the register, and then receiving a value? Is there a specific reason in logic, performance, or mechanics?
