Hi!
I'm working with the RT1010 in a custom application where I2C is used to communicate with a motor driver. The root clock of the LPI2C module is 8 MHz (derived from the 24 MHz clock), and the I2C bus clock should be 100 kHz. To make sure the pull-ups are correct, I've installed 2.2 kΩ resistors.
Copying from the I2C polling example, I've initialised the module like this:
LPI2C_MasterGetDefaultConfig(&gI2cMasterConfig);
gI2cMasterConfig.baudRate_Hz = 100000;
LPI2C_MasterInit(LPI2C1, &gI2cMasterConfig, 8000000);
Then:
if(LPI2C_MasterStart(LPI2C1, 0x0, kLPI2C_Write) != kStatus_Success){
return 1;
}
Then, like in the example, I check in a while loop whether the TX FIFO count is zero. It is, but the LPI2C_MasterSend call gets stuck in its while() loop.
Looking with a logic analyzer, I saw this:

The peripheral sends a start condition with a 0 (as requested) but then just keeps sending clocks. Even after 10 minutes, the clocks are still being transmitted.
When I change the address to 0x55 (0b01010101), the 0s that are transmitted change into 1s. Might that be the last bit I'm seeing?
I have played a bit with the sclGlitchFilterWidth_ns and sdaGlitchFilterWidth_ns values, but that made no difference.
What could be the culprit here?
Thanks!
Marcel