I am currently upgrading a project from the FRDM-KL27Z to the new FRDM-KL82Z, which means using the standalone KL82Z SDK as it has not been integrated with KSDK 1.3 yet. I've been walking through my source files and testing the code that uses the SDK drivers such as SPI, I2C, etc. Everything I tested either worked immediately or needed minor modification until I started in on the I2C Master driver (blocking).
I am using the I2C1 driver to communicate with an Atmel ATECC508 CryptoAuthenticator IC. The code I built on top of the I2C driver wasn't working so I decided to start from scratch (with an SDK driver example) to see if I was missing something. I am using the i2c_blocking_master_example_frdmkl82z project as a testing ground and only made modifications necessary to use I2C1 instead of I2C0 which is the default bus.
These modifications included:
NOTE: My problem is very similar to the question found here: K22 I2C SDK Driver behaving strangely? (As in failing...) but the suggestions there did not help.
The issue I'm experiencing is that it appears the i2c master interrupt handler is not getting called when using I2C_DRV_MasterSendDataBlocking(). This is what I know:
(Above is my oscilloscope readout confirming the device address of 0xC0 is sent and ACK'd by the slave)
What I've tried:
Test Conditions:
I don't know what else to try. The device does indeed ACK its own address, but the master driver does not seem to "receive" the ACK and follow up by transmitting the byte data to the slave device. This points to the interrupt handler never being called, which I've confirmed with breakpoints in the handler. Please help!
Cody,
The interrupt mux is present on Cortex-M0 devices where the number of interrupt sources in the system is larger than what the Cortex-M0 can handle (32 individual interrupt channels). The I2C1 interrupt is not assigned to one of the 32 core interrupts, but is instead placed on the INTMUX peripheral. You can see this assignment in the KL82 reference manual, section 3.2.5.
In order to route the interrupt from the I2C1, you must route it through one of the Cortex-M NVIC vectors assigned to INTMUX0. You can see the choices below.
The code that Susan has provided above will enable the INTMUX clock, choose the I2C1 interrupt on the INTMUX, and then finally enable the INTMUX interrupt.
The reason you don't see this special step on the KL27 is that there the I2C1 interrupt is assigned directly to the NVIC controller; see table 3-2 in the KL27 reference manual.
I hope this helps you get your I2C1 solution working on the KL82!
Jason
Thanks everyone for helping this noob!
Update: I switched back to I2C0 and everything works perfectly, so there seems to be an issue with the interrupt handler when I2C1 is configured. Although I can move forward, it would be nice to know why this is occurring on the I2C1 bus.
Hi Cody Lundie,
Regarding the issue where I2C1 does not work by just calling I2C_DRV_MasterSendDataBlocking(): the interrupt for I2C1 sits behind the INTMUX, and because the INTMUX is very flexible and lets the user freely mux an interrupt onto any of its channels, this routing is left for the user to do.
So to use interrupts that are behind the INTMUX, you need to enable the INTMUX interrupt in the application code and route the interrupt of the peripheral you are using (I2C1 in this case) to one of the INTMUX channels. You also need to include the INTMUX interrupt entry implemented in fsl_interrupt_manager_irq.c. The following code could serve as a reference; I hope it helps:
I2C_DRV_MasterInit(BOARD_I2C_COMM_INSTANCE, &master, &masterConfig);

#if (FSL_FEATURE_SOC_INTMUX_COUNT)
if (g_i2cIrqId[BOARD_I2C_COMM_INSTANCE] >= FSL_FEATURE_INTMUX_IRQ_START_INDEX)
{
    /* Enable the INTMUX clock. */
    CLOCK_SYS_EnableIntmuxClock(INTMUX0_IDX);

    /* Route the I2C1 interrupt onto INTMUX0 channel 0. */
    INTMUX_HAL_EnableInterrupt(INTMUX0, kIntmuxChannel0,
        1U << (g_i2cIrqId[BOARD_I2C_COMM_INSTANCE] - FSL_FEATURE_INTERRUPT_IRQ_MAX - 1U));

    /* Enable the INTMUX0 channel 0 vector in the NVIC. */
    INT_SYS_EnableIRQ(INTMUX0_0_IRQn);
}
#endif
Chun Su,
Thank you for your response! I'm sorry, but I'm very new to your products and I'm unclear on how the code snippet from your last comment should be used. Could you explain in more detail what needs to be done in order to use I2C1? I never had to do this on the KL27Z and I'm not sure how to proceed. I appreciate your time on this matter.
Cody