i.mx7 ULP ends up in DefaultISR when using LPI2C1_IRQn

klaas
Contributor III

Hi all,

I'm currently facing issues while testing I2C interrupts on a custom i.MX7ULP board.

The example provided with IAR (lpi2c_interrupt_transfer) works fine, but it uses an old SDK version (fsl_lpi2c.h version 2.1.1).

I would like to base my project on the new SDK, so I built a new project with the config tool and then copied the I2C master part of the example mentioned above into my main.cpp:

/*
 * Copyright (c) 2013 - 2015, Freescale Semiconductor, Inc.
 * Copyright 2016-2017 NXP
 * All rights reserved.
 *
 * SPDX-License-Identifier: BSD-3-Clause
 */

#include "fsl_device_registers.h"
#include "fsl_debug_console.h"
#include "pin_mux.h"
#include "clock_config.h"
#include "board.h"

#include "fsl_lpi2c.h"

lpi2c_master_handle_t g_m_handle;
uint8_t g_master_buff[32];

/*!
 * @brief Main function
 */
int main(void)
{
    char ch;

    /* Init board hardware. */
    BOARD_InitPins();
    BOARD_BootClockRUN();
    BOARD_InitDebugConsole();

    PRINTF("Start CM4 .......................... \r\n");
    
    lpi2c_master_transfer_t masterXfer = {0};
    status_t reVal = kStatus_Fail;
    
    CLOCK_SetIpSrc(kCLOCK_Lpi2c1, kCLOCK_IpSrcFircAsync);
    NVIC_SetPriority(LPI2C1_IRQn, 0);
    lpi2c_master_config_t masterConfig;

    /*
     * masterConfig.debugEnable = false;
     * masterConfig.ignoreAck = false;
     * masterConfig.pinConfig = kLPI2C_2PinOpenDrain;
     * masterConfig.baudRate_Hz = 100000U;
     * masterConfig.busIdleTimeout_ns = 0;
     * masterConfig.pinLowTimeout_ns = 0;
     * masterConfig.sdaGlitchFilterWidth_ns = 0;
     * masterConfig.sclGlitchFilterWidth_ns = 0;
     */
    LPI2C_MasterGetDefaultConfig(&masterConfig);

    /* Change the default baudrate configuration */
    masterConfig.baudRate_Hz = 100000U;

    /* Initialize the LPI2C master peripheral */
    LPI2C_MasterInit(LPI2C1, &masterConfig, CLOCK_GetIpFreq(kCLOCK_Lpi2c1));

    /* Create the LPI2C handle for the non-blocking transfer */
    LPI2C_MasterTransferCreateHandle(LPI2C1, &g_m_handle, NULL, NULL);

    for (int i = 0; i < sizeof(g_master_buff); i++)
    {
        g_master_buff[i] = i;
    }
    /* Setup the master transfer */
    masterXfer.slaveAddress = 0x7E;
    masterXfer.direction = kLPI2C_Write;
    masterXfer.subaddress = 0;
    masterXfer.subaddressSize = 0;
    masterXfer.data = g_master_buff;
    masterXfer.dataSize = sizeof(g_master_buff);
    masterXfer.flags = kLPI2C_TransferDefaultFlag;

    /* Send master non-blocking data to slave */
    reVal = LPI2C_MasterTransferNonBlocking(LPI2C1, &g_m_handle, &masterXfer);

    while (true) {}
}

When I execute this, I always end up in the DefaultISR after LPI2C_MasterTransferCreateHandle() is called.

Any idea what is going wrong here? As far as I can tell I did everything the same way as the example. The major difference is that I'm using a newer SDK version (fsl_lpi2c.h: 2.3.0).
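
For reference, here is a minimal debug sketch (my own addition, not part of the SDK example; it assumes the CMSIS SCB definitions pulled in through fsl_device_registers.h and the debug console, and the handler name is just a placeholder) to check which vector actually ends up in the default handler:

#include "fsl_device_registers.h"
#include "fsl_debug_console.h"

/* Hypothetical replacement for DefaultISR: read the active exception number
 * from SCB->ICSR. External interrupts start at exception number 16, so the
 * IRQn is VECTACTIVE - 16 (LPI2C1_IRQn should show up here if it fired). */
void DefaultISR_Debug(void)
{
    uint32_t vectactive = SCB->ICSR & SCB_ICSR_VECTACTIVE_Msk;
    PRINTF("Unhandled exception, VECTACTIVE = %lu (IRQn = %ld)\r\n",
           (unsigned long)vectactive, (long)vectactive - 16L);
    for (;;)
    {
    }
}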

Thanks
Klaas

2 Replies

klaas
Contributor III

Hi all,

I found the issue: my project was configured for C++, whereas the example is built as C.

Now I have to figure out how this works with C++...
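
In case it helps others: my assumption is that the C linkage of the interrupt handler is what matters here. The startup code installs weak DefaultISR aliases for the vector entries, and a handler compiled as C++ gets a mangled symbol name, so the weak default is never replaced. A minimal sketch of a workaround in a C++ translation unit (it assumes the SDK's LPI2C_MasterTransferHandleIRQ service routine and the g_m_handle from the code above; check the exact prototype in your fsl_lpi2c.h version):

extern "C" void LPI2C1_IRQHandler(void)
{
    /* Keep C linkage so this definition overrides the weak vector entry from
     * the startup file, then forward to the SDK's master IRQ service routine. */
    LPI2C_MasterTransferHandleIRQ(LPI2C1, &g_m_handle);
}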

Thanks anyway
Klaas

art
NXP Employee

Glad to hear that the problem is solved.

Best Regards,

Artur
