Hello,
I am new to the i.MX platform and Android BSP.
I am using i.MX8M Plus with the NXP Android BSP, and I am developing a display system (digital cluster) that communicates with the CAN bus.
I want to confirm whether my understanding of the process is correct:
-> I download and flash the Android BSP image from NXP (which includes the Linux kernel and CAN drivers).
-> The BSP already supports SocketCAN, so I don’t need to modify the CAN HAL or kernel code.
-> From Android Studio, I can build an app that:
- Opens the SocketCAN interface (e.g., can0) using C/C++ via NDK/JNI
- Receives CAN frames from the bus
- Decodes them and updates UI variables (such as speed, RPM, etc.) in the Android app (Java/Kotlin layer)
-> In the future, I plan to add camera and navigation features, which I also plan to implement in the same Android Studio app.
Could you please confirm:
-> Is this overall approach correct for using CAN communication in an Android BSP environment?
-> Is it the right way to open and read SocketCAN data from an app using Android Studio (through JNI)?
-> For camera and navigation integration, is it also fine to develop them directly in Android Studio (application layer)?
-> If this process is not correct, could you please explain the proper method or architecture I should follow?
I am very new to this platform, so I would really appreciate a clear explanation or references to official documentation that describe the correct flow for Android BSP development with CAN.