Using GPU on i.MX 8QM MEK for DNN inference


14,746 Views
ullasbharadwaj
Contributor III

I am currently trying to evaluate different inference engines with TensorFlow and TensorFlow Lite models on the i.MX 8QM MEK. I am following the eIQ guide from NXP and using the L4.14 release.

I tried the OpenCV DNN module, the TFLite interpreter, and Arm NN, but I was not able to use the GPU with any of them. I know OpenCV does not run on the GPU due to an OpenCL compatibility issue on i.MX 8, but can I not use the GPU with TFLite and Arm NN either?

On the other hand, the Arm NN examples shipped with eIQ do not provide an option to use the GPU at all.

In the thread "Arm NN support for the i.MX 8 GPUs", vanessamaegima suggested that only the TFLite engine supports the GPU at the moment.

So it is all confusing to me whether the GPU is of any use on the i.MX 8QM MEK for running DNN inference.

If there is a way to use the GPU, kindly let me know. This has been bugging me for quite a while now.

Best Regards 

Ullas Bharadwaj

1 Solution
13,401 Views
diego_dorta
NXP Employee
NXP Employee

Hi Ullas,

Which BSP version are you using? 5.4.3_1.0.0? If so, it will not work properly. You must build an image using version 5.4.3_2.0.0, which is required by PyeIQ.

About the internet connection: you must connect your board, because PyeIQ requires a connection to retrieve the files it needs to work.

Thanks,

Diego


27 Replies
2,132 Views
ullasbharadwaj
Contributor III

Hi Diego,

Yes, it was 5.4.3_1.0.0!
I will start building 5.4.3_2.0.0 and get back to you. Thanks for the lead.


Do you know if there is an alternative, such as getting all the required Python packages on a PC and then transferring them to the board?
I am using a laptop connected via WiFi, and I was not able to share its internet connection with the board over Ethernet.


vanessamaegima‌, thanks for the clarification. Is there example source code from NXP for SSD MobileNet object detection in C++ with Arm NN?


Best Regards
Ullas Bharadwaj

2,132 Views
diego_dorta
NXP Employee
NXP Employee

Hi Ullas,

PyeIQ does not retrieve any Python packages; it uses exactly what is supported on 5.4.3_2.0.0. What it does retrieve is machine learning models, labels, and other related data needed to run the applications and demos.

Unfortunately, that is how PyeIQ was designed, so at least for now you will need to connect the board to the internet (I will check the possibility of including a way to get the models on the host machine).
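Until such a host-side path exists in PyeIQ itself, one possible workaround is to fetch the model and label files on an internet-connected host and copy them to the board afterwards. The sketch below is a generic downloader; the cache directory PyeIQ actually reads from is an assumption, so check the PyeIQ sources for the real location before copying files over.

```python
import os
import shutil
import urllib.request


def fetch_to_cache(url, cache_dir):
    """Download a file (e.g. a model or label file) into a local cache
    directory, skipping the download if it is already present.

    NOTE: the cache layout PyeIQ expects is an assumption here; verify
    it against the PyeIQ sources before relying on this.
    """
    os.makedirs(cache_dir, exist_ok=True)
    dest = os.path.join(cache_dir, os.path.basename(url))
    if not os.path.exists(dest):
        # Stream the response straight to disk.
        with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
            shutil.copyfileobj(resp, out)
    return dest
```

After running this on the host for each required URL, the cache directory could be transferred to the board with scp over the local network.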

The second version will be released soon with plenty more samples. Check the eIQ community in a few days :)

Thanks,

Diego

2,132 Views
vanessa_maegima
NXP Employee
NXP Employee

For ArmNN C++ examples you can refer to the ArmNN repository, for example: armnn/TfLiteMobileNetSsd-Armnn.cpp at branches/armnn_20_05 · ARM-software/armnn · GitHub 

Please check the BSP User Guide for more information on the examples.

2,132 Views
ullasbharadwaj
Contributor III

Hi Vanessa,
Thanks for your response. To start with, I tried the example code for image classification (label_image) in the L4.19 BSP.

1. I was able to run the model on the GPU, but when I tried my custom SSD MobileNet model (which was exported with post_processing = true during graph generation), I got the error "op code 6 not supported with NNAPI".

2. Somehow, the TensorFlow Lite and Qt5 libraries were not exported into the SDK, so I have trouble compiling applications.
Error 1: undefined reference to `tflite::Interpreter::ModifyGraphWithDelegate(....)`. I used the library from the sysroot, so I am using the proper tensorflow-lite library version.
Error 2: 

CMake Warning at SDKs/yocto/sysroots/aarch64-poky-linux/usr/lib/cmake/Qt5Core/Qt5CoreConfig.cmake:
Skipping because OE_QMAKE_PATH_EXTERNAL_HOST_BINS is not defined

Local.conf file:

MACHINE ??= 'imx8qmmek'
DISTRO ?= 'fsl-imx-xwayland'
PACKAGE_CLASSES ?= 'package_rpm'
EXTRA_IMAGE_FEATURES ?= "debug-tweaks"
USER_CLASSES ?= "buildstats image-mklibs image-prelink"
PATCHRESOLVE = "noop"
IMAGE_INSTALL_append = " ffmpeg"
PACKAGECONFIG_append_pn-opencv_mx8 = " dnn python3 qt5 jasper test"
IMAGE_INSTALL_append = " gputop"
COMMERCIAL_LICENSE ?= "lame gst-fluendo-mp3 libmad mpeg2dec ffmpeg qmmp"
LICENSE_FLAGS_WHITELIST = 'commercial'


BB_DISKMON_DIRS ??= "\
STOPTASKS,${TMPDIR},1G,100K \
STOPTASKS,${DL_DIR},1G,100K \
STOPTASKS,${SSTATE_DIR},1G,100K \
STOPTASKS,/tmp,100M,100K \
ABORT,${TMPDIR},100M,1K \
ABORT,${DL_DIR},100M,1K \
ABORT,${SSTATE_DIR},100M,1K \
ABORT,/tmp,10M,1K"

PACKAGECONFIG_append_pn-qemu-system-native = " sdl"
PACKAGECONFIG_append_pn-nativesdk-qemu = " sdl"
CONF_VERSION = "1"
DL_DIR ?= "${BSPDIR}/downloads/"
ACCEPT_FSL_EULA = "1"


Also, with regard to TFLite delegate registration and custom op support, is there example code for object detection that shows how to work with the GPU on i.MX 8?
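For reference, a minimal sketch of what delegate registration looks like from Python with the TFLite runtime (the C++ equivalent being the Interpreter::ModifyGraphWithDelegate call from Error 1 above). The delegate library name is an assumption and depends on the BSP; on later i.MX BSPs the external delegate is shipped as e.g. libvx_delegate.so, so treat the default below as a placeholder.

```python
def make_accelerated_interpreter(model_path, delegate_lib="libvx_delegate.so"):
    """Sketch: create a TFLite interpreter with a hardware delegate.

    `delegate_lib` is BSP-dependent (placeholder name here); if the
    delegate cannot be loaded, this falls back to CPU execution.
    """
    # Imported lazily so the sketch can be read/tested off-target.
    from tflite_runtime.interpreter import Interpreter, load_delegate

    try:
        delegates = [load_delegate(delegate_lib)]
    except (OSError, ValueError):
        delegates = []  # delegate not available: run on the CPU instead

    interpreter = Interpreter(model_path=model_path,
                              experimental_delegates=delegates)
    interpreter.allocate_tensors()
    return interpreter
```

On the board, this would be called with the path to a quantized .tflite model; ops the delegate does not support (such as the SSD post-processing custom op) fall back to the CPU or fail, depending on the runtime version.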

2,132 Views
vanessa_maegima
NXP Employee
NXP Employee

Hi ullasbharadwaj,

1. The label_image TFLite example does not support SSD models. Also, GPU models should be quantized (uint8 models). Please take a look at PyeIQ for examples of how to use SSD models.

2. Could you please try the following to build Yocto SDK:

IMAGE_INSTALL_append = " packagegroup-imx-ml packagegroup-imx-qt5"

For GPU examples, please take a look at PyeIQ.

Thanks,
Vanessa

2,132 Views
vanessa_maegima
NXP Employee
NXP Employee

Hi ullasbharadwaj‌,

For the L4.14 releases we do not have GPU support for any of the inference engines.


Starting with the L4.19 releases, eIQ is released as part of the official NXP BSP, and we do have GPU implementations for TFLite and Arm NN. I suggest you try our latest 5.4.3_1.0.0 release, which provides our latest TFLite/Arm NN GPU acceleration. Please note that we do not support OpenCV DNN GPU acceleration.

Please refer to the BSP documentation for more details on the eIQ support for each release after L4.19.

Thanks,
Vanessa
