[i.MX8MQ] Android 13 can't use GPU Delegate to inference AI model



KyleChang
Contributor I

 

NXP reference design code base:
i.MX8MQ platform / Android 13.0.0_2.0.0 (L6.1.22_2.0.0 BSP)

 

We are trying to run an image-classification APK to evaluate GPU performance on the i.MX8MQ, but the GPU delegate does not appear to be supported on this platform.

For example, when we run TFLClassify_mobilenet.apk (download link) on the i.MX8MQ (Android 13), the log shows "TFL Classify: This device is GPU Incompatible", as in the picture below.

LogTFL_1.png

The relevant source code is shown below:

sourcecode.png

We also ran the TFLite benchmark tool to check whether the GPU delegate works on the i.MX8MQ.
Please see the attached benchmark--use_gpu=true.txt, which contains:

INFO: Use gpu: [1]
INFO: Created TensorFlow Lite delegate for GPU.
INFO: GPU delegate created.
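For context, a log like the one above is typically produced by pushing the prebuilt TFLite benchmark_model binary to the board and running it with the GPU flag. This is a sketch; the binary location and model file name here are assumptions, not taken from the attachment:

```shell
# Hypothetical invocation reproducing the attached benchmark log.
# Assumes benchmark_model and a MobileNet .tflite model are available locally.
adb push benchmark_model /data/local/tmp/
adb push mobilenet_v1_1.0_224.tflite /data/local/tmp/   # model name is an assumption
adb shell chmod +x /data/local/tmp/benchmark_model
adb shell /data/local/tmp/benchmark_model \
    --graph=/data/local/tmp/mobilenet_v1_1.0_224.tflite \
    --use_gpu=true
```

Note that the benchmark tool creating a GPU delegate directly does not imply the Android app will take the GPU path, since the app gates GPU use on its own compatibility check.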

 

We then checked whether the i.MX8MQ supports the GPU delegate, as in the code in the picture below.

Add_GPU_Delegate.png

However, isSupported always returns FALSE on the i.MX8MQ, so we assume the GPU delegate is not yet supported on this platform.
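For reference, TFLite Android samples usually gate GPU use on org.tensorflow.lite.gpu.CompatibilityList (isDelegateSupportedOnThisDevice / getBestOptionsForThisDevice) and fall back to CPU when it returns false. The sketch below illustrates that decision flow stand-alone; the CompatibilityList class here is a hypothetical stub standing in for the real TFLite one, so the branch logic can be shown without Android dependencies:

```java
// Sketch of the GPU-delegate compatibility check a TFLite sample app performs.
// On a real device this would use org.tensorflow.lite.gpu.CompatibilityList;
// the stub below stands in for it so the flow is illustrated stand-alone.
public class DelegateCheck {

    // Hypothetical stub for org.tensorflow.lite.gpu.CompatibilityList.
    static class CompatibilityList {
        private final boolean supported;
        CompatibilityList(boolean supported) { this.supported = supported; }
        boolean isDelegateSupportedOnThisDevice() { return supported; }
    }

    // Mirrors the app's isSupported branch: GPU if the device is on the
    // delegate allow-list, otherwise CPU fallback.
    static String chooseBackend(CompatibilityList list) {
        if (list.isDelegateSupportedOnThisDevice()) {
            // Real code: options.addDelegate(new GpuDelegate(
            //     list.getBestOptionsForThisDevice()));
            return "GPU";
        }
        // Real code: options.setNumThreads(4); // CPU fallback
        return "CPU";
    }

    public static void main(String[] args) {
        // On the i.MX8MQ the real check reports false, so the app logs
        // "This device is GPU Incompatible" and stays on the CPU path.
        System.out.println(chooseBackend(new CompatibilityList(false)));
        System.out.println(chooseBackend(new CompatibilityList(true)));
    }
}
```

If the sample uses CompatibilityList, a false result means the device is not on TFLite's GPU allow-list, independent of whether a GPU delegate can be created by the benchmark binary.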

 

How can we enable the GPU delegate so that isSupported returns TRUE?

Or could you provide sample code or an APK that supports the GPU delegate on Android 13 on the i.MX8MQ?
