[i.MX8MQ] Android 13 can't use GPU Delegate to inference AI model

Solved

2,733 Views
KyleChang
Contributor I

 

NXP reference design code base:
i.MX8MQ platform / Android 13.0.0_2.0.0 (L6.1.22_2.0.0 BSP)

 

We are trying to run an image classification APK to evaluate GPU performance on i.MX8MQ, but the GPU delegate does not appear to be supported on i.MX8MQ.

For example, if we run TFLClassify_mobilenet.apk (download link) on i.MX8MQ (Android 13), the log shows "TFL Classify: This device is GPU Incompatible", as in the picture below.

LogTFL_1.png

The source code is also shown below:

sourcecode.png

We also ran the benchmark tool to check whether the GPU delegate is supported on i.MX8MQ.
Please see the attached benchmark--use_gpu=true.txt.

INFO: Use gpu: [1]
INFO: Created TensorFlow Lite delegate for GPU.
INFO: GPU delegate created.
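For reference, the benchmark's "GPU delegate created" log corresponds to the app-side step of attaching a TFLite GPU delegate to the interpreter before building it. A minimal sketch of that step is below; the model path and loading helper are placeholders for illustration, not taken from the sample.

```kotlin
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.gpu.GpuDelegate
import java.io.FileInputStream
import java.nio.MappedByteBuffer
import java.nio.channels.FileChannel

// Map a .tflite model file into memory (placeholder path, for illustration only).
fun loadModel(path: String): MappedByteBuffer =
    FileInputStream(path).channel.use { channel ->
        channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size())
    }

fun buildGpuInterpreter(modelPath: String): Interpreter {
    // Create the GPU delegate and register it before the interpreter is built,
    // the same thing the benchmark does when run with --use_gpu=true.
    val gpuDelegate = GpuDelegate()
    val options = Interpreter.Options().addDelegate(gpuDelegate)
    return Interpreter(loadModel(modelPath), options)
}
```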

 

We thought we should first check whether i.MX8MQ supports the GPU delegate, as in the picture below.

Add_GPU_Delegate.png

But isSupported always returns FALSE on i.MX8MQ, so we assume i.MX8MQ does not support the GPU Delegate yet.
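For context, the check in the screenshots above appears to be the standard TensorFlow Lite CompatibilityList query, which consults a built-in allowlist of known GPU/driver combinations; a device not on that list reports unsupported even if a GPU delegate can still be created manually (as the benchmark log suggests). A rough sketch of that check, assuming the sample uses this API (the log tag and function name are placeholders):

```kotlin
import android.util.Log
import org.tensorflow.lite.gpu.CompatibilityList
import org.tensorflow.lite.gpu.GpuDelegate

private const val TAG = "TFL Classify"  // placeholder tag for illustration

// Returns a GPU delegate when the device passes the compatibility allowlist,
// otherwise logs the "GPU Incompatible" case and returns null so the caller
// can fall back to CPU execution.
fun maybeCreateGpuDelegate(): GpuDelegate? {
    val compatList = CompatibilityList()
    return if (compatList.isDelegateSupportedOnThisDevice) {
        GpuDelegate(compatList.bestOptionsForThisDevice)
    } else {
        Log.d(TAG, "This device is GPU Incompatible")
        null
    }
}
```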

 

How can we enable the GPU Delegate so that isSupported returns TRUE?

Or could you please give us sample code or an APK that supports the GPU delegate on Android 13 on i.MX8MQ?

0 Kudos
Reply
1 Solution
2,681 Views
jimmychan
NXP TechSupport

We are scheduling the Android 14 enablement for VX Delegate in August. 

0 Kudos
Reply