Deep Learning model inference returning Wrong result on i.MX 8


nawin_sharma
Contributor I

Hi everyone,

We have developed an app that performs DL model inference (face detection, etc.) and video encoding. We use eIQ GPU inference through the TFLite API with the NNAPI delegate. The problem is that the first time we run the app after flashing the BSP 5.10 (hardknott) images, the outputs produced by the DL models running on the GPU are wrong. After rebooting the board, reconnecting power, etc., the outputs become correct. The wrong outputs are always observed on the first boot after flashing the BSP images, and they are also observed randomly across subsequent reboots.

Further observations:

1. DL model inference on the CPU with the Arm NN libraries has no such issue in the same app with the same configuration.

2. DL model inference on the GPU with the TFLite API and NNAPI delegate has no such issue in the same app when we disable video encoding.

3. DL model inference on the GPU with the TFLite API and NNAPI delegate has no such issue in a standalone app with video encoding. In the standalone app we have tried only face detection plus video encoding; it returns correct face coordinates and encodes video correctly on the first boot.

We are unable to get stable DL model outputs when running inference on the GPU.
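To detect the unstable GPU runs at runtime, one option is to push a known input through the CPU (Arm NN) path once and compare the GPU/NNAPI outputs against that reference. A minimal sketch of such a comparison is below; the tolerances and the sample values are assumptions for illustration only, and the actual interpreter invocations are omitted:

```python
import math

def outputs_match(reference, candidate, rel_tol=1e-3, abs_tol=1e-5):
    """Element-wise comparison of two flat output tensors.

    Returns True when every candidate value is within tolerance of the
    CPU reference value; a False result would flag a corrupted GPU run.
    """
    if len(reference) != len(candidate):
        return False
    return all(
        math.isclose(r, c, rel_tol=rel_tol, abs_tol=abs_tol)
        for r, c in zip(reference, candidate)
    )

# Illustrative values only (not real model outputs): a matching pair,
# and a clearly diverged pair such as a bad first-boot GPU run might give.
cpu_out = [0.12, 0.87, 0.55, 0.03]
gpu_ok  = [0.1201, 0.8699, 0.5501, 0.0300]
gpu_bad = [0.9100, 0.0100, 0.0000, 0.9900]

print(outputs_match(cpu_out, gpu_ok))   # True
print(outputs_match(cpu_out, gpu_bad))  # False
```

A check like this, run once at app startup against a bundled test input, could make the app fall back to the CPU path (or retry after reinitializing the delegate) instead of silently producing wrong detections.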

Can you please help us with this issue?

Thanks and Regards!
