Deep Learning model inference returning wrong results on i.MX 8

645 Views
nawin_sharma
Contributor I

Hi everyone,

We have developed an app that performs DL model inference (face detection, etc.) and video encoding. We use eIQ GPU inference through the TFLite API with the NNAPI delegate. The problem is that the first time we run the app in the background after flashing BSP 5.10-hardknott, the outputs of the DL models running on the GPU are wrong. After rebooting the board, reconnecting power, etc., the outputs are correct. The wrong output is always observed on the first boot after flashing the BSP images, and the GPU also returns wrong outputs randomly across multiple reboots.
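For reference, the inference path is set up roughly like this. This is a minimal sketch assuming the stock TFLite 2.x C++ API shipped with the eIQ BSP; the model path and the delegate option shown are placeholders, not our exact code:

#include <memory>

#include "tensorflow/lite/delegates/nnapi/nnapi_delegate.h"
#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main() {
  // "face_detection.tflite" is a placeholder for the actual model file.
  auto model = tflite::FlatBufferModel::BuildFromFile("face_detection.tflite");

  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(*model, resolver)(&interpreter);

  // Hand supported ops to NNAPI, which routes them to the GPU/NPU driver
  // on the i.MX 8. The execution preference here is an example value.
  tflite::StatefulNnApiDelegate::Options options;
  options.execution_preference =
      tflite::StatefulNnApiDelegate::Options::kSustainedSpeed;
  tflite::StatefulNnApiDelegate delegate(options);
  if (interpreter->ModifyGraphWithDelegate(&delegate) != kTfLiteOk) {
    return 1;  // delegation failed; the real app would fall back to CPU
  }

  interpreter->AllocateTensors();
  // ... fill the input tensor with a camera frame, then:
  interpreter->Invoke();
  return 0;
}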

Some further observations:

1. DL model inference on the CPU with the Arm NN libraries shows no such issue in the same app with the same configuration.

2. DL model inference on the GPU with the TFLite API and NNAPI delegate shows no such issue in the same app when we disable video encoding.

3. DL model inference on the GPU with the TFLite API and NNAPI delegate shows no such issue in a standalone app with video encoding. In that standalone app we have tried only face detection plus video encoding; it returns correct face coordinates and handles the video encoding well, even on the first boot.

We are not getting stable outputs from the DL models when running them on the GPU.
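To make the mismatch easy to reproduce outside the full app, a small sanity check along these lines (again only a sketch under the same assumptions; the model path and the dummy input value are placeholders) runs one identical input through a CPU-only interpreter and an NNAPI-delegated one and prints the largest output difference. It assumes a float32 input/output model:

#include <algorithm>
#include <cmath>
#include <cstdio>
#include <memory>

#include "tensorflow/lite/delegates/nnapi/nnapi_delegate.h"
#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

// Builds an interpreter for the model, optionally attaching the NNAPI delegate.
static std::unique_ptr<tflite::Interpreter> Build(
    const tflite::FlatBufferModel& model, TfLiteDelegate* delegate) {
  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(model, resolver)(&interpreter);
  if (delegate) interpreter->ModifyGraphWithDelegate(delegate);
  interpreter->AllocateTensors();
  return interpreter;
}

int main() {
  // Placeholder model path.
  auto model = tflite::FlatBufferModel::BuildFromFile("face_detection.tflite");

  tflite::StatefulNnApiDelegate nnapi;  // default options
  auto gpu = Build(*model, &nnapi);
  auto cpu = Build(*model, nullptr);

  // Feed both interpreters the same dummy frame (a real check would use a
  // captured camera frame instead of a constant).
  const size_t n_in = gpu->input_tensor(0)->bytes / sizeof(float);
  for (size_t i = 0; i < n_in; ++i) {
    gpu->typed_input_tensor<float>(0)[i] = 0.5f;
    cpu->typed_input_tensor<float>(0)[i] = 0.5f;
  }
  gpu->Invoke();
  cpu->Invoke();

  // Report the largest element-wise difference on the first output tensor.
  const size_t n_out = gpu->output_tensor(0)->bytes / sizeof(float);
  float max_diff = 0.0f;
  for (size_t i = 0; i < n_out; ++i) {
    max_diff = std::max(
        max_diff, std::fabs(gpu->typed_output_tensor<float>(0)[i] -
                            cpu->typed_output_tensor<float>(0)[i]));
  }
  std::printf("max |nnapi - cpu| = %g\n", max_diff);
  return 0;
}

On a healthy boot the difference should be small (quantization/fp16 noise); on an affected first boot after flashing we would expect it to be large.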

Can you please help me with this issue?

Thanks and Regards!

 
