i.MX8MP EVK: NNAPI run ssd_mobilenet_v1_coco in Android

Solved

1,644 Views
Geo
Contributor I

My Environment
Hardware: NXP i.MX8MP EVK A01
Software: Android version 10

I am trying to use NNAPI to run ssd_mobilenet inference in Android. After the model is loaded, the first inference (the first frame) takes much longer; from the second frame onward, the inference time is back to normal.

Is this reasonable?

// Calculate inference time by recording timestamps around the TFLite call.
long startMs = System.currentTimeMillis();
mTflite.runForMultipleInputsOutputs(new Object[]{inputImage}, outputMap);
long elapsedMs = System.currentTimeMillis() - startMs;
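
A common way to keep that one-time first-frame cost out of the per-frame numbers is to run a single untimed warm-up inference right after the interpreter is created and only start timing from the second call. A minimal sketch, reusing the mTflite interpreter from the snippet above (inputImage and outputMap are placeholder names for the input array and output map):

// Warm-up pass: with NNAPI, the first inference triggers one-time model
// preparation in the driver, so run it once without timing it.
mTflite.runForMultipleInputsOutputs(new Object[]{inputImage}, outputMap);

// Steady-state timing starts from the second inference.
long startMs = System.currentTimeMillis();
mTflite.runForMultipleInputsOutputs(new Object[]{inputImage}, outputMap);
long steadyStateMs = System.currentTimeMillis() - startMs;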

 

Inference time          ssd_mobilenet.tflite    ssd_mobilenet_quant.tflite
CPU (1 thread)          361 ms                  307 ms
CPU (4 threads)         140 ms                  112 ms
NNAPI (first frame)     2234 ms                 12722 ms
NNAPI (second frame)    1101 ms                 27 ms

 


0 Kudos
1 Solution
1,583 Views
manish_bajaj
NXP Employee

@Geo,

There is an application note on warm-up time on NXP.com:

https://www.nxp.com/docs/en/application-note/AN12964.pdf

Check it out for more detail.

 

-Manish
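
For reference, the long first frame is the warm-up behavior that app note describes: on the first inference the driver compiles and prepares the model for the NPU, and later frames reuse the prepared model. A minimal sketch of the usual setup, assuming the TensorFlow Lite Java API (Interpreter, Interpreter.Options, and NnApiDelegate are standard TFLite classes; modelBuffer, inputImage, and outputMap are placeholders):

import java.nio.MappedByteBuffer;
import org.tensorflow.lite.Interpreter;
import org.tensorflow.lite.nnapi.NnApiDelegate;

// Route inference through NNAPI so supported ops run on the NPU.
NnApiDelegate nnApiDelegate = new NnApiDelegate();
Interpreter.Options options = new Interpreter.Options();
options.addDelegate(nnApiDelegate);
Interpreter tflite = new Interpreter(modelBuffer, options);

// One untimed warm-up inference absorbs the one-time preparation cost.
tflite.runForMultipleInputsOutputs(new Object[]{inputImage}, outputMap);

// ... timed per-frame inference as in the question ...

// Close the interpreter and the delegate when finished.
tflite.close();
nnApiDelegate.close();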
