i.MX8MP EVK: NNAPI run ssd_mobilenet_v1_coco in Android

Geo
Contributor I

My Environment
Hardware: NXP i.MX8MP EVK A01
Software: Android 10

I am using NNAPI to run ssd_mobilenet inference in Android. After loading the model, the first inference (first frame) takes much longer than expected; from the second frame onward, the inference time is back to normal.

Is this reasonable?

// Measure inference latency around the interpreter call
long start = System.currentTimeMillis();
mTflite.runForMultipleInputsOutputs(new Object[]{localObject}, outputMap); // outputMap is a Map<Integer, Object> of output buffers
long inferenceTimeMs = System.currentTimeMillis() - start;

 

Inference time         ssd_mobilenet.tflite   ssd_mobilenet_quant.tflite
CPU (1 thread)         361 ms                 307 ms
CPU (4 threads)        140 ms                 112 ms
NNAPI (first frame)    2234 ms                12722 ms
NNAPI (second frame)   1101 ms                27 ms
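
For context, here is a minimal sketch (not from the original post) of how the interpreter can be created with the NNAPI delegate and given one untimed warm-up run before benchmarking; the model buffer, warm-up input, and output map are placeholders you would replace with your own model I/O:

import java.nio.MappedByteBuffer;
import java.util.Map;

import org.tensorflow.lite.Interpreter;
import org.tensorflow.lite.nnapi.NnApiDelegate;

public class NnapiWarmup {

    // Create an NNAPI-delegated interpreter and absorb the one-time
    // compilation cost with a single untimed warm-up inference.
    public static Interpreter createWarmedInterpreter(MappedByteBuffer model,
                                                      Object warmupInput,
                                                      Map<Integer, Object> warmupOutputs) {
        NnApiDelegate nnApiDelegate = new NnApiDelegate();
        Interpreter.Options options = new Interpreter.Options().addDelegate(nnApiDelegate);
        Interpreter tflite = new Interpreter(model, options);

        // The first run triggers graph compilation for the accelerator,
        // which is why the first frame above is so slow; keep it out of
        // any benchmark numbers.
        tflite.runForMultipleInputsOutputs(new Object[]{warmupInput}, warmupOutputs);
        return tflite;
    }
}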

 


1 Solution

manish_bajaj
NXP Employee

@Geo,

There is an application note on warm-up time on NXP.com:

https://www.nxp.com/docs/en/application-note/AN12964.pdf

Check it out for more details.
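
As a rough illustration of the warm-up effect the app note describes (this sketch is not from AN12964), a benchmark loop can discard the first (compilation) frame and average the rest; tflite, input, and outputs are assumed to be set up as in the question:

public class SteadyStateTimer {

    // Average inference latency over `runs` timed iterations, after one
    // untimed warm-up run that absorbs the NNAPI compilation cost.
    public static long averageLatencyMs(org.tensorflow.lite.Interpreter tflite,
                                        Object input,
                                        java.util.Map<Integer, Object> outputs,
                                        int runs) {
        tflite.runForMultipleInputsOutputs(new Object[]{input}, outputs); // warm-up, not timed
        long totalMs = 0;
        for (int i = 0; i < runs; i++) {
            long start = System.currentTimeMillis();
            tflite.runForMultipleInputsOutputs(new Object[]{input}, outputs);
            totalMs += System.currentTimeMillis() - start;
        }
        return totalMs / runs;
    }
}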

 

-Manish
