i.MX8MP EVK: NNAPI run ssd_mobilenet_v1_coco in Android


Geo
Contributor I

My Environment
Hardware: NXP i.MX8MP EVK A01
Software: Android version 10

I am trying to use NNAPI to run ssd_mobilenet inference on Android. After the model is loaded, the first inference takes much longer than expected; from the second frame onward, the inference time is back to normal.

Is this reasonable?

// calculate inference time (inputBuffer is the preprocessed input,
// outputMap is the Map<Integer, Object> of output buffers)
long start = System.currentTimeMillis();
mTflite.runForMultipleInputsOutputs(new Object[]{inputBuffer}, outputMap);
long end = System.currentTimeMillis();
long inferenceTimeMs = end - start;
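
For reference, the interpreter is set up with the NNAPI delegate roughly like this (a minimal sketch using the standard TensorFlow Lite Java API; the class name, model file, and the max-detection count of 10 are placeholders, not my exact code):

import org.tensorflow.lite.Interpreter;
import org.tensorflow.lite.nnapi.NnApiDelegate;

import java.io.File;
import java.nio.ByteBuffer;
import java.util.HashMap;
import java.util.Map;

class NnapiDetector {
    private final Interpreter mTflite;

    NnapiDetector(File modelFile) {
        // Attach the NNAPI delegate so supported ops run on the accelerator.
        Interpreter.Options options = new Interpreter.Options()
                .addDelegate(new NnApiDelegate());
        mTflite = new Interpreter(modelFile, options);
    }

    Map<Integer, Object> detect(ByteBuffer inputBuffer) {
        // ssd_mobilenet has four output tensors: boxes, classes, scores, count.
        Map<Integer, Object> outputMap = new HashMap<>();
        outputMap.put(0, new float[1][10][4]); // bounding boxes
        outputMap.put(1, new float[1][10]);    // class indices
        outputMap.put(2, new float[1][10]);    // scores
        outputMap.put(3, new float[1]);        // number of detections
        mTflite.runForMultipleInputsOutputs(new Object[]{inputBuffer}, outputMap);
        return outputMap;
    }
}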

 

inference time        | ssd_mobilenet.tflite | ssd_mobilenet_quant.tflite
CPU (1 thread)        | 361 ms               | 307 ms
CPU (4 threads)       | 140 ms               | 112 ms
NNAPI (first frame)   | 2234 ms              | 12722 ms
NNAPI (second frame)  | 1101 ms              | 27 ms

 


1 Solution
manish_bajaj
NXP Employee

@Geo,

There is an application note on warm-up time on NXP.com.

https://www.nxp.com/docs/en/application-note/AN12964.pdf

Check it out for more details.
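
The short version: with NNAPI, the first inference after model load pays a one-time warm-up cost (typically graph compilation for the accelerator), so the numbers in your table are expected. Benchmarks usually run one untimed warm-up inference before measuring. A minimal sketch (the class and method names are mine, assuming the same Interpreter and output map as in your post):

import android.util.Log;
import org.tensorflow.lite.Interpreter;

import java.nio.ByteBuffer;
import java.util.Map;

final class WarmUpBenchmark {
    // Returns the steady-state inference time in ms, excluding the
    // one-time NNAPI warm-up paid on the first run.
    static long timedInference(Interpreter tflite, ByteBuffer input,
                               Map<Integer, Object> outputs) {
        tflite.runForMultipleInputsOutputs(new Object[]{input}, outputs); // warm-up, untimed

        long start = System.currentTimeMillis();
        tflite.runForMultipleInputsOutputs(new Object[]{input}, outputs);
        long elapsed = System.currentTimeMillis() - start;
        Log.d("WarmUpBenchmark", "steady-state inference: " + elapsed + " ms");
        return elapsed;
    }
}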

 

-Manish
