Too slow first Invoke() in TensorFlow Lite 2.2.0 NNAPI


hsaito
Contributor I

Hello,

We are developing an application that uses the NNAPI delegate of TensorFlow Lite 2.2.0 on the i.MX8QM (Linux 5.4.47_2.2.0).

In this application, inference is run separately for each of four cameras.
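For reference, here is a minimal sketch of the kind of per-camera setup we use (TensorFlow Lite 2.2.0 C++ API; the model path and the use of StatefulNnApiDelegate are placeholders, not our exact code):

#include <memory>

#include "tensorflow/lite/delegates/nnapi/nnapi_delegate.h"
#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

// Builds one NNAPI-delegated interpreter for one camera stream.
std::unique_ptr<tflite::Interpreter> BuildNnapiInterpreter(const char* model_path) {
  auto model = tflite::FlatBufferModel::BuildFromFile(model_path);
  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(*model, resolver)(&interpreter);

  // Hand the graph to the NNAPI delegate. The delegate must outlive the
  // interpreter, so it is kept static in this sketch.
  static tflite::StatefulNnApiDelegate nnapi_delegate;
  interpreter->ModifyGraphWithDelegate(&nnapi_delegate);
  interpreter->AllocateTensors();
  return interpreter;
}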

There is one problem: the first Invoke() after initialization is very slow (about 2 to 3 seconds).

Since this happens for each of the four cameras, it takes about 10 seconds in total before inference is running.
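To make the delay concrete, the kind of measurement we do per interpreter looks roughly like this (the contents of the input tensor do not matter for the warm-up timing):

#include <chrono>
#include <iostream>

#include "tensorflow/lite/interpreter.h"

// Times the first Invoke() (which includes the NNAPI warm-up/compilation)
// against a second Invoke() on the same interpreter.
void MeasureFirstInvoke(tflite::Interpreter* interpreter) {
  auto t0 = std::chrono::steady_clock::now();
  interpreter->Invoke();  // First call: about 2-3 s in our case.
  auto t1 = std::chrono::steady_clock::now();
  interpreter->Invoke();  // Subsequent calls: normal inference time.
  auto t2 = std::chrono::steady_clock::now();

  auto ms = [](auto a, auto b) {
    return std::chrono::duration_cast<std::chrono::milliseconds>(b - a).count();
  };
  std::cout << "first Invoke:  " << ms(t0, t1) << " ms\n"
            << "second Invoke: " << ms(t1, t2) << " ms\n";
}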

To speed up the first Invoke(), we tried running it in multiple threads, but in rare cases the inference did not work, so we now run it sequentially instead.
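What we tried looks roughly like the sketch below: one thread per camera runs the first Invoke() so that the four warm-ups overlap (this is the variant that occasionally failed for us):

#include <thread>
#include <vector>

#include "tensorflow/lite/interpreter.h"

// Runs the first Invoke() of every interpreter in its own thread so that
// the per-interpreter NNAPI warm-up happens in parallel.
void WarmUpInParallel(const std::vector<tflite::Interpreter*>& interpreters) {
  std::vector<std::thread> threads;
  threads.reserve(interpreters.size());
  for (tflite::Interpreter* interpreter : interpreters) {
    threads.emplace_back([interpreter] { interpreter->Invoke(); });
  }
  for (std::thread& t : threads) {
    t.join();
  }
}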

Is there any way to speed up the first Invoke()?

0 Replies