How to import custom models using eiq tool


755 views
Rabiraj
Contributor II

I have a face-detection landmark model that works on my PC, and I would like to import it into the MCXN947 MCU.
My model is a non-quantized ONNX model.

I tried to quantize it and convert it into a TFLite model, then used the xxd command to convert it into a C array and import it.
I used the label_image example from the eIQ examples.
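For reference, this is roughly what the xxd step emits as a C source file. This is a sketch: the symbol names are the ones xxd derives from an assumed input filename, and the `alignas` attribute is a manual addition (xxd's raw output is unaligned, while TensorFlow Lite Micro expects an aligned model buffer).

```cpp
// model_data.cc -- roughly what `xxd -i model_int8.tflite` emits.
// The first 8 bytes shown are a typical .tflite header: a flatbuffer
// root-table offset followed by the "TFL3" file identifier.
// xxd does not add `alignas` or `const`; adding them is a common
// manual tweak before compiling the array into the firmware.
alignas(16) const unsigned char model_int8_tflite[] = {
    0x1c, 0x00, 0x00, 0x00, 0x54, 0x46, 0x4c, 0x33,
    // ... remaining model bytes ...
};
const unsigned int model_int8_tflite_len = sizeof(model_int8_tflite);
```

The array is then referenced from the application via an `extern` declaration instead of reading the model from a filesystem.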

When I compile and load it, I get this error:
AllocateTensors() failed!
Failed initializing model
Label image example using a TensorFlow Lite Micro model.
Detection threshold: 23%
Model: mobilenet_v1_0.25_128_quant_int8_npu
Didn't find op for builtin opcode 'PAD'
Failed to get registration from op code PAD


How do I properly import a custom model and run it with a static image as input?

Labels (1)
0 Kudos
1 Reply

723 views
Omar_Anguiano
NXP TechSupport

Please refer to AN14241; it details how to correctly import a model into eIQ so it can be used on the MCX.
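On the specific error: "Didn't find op for builtin opcode 'PAD'" usually means the example's op resolver registers only the ops its bundled model needs, so a custom model containing a PAD layer fails before allocation. A minimal sketch of the fix, assuming the example uses TensorFlow Lite Micro's `MicroMutableOpResolver` (the op list, symbol names, and arena size here are placeholders, not the exact eIQ example code):

```cpp
#include <cstdint>

#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"

extern const unsigned char model_data[];   // C array produced by xxd -i

constexpr int kArenaSize = 256 * 1024;     // model-dependent; tune for your model
alignas(16) static uint8_t tensor_arena[kArenaSize];

tflite::MicroInterpreter* SetupInterpreter() {
  // The template parameter is the maximum number of registered ops;
  // list every op your converted model actually contains.
  static tflite::MicroMutableOpResolver<8> resolver;
  resolver.AddConv2D();
  resolver.AddDepthwiseConv2D();
  resolver.AddAveragePool2D();
  resolver.AddReshape();
  resolver.AddSoftmax();
  resolver.AddPad();   // the op the error message says is missing

  const tflite::Model* model = tflite::GetModel(model_data);
  static tflite::MicroInterpreter interpreter(model, resolver,
                                              tensor_arena, kArenaSize);
  if (interpreter.AllocateTensors() != kTfLiteOk) {
    return nullptr;    // AllocateTensors() also fails when the arena is too small
  }
  return &interpreter;
}
```

If `AllocateTensors()` still fails after the missing op is registered, increasing the tensor arena size is the usual next step.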

Best regards,
Omar

0 Kudos