How to import custom models using the eIQ tool


728 Views
Rabiraj
Contributor II

I have a face-detection landmark model that works on my PC.
I would like to import it into the MCXN947 MCU.
My model is a non-quantized ONNX model.

I quantized it and converted it into a TFLite model, then used the xxd command to convert it into a C array and import it.
I used the label image example from the eIQ examples.
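
For reference, this is roughly how I wire the xxd-generated array into the application (a minimal sketch assuming a recent TensorFlow Lite Micro API; the symbol names are placeholders for whatever xxd -i derives from the model file name):

```cpp
#include "tensorflow/lite/schema/schema_generated.h"

// Symbols emitted by `xxd -i face_landmark_quant.tflite > model_data.h`
// (placeholder names; xxd derives them from the input file name).
extern unsigned char face_landmark_quant_tflite[];
extern unsigned int face_landmark_quant_tflite_len;

// Map the raw byte array onto the TFLite flatbuffer schema.
const tflite::Model* GetFaceLandmarkModel() {
  return tflite::GetModel(face_landmark_quant_tflite);
}
```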

When I compile and load it, I get this error:
AllocateTensors() failed!
Failed initializing model
Label image example using a TensorFlow Lite Micro model.
Detection threshold: 23%
Model: mobilenet_v1_0.25_128_quant_int8_npu
Didn't find op for builtin opcode 'PAD'
Failed to get registration from op code PAD
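
My guess is that the example's op resolver simply does not register the PAD kernel my converted model needs. Below is a minimal sketch of registering it before AllocateTensors(), assuming a recent TensorFlow Lite Micro MicroMutableOpResolver API (the op list and arena size are guesses for my model, not the example's actual code):

```cpp
#include <cstddef>
#include <cstdint>

#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"

// Arena size is model-dependent; this value is only a placeholder.
constexpr size_t kTensorArenaSize = 256 * 1024;
static uint8_t tensor_arena[kTensorArenaSize];

TfLiteStatus SetUpInterpreter(const tflite::Model* model) {
  // Register every builtin op the converted model uses, including PAD,
  // which the stock example apparently does not add.
  static tflite::MicroMutableOpResolver<8> resolver;
  resolver.AddConv2D();
  resolver.AddDepthwiseConv2D();
  resolver.AddPad();            // the op the error complains about
  resolver.AddAveragePool2D();
  resolver.AddFullyConnected();
  resolver.AddReshape();
  resolver.AddSoftmax();
  resolver.AddQuantize();

  static tflite::MicroInterpreter interpreter(
      model, resolver, tensor_arena, kTensorArenaSize);
  // AllocateTensors() also fails when an op is unregistered
  // or the arena is too small.
  return interpreter.AllocateTensors();
}
```

Listing only the ops the model actually uses keeps the footprint small, which is presumably why the example does not register PAD by default.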


How do I properly import a custom model and run it with a static image as input?


696 Views
Omar_Anguiano
NXP TechSupport

Please refer to AN14241; it describes in detail how to correctly import the model into eIQ so it can be used on MCX.

Best regards,
Omar
