How to import custom models using the eIQ tool

Rabiraj
Contributor II

I have a face-detection, landmark-based model that works on my PC.
I would like to import it into the MCXN947 MCU.
My model is a non-quantized ONNX model.

I tried to quantize it and convert it to a TFLite model, then used the xxd command to turn it into a C array and import it.
I used the label image example from the eIQ examples.
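
For reference, a minimal sketch of how an xxd-generated array is typically wired into a TensorFlow Lite Micro project (the file and symbol names here are illustrative assumptions, not taken from the original post):

```cpp
#include "tensorflow/lite/schema/schema_generated.h"

// Hypothetical symbols produced by: xxd -i face_landmark_int8.tflite > model_data.h
// (xxd derives the array name from the input file name).
// Note: xxd emits a plain unsigned char array; TFLite Micro expects the model
// buffer to be suitably aligned, so an alignment attribute (e.g. 16 bytes) may
// need to be added to the array definition when it is copied into the project.
extern const unsigned char face_landmark_int8_tflite[];
extern const unsigned int face_landmark_int8_tflite_len;

// Map the FlatBuffer that xxd turned into a C array in flash.
const tflite::Model* LoadFaceLandmarkModel() {
  return tflite::GetModel(face_landmark_int8_tflite);
}
```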

When I compile and load it, I get this error:
AllocateTensors() failed!
Failed initializing model
Label image example using a TensorFlow Lite Micro model.
Detection threshold: 23%
Model: mobilenet_v1_0.25_128_quant_int8_npu
Didn't find op for builtin opcode 'PAD'
Failed to get registration from op code PAD
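
For context, the "Didn't find op for builtin opcode 'PAD'" message usually means the example's op resolver does not register the PAD kernel, which in turn makes AllocateTensors() fail. A minimal sketch of registering it, assuming the example uses a MicroMutableOpResolver (the op list below is only a guess for a typical quantized CNN and should be adjusted to the ops the converted model actually contains, e.g. as shown by Netron):

```cpp
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"

namespace {
// The template parameter must be >= the number of registered ops.
tflite::MicroMutableOpResolver<8>& GetOpResolver() {
  static tflite::MicroMutableOpResolver<8> resolver;
  static bool initialized = false;
  if (!initialized) {
    resolver.AddPad();               // registers PAD, addressing the reported error
    resolver.AddConv2D();
    resolver.AddDepthwiseConv2D();
    resolver.AddAveragePool2D();
    resolver.AddFullyConnected();
    resolver.AddReshape();
    resolver.AddSoftmax();
    resolver.AddQuantize();
    initialized = true;
  }
  return resolver;
}
}  // namespace
```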


How do I properly import a custom model and run it with a static image as input?
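
A minimal sketch of feeding a static image to the interpreter, assuming an int8-quantized model and a hypothetical image array compiled into the binary (for example via xxd, like the model):

```cpp
#include <cstring>

#include "tensorflow/lite/micro/micro_interpreter.h"

// Hypothetical int8 array holding an already-resized, already-quantized image.
extern const int8_t test_image_data[];

void RunOnStaticImage(tflite::MicroInterpreter& interpreter) {
  TfLiteTensor* input = interpreter.input(0);
  // Copy the static image into the input tensor; its size must match the
  // model's input shape (e.g. 1 x H x W x C int8 values).
  std::memcpy(input->data.int8, test_image_data, input->bytes);

  if (interpreter.Invoke() != kTfLiteOk) {
    // Handle inference failure.
    return;
  }

  TfLiteTensor* output = interpreter.output(0);
  // output->data.int8 holds the quantized results; dequantize them with
  // output->params.scale and output->params.zero_point.
  (void)output;
}
```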

1 Reply

Omar_Anguiano
NXP TechSupport

Please refer to AN14241; it describes in detail how to correctly import the model into eIQ so it can be used on the MCX.

Best regards,
Omar
