EIQ onnx model convert to tf-lite failed


EIQ onnx model convert to tf-lite failed

1,309 Views
wuhuangcangg
Contributor I

Hi,

We are trying to use the TFLite framework with the NPU for machine learning. Here are our steps:

1) train a model with PyTorch

2) export the model in ONNX format

3) convert the model with the eIQ tools

 

and we run into this issue:

(screenshot attached: wuhuangcangg_0-1679914359729.png)

 

We would appreciate your help with these questions:

a) Does eIQ (version 2.7.12) support converting an ONNX model to TFLite format? (See the attached file.)

b) We cannot find a quantization option for float16. Do you know whether the TFLite framework supports NPU (NNAPI) inference with float16 precision? (PS: the original model is ONNX.)

c) Any recommendation for float16 inference on the NPU: ONNX Runtime, TensorFlow, or DeepViewRT? (PS: the original model is ONNX; the BSP version we use is 5.10.72-2.2.2.)

 

We look forward to your response.

Best Regards!

2 Replies

1,280 Views
brian14
NXP TechSupport

Hi @wuhuangcangg

Please find the answers to your questions below:

a) I have reviewed this case and tried to replicate it, and I got the same ONNX-to-TFLite conversion issue in eIQ Model Tool 2.7.5. I then used eIQ Model Tool 2.6.1, and the conversion was successful.

I suggest using eIQ Model Tool 2.6.1, which you will find at this link: eIQ Download

(screenshots attached: Brian_Ibarra_0-1680125845563.png, Brian_Ibarra_1-1680125845566.png)

b) Actually, there is no float16 quantization support in the eIQ Model Tools.
c) I'm not entirely sure about full float16 NPU inference support, but you could try a model quantized to this data type and use NNAPI as the execution provider.
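As a side note, float16 post-training quantization can be produced directly with the stock TFLite converter, outside the eIQ tools. A minimal sketch with a toy Keras model (the architecture is illustrative only, not an eIQ-specific workflow):

```python
# Hypothetical sketch: float16 post-training quantization with the
# standard TFLite converter. Toy model for illustration only.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(2),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
# Keep ops in float16 where possible; falls back to float32 otherwise.
converter.target_spec.supported_types = [tf.float16]
tflite_model = converter.convert()  # serialized .tflite flatbuffer (bytes)
```

The resulting model stores weights in float16; whether the NPU actually executes it in float16 depends on the delegate and BSP, as noted above.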

I hope this information will be helpful.

Have a great day. 


1,271 Views
wuhuangcangg
Contributor I

Thank you very much!

I will try the version you suggested, and we will test with int8 quantization.
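For reference, full-integer (int8) post-training quantization with a representative calibration dataset looks roughly like this. The toy model and random calibration data are placeholders for illustration:

```python
# Hypothetical sketch: int8 post-training quantization with a
# representative dataset. Toy model and random data for illustration.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(2),
])

def representative_data():
    # In practice, yield real samples from the training distribution.
    for _ in range(100):
        yield [np.random.rand(1, 4).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
# Force all ops to int8 so the model can run fully on the NPU.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8
tflite_model = converter.convert()
```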

Have a good day!
