I am trying to compile a MobileNet model (TensorFlow Lite, integer-quantized) using the command below:
C:\nxp\Glow\bin\model-compiler.exe -model=vww_96_int8.tflite -emit-bundle=vww_model -backend=CPU -target=arm -mcpu=cortex-m7 -float-abi=hard -use-cmsis
However, I am getting an error which reads:
Error.cpp:123] exitOnError(Error) got an unexpected ErrorValue:
Error message: TensorFlowLite: Operator 'MEAN' modifies the output type registered in the model!
How can I get around this?
Hello,
The error message suggests that you will need custom implementations for the following operations: TensorListFromTensor, TensorListReserve, or others. You can try converting the model with the TFLITE_BUILTINS and SELECT_TF_OPS flags to reduce the number of ops that need custom implementations (only a subset of ops can be covered this way; others may still require a custom implementation). A conversion sketch with these flags is shown below.
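A minimal conversion sketch with these flags, assuming the model is available as a SavedModel (the paths and file names here are placeholders, not your actual files):

import tensorflow as tf

# Load the model to convert (path is a placeholder).
converter = tf.lite.TFLiteConverter.from_saved_model("vww_96_saved_model")

# Allow both built-in TFLite ops and selected TensorFlow ops,
# so fewer ops fall back to custom implementations.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]

tflite_model = converter.convert()
with open("vww_96_int8.tflite", "wb") as f:
    f.write(tflite_model)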
Regards
I have tried converting the model with the TFLITE_BUILTINS and SELECT_TF_OPS flags, but the error still appears. I have also tried setting converter.allow_custom_ops = True; the relevant settings I used are sketched below.
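Specifically, my converter settings looked like this (a sketch showing only the relevant lines):

converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
# Also tried allowing custom ops; the error was unchanged.
converter.allow_custom_ops = True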
As a side note, I am able to run inference on the TFLite model using the Python API tf.lite.Interpreter, as shown here: https://www.tensorflow.org/lite/guide/inference#load_and_run_a_model_in_python
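For completeness, the inference that works looks roughly like this (a sketch of the linked example; the zero-filled input is just an illustration, not my real preprocessing):

import numpy as np
import tensorflow as tf

# Load the quantized model and allocate tensors.
interpreter = tf.lite.Interpreter(model_path="vww_96_int8.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input matching the model's input shape and dtype.
input_data = np.zeros(input_details[0]["shape"],
                      dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], input_data)

interpreter.invoke()
output_data = interpreter.get_tensor(output_details[0]["index"])
print(output_data)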