Hi,
I would like to convert a TensorFlow SavedModel to a TFLite model that is supported by the i.MX 8M Plus NPU.
I followed the steps below with no success.
python models/research/object_detection/exporter_main_v2.py \
--input_type image_tensor \
--pipeline_config_path training_dir/pipeline.config \
--trained_checkpoint_dir training_dir/checkpoint \
--output_directory exported-model
and I made sure it is a fixed-shape model:
model {
  ssd {
    image_resizer {
      fixed_shape_resizer {
        height: 320
        width: 320
      }
    }
  }
}
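A quick way to confirm that the exported SavedModel really has a fixed input signature is to load it and print the serving signature's input specs (a minimal sketch; the exported-model path matches the export command above):

import tensorflow as tf

# Load the SavedModel produced by exporter_main_v2.py and print the
# input specs of its serving signature, so the shape can be checked.
loaded = tf.saved_model.load("exported-model/saved_model")
infer = loaded.signatures["serving_default"]
for name, spec in infer.structured_input_signature[1].items():
    print(name, spec.shape, spec.dtype)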
and also ensured TFLite-compatible ops:
ssd {
  feature_extractor {
    type: "ssd_mobilenet_v2_fpn_keras"
    use_depthwise: true
  }
  box_predictor {
    convolutional_box_predictor {
      use_depthwise: true
    }
  }
}
TFLite conversion script:
import tensorflow as tf
import pathlib

saved_model_dir = "exported-model/saved_model"
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.optimizations = [tf.lite.Optimize.DEFAULT]

# Provide a representative dataset for INT8 calibration.
def representative_data_gen():
    data_dir = pathlib.Path("dataset/val")
    for img_path in data_dir.glob("*.jpg"):
        img = tf.keras.preprocessing.image.load_img(img_path, target_size=(320, 320))
        img = tf.keras.preprocessing.image.img_to_array(img)
        img = img[tf.newaxis, ...] / 255.0  # add batch dimension, scale to [0, 1]
        yield [img.astype("float32")]

converter.representative_dataset = representative_data_gen
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

tflite_model = converter.convert()
with open("model_int8.tflite", "wb") as f:
    f.write(tflite_model)
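After conversion, it is worth confirming that the quantized model really has uint8 input/output and listing which ops ended up in the graph (a minimal sketch; tf.lite.experimental.Analyzer requires a recent TensorFlow version):

import tensorflow as tf

# List the ops the converter actually emitted; this helps spot ops
# such as EXP that the NPU delegate/runtime may not support.
tf.lite.experimental.Analyzer.analyze(model_path="model_int8.tflite")

# Confirm the input/output tensors were quantized to uint8 as requested.
interpreter = tf.lite.Interpreter(model_path="model_int8.tflite")
interpreter.allocate_tensors()
print(interpreter.get_input_details()[0]["dtype"])   # expect numpy.uint8
print(interpreter.get_output_details()[0]["dtype"])  # expect numpy.uint8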
Command to run inference with model_int8.tflite:
$ USE_GPU_INFERENCE=0 \
python3 label_image.py -m model_int8.tflite \
-e /usr/lib/liblitert_vx_delegate.so
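For reference, the same delegate can also be loaded directly from Python instead of going through label_image.py (a minimal sketch, assuming tflite_runtime is installed on the board and using the delegate path from the command above):

import numpy as np
from tflite_runtime.interpreter import Interpreter, load_delegate

# Load the INT8 model with the external VX (NPU) delegate.
delegate = load_delegate("/usr/lib/liblitert_vx_delegate.so")
interpreter = Interpreter(model_path="model_int8.tflite",
                          experimental_delegates=[delegate])
interpreter.allocate_tensors()

# Run a dummy uint8 frame matching the fixed 320x320 input.
inp = interpreter.get_input_details()[0]
interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=np.uint8))
interpreter.invoke()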
Please let me know whether these steps are correct.
I also tried converting the model with the eIQ Toolkit, but that did not work either.
Related Post : https://community.nxp.com/t5/i-MX-Processors/Tensorflow-Savedmodel-to-tflite-conversion-which-suppor...
I have been facing this problem for almost a month with no solution; the steps above do not work. Is it possible to develop a TF SavedModel with TFLite-supported ops from scratch?
Hi,
1. Your TFLite model uses the EXP op, version 2, which is not supported by the runtime you are using.
2. The NPU delegate or TFLite runtime on your i.MX 8M Plus device is too old to support this op version.
3. You need to ensure that your model avoids ops like EXP, or uses only version 1 of them. To do this, set converter.experimental_new_converter = False to force the old converter (which may avoid newer op versions); see the sketch below.
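A minimal sketch of that suggestion applied to the conversion script above (experimental_new_converter is an existing TFLiteConverter attribute; whether the legacy converter handles this particular model is not guaranteed):

import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("exported-model/saved_model")
# Force the legacy converter, which may emit older op versions
# (e.g. EXP version 1) that older runtimes and delegates still support.
converter.experimental_new_converter = False
tflite_model = converter.convert()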
Best Regards,
Zhiming
Hi,
Thanks for the reply. I tried the conversion code I mentioned above, with no success.
Hi @Zhiming_Liu,
Thanks for the reply.
Could you please mention the BSP version and TFLite version?
Thanks and Regards,
Subba Reddy
If you are using an NXP EVK, you can download the demo image from here.
If you are using a third-party board, please contact them to get the latest BSP they have ported.
Best Regards,
Zhiming
Hi @subbareddyai,
Please consider upgrading the BSP version to update the TFLite version.
Best Regards,
Zhiming