Hi,
I would like to convert a TensorFlow SavedModel to a TFLite model that runs on the i.MX 8M Plus NPU.
I followed the steps below with no success.
python models/research/object_detection/exporter_main_v2.py \
--input_type image_tensor \
--pipeline_config_path training_dir/pipeline.config \
--trained_checkpoint_dir training_dir/checkpoint \
--output_directory exported-model
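In case it helps with debugging, here is a stdlib-only check of the export layout; exporter_main_v2.py writes the SavedModel under <output_directory>/saved_model. The check_saved_model helper and the stand-in temp directory are mine, for illustration, not part of the Object Detection API:

```python
import pathlib
import tempfile

def check_saved_model(export_dir):
    """exporter_main_v2.py writes <output_directory>/saved_model containing
    saved_model.pb and a variables/ subdirectory; verify that layout exists."""
    p = pathlib.Path(export_dir) / "saved_model"
    return (p / "saved_model.pb").is_file() and (p / "variables").is_dir()

# Demo against a stand-in directory (the real check would pass "exported-model"):
with tempfile.TemporaryDirectory() as d:
    sm = pathlib.Path(d) / "saved_model"
    (sm / "variables").mkdir(parents=True)
    (sm / "saved_model.pb").write_bytes(b"")
    print(check_saved_model(d))  # True
```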
and I made sure it is a fixed-shape model:

model {
  ssd {
    image_resizer {
      fixed_shape_resizer {
        height: 320
        width: 320
      }
    }
  }
}
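As a quick sanity check that the config really pins the input shape, the values can be pulled out of the pipeline.config text. This is a sketch using a plain regex rather than the protobuf API; the inline config fragment is just the example above repeated:

```python
import re

# Example pipeline.config fragment (stand-in for the real file contents).
config_text = """
model {
  ssd {
    image_resizer {
      fixed_shape_resizer {
        height: 320
        width: 320
      }
    }
  }
}
"""

def fixed_shape(config):
    """Return (height, width) from the fixed_shape_resizer block, or None."""
    m = re.search(
        r"fixed_shape_resizer\s*\{[^}]*height:\s*(\d+)[^}]*width:\s*(\d+)",
        config,
    )
    return (int(m.group(1)), int(m.group(2))) if m else None

print(fixed_shape(config_text))  # (320, 320)
```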
and also ensured TFLite-compatible ops:

ssd {
  feature_extractor {
    type: "ssd_mobilenet_v2_fpn_keras"
    use_depthwise: true
  }
  box_predictor {
    convolutional_box_predictor {
      use_depthwise: true
    }
  }
}
TFLite conversion script:
import tensorflow as tf
import pathlib

saved_model_dir = "exported-model/saved_model"
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.optimizations = [tf.lite.Optimize.DEFAULT]

# Provide representative dataset for INT8 calibration
def representative_data_gen():
    data_dir = pathlib.Path("dataset/val")
    for img_path in data_dir.glob("*.jpg"):
        img = tf.keras.preprocessing.image.load_img(img_path, target_size=(320, 320))
        img = tf.keras.preprocessing.image.img_to_array(img)
        img = img[tf.newaxis, ...] / 255.0
        yield [img.astype("float32")]

converter.representative_dataset = representative_data_gen
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

tflite_model = converter.convert()
with open("model_int8.tflite", "wb") as f:
    f.write(tflite_model)
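For reference, with inference_input_type and inference_output_type set to tf.uint8 the converter attaches a (scale, zero_point) pair to each tensor, and callers map real values with q = round(real / scale) + zero_point. A stdlib-only sketch of that mapping follows; the scale and zero_point values are illustrative, not read from an actual model:

```python
def quantize(real, scale, zero_point):
    """Map a real value to uint8: q = round(real / scale) + zero_point."""
    q = round(real / scale) + zero_point
    return max(0, min(255, q))  # clamp to the uint8 range

def dequantize(q, scale, zero_point):
    """Inverse mapping: real = scale * (q - zero_point)."""
    return scale * (q - zero_point)

# Illustrative parameters (a power-of-two scale keeps the arithmetic exact here).
scale, zero_point = 1.0 / 128.0, 0

print(quantize(0.25, scale, zero_point))   # 32
print(dequantize(32, scale, zero_point))   # 0.25
print(quantize(10.0, scale, zero_point))   # 255 (clamped)
```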
Command to run inference with model_int8.tflite:
$ USE_GPU_INFERENCE=0 \
python3 label_image.py -m model_int8.tflite \
-e /usr/lib/liblitert_vx_delegate.so
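One quick check before handing model_int8.tflite to the delegate: a TFLite flatbuffer carries the file identifier "TFL3" at byte offset 4, which can be verified with the stdlib alone. The looks_like_tflite helper is mine, for illustration:

```python
def looks_like_tflite(data):
    """A TFLite flatbuffer stores the identifier b'TFL3' at bytes 4-8."""
    return len(data) >= 8 and data[4:8] == b"TFL3"

# Usage with the converted model from the script above:
# with open("model_int8.tflite", "rb") as f:
#     print(looks_like_tflite(f.read()))

print(looks_like_tflite(b"\x1c\x00\x00\x00TFL3"))  # True
print(looks_like_tflite(b"not a tflite file"))     # False
```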
Please let me know whether these steps are correct.
I also tried converting the model using the eIQ tool, but that did not work either.
Related post: https://community.nxp.com/t5/i-MX-Processors/Tensorflow-Savedmodel-to-tflite-conversion-which-suppor...
I have been facing this problem for almost a month with no solution; the steps above do not work. Is it possible to develop a TF SavedModel with TFLite-supported ops from scratch?