TensorFlow SavedModel to TFLite conversion which supports i.MX8MP NPU


502 Views
subbareddyai
Contributor II

Hi,

I would like to convert a TensorFlow SavedModel to a TFLite model that runs on the i.MX8MP NPU.

I followed the steps below with no success.

python models/research/object_detection/exporter_main_v2.py \
    --input_type image_tensor \
    --pipeline_config_path training_dir/pipeline.config \
    --trained_checkpoint_dir training_dir/checkpoint \
    --output_directory exported-model

and I made sure it is a fixed-shape model:

model {
  ssd {
    image_resizer {
      fixed_shape_resizer {
        height: 320
        width: 320
      }
    }
  }
}

 

and I also ensured TFLite-compatible ops:

ssd {
  feature_extractor {
    type: "ssd_mobilenet_v2_fpn_keras"
    use_depthwise: true
  }
  box_predictor {
    convolutional_box_predictor {
      use_depthwise: true
    }
  }
}

TFLite conversion script:

import tensorflow as tf
import pathlib

saved_model_dir = "exported-model/saved_model"

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.optimizations = [tf.lite.Optimize.DEFAULT]

# Provide representative dataset for INT8 calibration
def representative_data_gen():
    data_dir = pathlib.Path("dataset/val")
    for img_path in data_dir.glob("*.jpg"):
        img = tf.keras.preprocessing.image.load_img(img_path, target_size=(320, 320))
        img = tf.keras.preprocessing.image.img_to_array(img)
        img = img[tf.newaxis, ...] / 255.0
        yield [img.astype("float32")]

converter.representative_dataset = representative_data_gen
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

tflite_model = converter.convert()

with open("model_int8.tflite", "wb") as f:
    f.write(tflite_model)
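(A quick sanity check, assuming the conversion above succeeds: confirm the converted file really carries uint8 input/output tensors, which full-integer execution on the NPU expects.)

# Continues the script above: reload the flatbuffer and inspect the
# tensor dtypes; with the settings above both should report numpy.uint8.
interpreter = tf.lite.Interpreter(model_path="model_int8.tflite")
interpreter.allocate_tensors()
print(interpreter.get_input_details()[0]["dtype"])
print(interpreter.get_output_details()[0]["dtype"])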

 

Command to run inference with model_int8.tflite:

$ USE_GPU_INFERENCE=0 \
  python3 label_image.py -m model_int8.tflite \
  -e /usr/lib/liblitert_vx_delegate.so
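For reference, the delegate can also be loaded directly from Python (a minimal sketch; the delegate path is the one from the command above and may differ between BSP releases, e.g. /usr/lib/libvx_delegate.so on older images):

import numpy as np
import tflite_runtime.interpreter as tflite

# Load the VX (NPU) delegate and attach it to the interpreter
delegate = tflite.load_delegate("/usr/lib/liblitert_vx_delegate.so")
interpreter = tflite.Interpreter(
    model_path="model_int8.tflite",
    experimental_delegates=[delegate],
)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# uint8 input matches the inference_input_type set during conversion
dummy = np.zeros(inp["shape"], dtype=np.uint8)
interpreter.set_tensor(inp["index"], dummy)
interpreter.invoke()
print(interpreter.get_tensor(out["index"]).shape)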

Please let me know whether these steps are correct.

I also tried converting the model using the eIQ tool, but it did not work.

Related post: https://community.nxp.com/t5/i-MX-Processors/Tensorflow-Savedmodel-to-tflite-conversion-which-suppor...

I have been facing this problem for almost a month with no solution; the steps above do not work. Is it possible to develop a TF SavedModel with TFLite-supported ops from scratch?

5 Replies

474 Views
Zhiming_Liu
NXP TechSupport

Hi,

1. Your TFLite model uses the EXP op, version 2, which is not supported by the runtime you are using.

2. The NPU delegate or TFLite runtime on your i.MX8MP device is too old to support this op version.

3. You need to ensure that your model avoids ops like EXP, or uses only version 1 of them. To do this, set converter.experimental_new_converter = False to force the old converter (which may avoid newer op versions).
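For example, you can list which ops the converter actually emitted (a quick sketch, assuming TF 2.7 or newer, where the analyzer API is available):

import tensorflow as tf

# Prints the model graph op by op; check whether EXP (or other ops the
# VX delegate cannot handle) appears in the converted flatbuffer.
tf.lite.experimental.Analyzer.analyze(model_path="model_int8.tflite")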


Best Regards,
Zhiming


440 Views
subbareddyai
Contributor II

Hi,

Thanks for the reply. I tried the conversion code below, with no success:

import tensorflow as tf

# Path to your exported SavedModel directory
saved_model_dir = r"ssd-mobilenet-v2-tensorflow2-fpnlite-320x320-v1"

# Load the SavedModel
model = tf.saved_model.load(saved_model_dir)

# Get the concrete function for serving
concrete_func = model.signatures["serving_default"]

# Fix input shape (must match your model resolution, here 320x320x3)
concrete_func.inputs[0].set_shape([1, 320, 320, 3])

# Create TFLite converter
converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func])

# Force old converter (avoids ops like EXP v2)
converter.experimental_new_converter = False

# Restrict to built-in ops only (for NPU compatibility)
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS]

# (Optional) Quantization for smaller/faster model
# converter.optimizations = [tf.lite.Optimize.DEFAULT]

# Convert to TFLite
tflite_model = converter.convert()

# Save the converted model
output_file = "ssd_mobilenet_v2_fixed.tflite"
with open(output_file, "wb") as f:
    f.write(tflite_model)

print(f" Conversion complete. Saved as {output_file}"
 
It would be great if there were a complete procedure to convert a SavedModel to a TFLite model that supports the i.MX8MP NPU.

380 Views
subbareddyai
Contributor II

Hi @Zhiming_Liu ,

Thanks for the reply. 

Could you please mention the BSP version and TFLite version?

 

Thanks and Regards,

Subba Reddy


377 Views
Zhiming_Liu
NXP TechSupport

Hi @subbareddyai 

 

If you are using an NXP EVK, you can download the demo image from here:

https://www.nxp.com/design/design-center/software/embedded-software/i-mx-software/embedded-linux-for...

 

If you are using a third-party board, please contact the vendor to get the latest BSP they have ported.



Best Regards,
Zhiming


383 Views
Zhiming_Liu
NXP TechSupport

Hi @subbareddyai 

Please consider upgrading the BSP version to update the TFLite version.
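To see which TFLite runtime your current image ships (a quick sketch; the exact package depends on how TFLite was installed in the BSP):

# Run on the board: whichever import succeeds reports the runtime version.
try:
    import tflite_runtime
    print("tflite_runtime:", tflite_runtime.__version__)
except ImportError:
    import tensorflow as tf
    print("tensorflow:", tf.__version__)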

Best Regards,
Zhiming
