Tensorflow Savedmodel to tflite conversion which supports imx8mp NPU


790 Views
subbareddyai
Contributor II

Hi,

I would like to convert a TensorFlow SavedModel to a TFLite model that runs on the i.MX 8M Plus NPU.

I followed the steps below without success:

python models/research/object_detection/exporter_main_v2.py \
--input_type image_tensor \
--pipeline_config_path training_dir/pipeline.config \
--trained_checkpoint_dir training_dir/checkpoint \
--output_directory exported-model

I also made sure it is a fixed-shape model:

model {
  ssd {
    image_resizer {
      fixed_shape_resizer {
        height: 320
        width: 320
      }
    }
  }
}
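Before converting, it can help to double-check that the exported SavedModel's serving signature really has a fully defined input shape. The helper below is a minimal sketch; the commented `tf.saved_model` usage assumes the `exported-model/saved_model` path and default signature name from the export step above.

```python
# Sketch: verify an input shape is fixed (all non-batch dims are concrete ints)
# before handing the SavedModel to the TFLite converter.

def is_fixed_shape(shape):
    """True if every dimension after the batch dim is a positive integer."""
    return all(isinstance(d, int) and d > 0 for d in shape[1:])

# Usage with TensorFlow (path/signature assumed from the export step above):
#   import tensorflow as tf
#   sig = tf.saved_model.load("exported-model/saved_model").signatures["serving_default"]
#   for t in sig.inputs:
#       print(t.name, t.shape.as_list(), "fixed:", is_fixed_shape(t.shape.as_list()))

print(is_fixed_shape([None, 320, 320, 3]))    # True  (batch dim may be None)
print(is_fixed_shape([None, None, None, 3]))  # False (dynamic spatial dims)
```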

 

I also ensured TFLite-compatible ops:

ssd {
  feature_extractor {
    type: "ssd_mobilenet_v2_fpn_keras"
    use_depthwise: true
  }
  box_predictor {
    convolutional_box_predictor {
      use_depthwise: true
    }
  }
}

TFLite conversion script:

import tensorflow as tf
import pathlib

saved_model_dir = "exported-model/saved_model"

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.optimizations = [tf.lite.Optimize.DEFAULT]

# Provide representative dataset for INT8 calibration
def representative_data_gen():
    data_dir = pathlib.Path("dataset/val")
    for img_path in data_dir.glob("*.jpg"):
        img = tf.keras.preprocessing.image.load_img(img_path, target_size=(320, 320))
        img = tf.keras.preprocessing.image.img_to_array(img)
        img = img[tf.newaxis, ...] / 255.0
        yield [img.astype("float32")]

converter.representative_dataset = representative_data_gen
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

tflite_model = converter.convert()

with open("model_int8.tflite", "wb") as f:
    f.write(tflite_model)

 

Command to run inference with model_int8.tflite:

$ USE_GPU_INFERENCE=0 \
python3 label_image.py -m model_int8.tflite \
-e /usr/lib/liblitert_vx_delegate.so 

 

Please let me know whether these steps are correct.

All of the above steps came from ChatGPT.

0 Kudos
Reply
4 Replies

721 Views
pengyong_zhang
NXP Employee

Hi @subbareddyai 

I will be on vacation for a week and may not reply during this period. If you are in a hurry, you can create a new ticket and my other colleagues will support you. Thanks for your understanding.

B.R

0 Kudos
Reply

711 Views
subbareddyai
Contributor II

Hi,

Thanks for the reply.

I will try to create a new ticket on this; meanwhile, please support here after your vacation.

 

Thanks and Regards,

GV Subba Reddy

 

0 Kudos
Reply

764 Views
pengyong_zhang
NXP Employee

Hi @subbareddyai 

You can use our eIQ tool to convert the model.

https://www.nxp.com/design/design-center/software/eiq-ai-development-environment/eiq-toolkit-for-end-to-end-model-development-and-deployment:EIQ-TOOLKIT

B.R

0 Kudos
Reply

748 Views
subbareddyai
Contributor II

I tried converting with the eIQ tool and TFLite. The conversion with quantization to INT8 succeeded (see the attached image for the quantization settings I used), but when I run inference with the TFLite model, I get the error below:

Failed to load delegate or model with delegate. Trying without delegate. Error: Didn't find op for builtin opcode 'EXP' version '2'. An older version of this builtin might be supported. Are you using an old TFLite binary with a newer model?
Registration failed.

Traceback (most recent call last):
  File "inference_quant_int8.py", line 50, in <module>
    interpreter = Interpreter(model_path=MODEL_PATH, experimental_delegates=[delegate])
  File "/home/root/miniforge3/envs/tflite/lib/python3.8/site-packages/tflite_runtime/interpreter.py", line 455, in __init__
    _interpreter_wrapper.CreateWrapperFromFile(
ValueError: Didn't find op for builtin opcode 'EXP' version '2'. An older version of this builtin might be supported. Are you using an old TFLite binary with a newer model?
Registration failed.
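An "op version '2' not found" failure like this typically means the model was converted with a newer TensorFlow than the tflite_runtime on the board supports (newer converters can emit newer op versions). A small sketch for comparing the two versions; on the host the converter version is `tf.__version__`, on the board it is `tflite_runtime.__version__` (assuming the standard pip packages), and the version strings below are made up for illustration:

```python
def runtime_older_than_converter(runtime_ver, converter_ver):
    """Compare dotted version strings like '2.8.0' vs '2.15.0'."""
    as_tuple = lambda v: tuple(int(p) for p in v.split(".")[:3])
    return as_tuple(runtime_ver) < as_tuple(converter_ver)

# Host:  python3 -c "import tensorflow as tf; print(tf.__version__)"
# Board: python3 -c "import tflite_runtime; print(tflite_runtime.__version__)"
print(runtime_older_than_converter("2.8.0", "2.15.0"))  # True → likely mismatch
```

If the runtime is indeed older, re-converting with a TensorFlow version matching the BSP's tflite_runtime (or updating the runtime on the board) usually resolves this kind of op-version mismatch.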

 

Tags (1)
0 Kudos
Reply