i.MX8MP person detection demo with NNstreamer

ariefraja
Contributor I

Hi,

As per the guidance in i.MX_Machine_Learning_User's_Guide.pdf, I am able to run the "Vision Pipeline with NNStreamer" demo on an i.MX8MP eval board using a USB camera and an HDMI display.

I can see the person detection demo working fine. 

Now I want to run person detection using my own model, created with YOLOv3.

I converted the my_model.h5 file to TFLite using a Python script.

But NNStreamer does not seem to accept my model.

My questions are:

1. Does NNStreamer only work with YOLOv5 models?

2. Is there an NNStreamer example that can run a YOLOv3 model?

I copied the conversion script below.

Below is the error from my board:

export MODEL=$(pwd)/my_model.tflite

export LABELS=$(pwd)/labels.txt

gst-launch-1.0 --no-position v4l2src device=/dev/video3 ! video/x-raw,width=416,height=416,framerate=10/1 ! tee name=t t. ! queue max-size-buffers=2 leaky=2 ! imxvideoconvert_g2d ! video/x-raw,width=416,height=416,format=RGBA ! videoconvert ! video/x-raw,format=RGB ! tensor_converter ! tensor_filter framework=tensorflow-lite model=${MODEL} custom=Delegate:External,ExtDelegateLib:libvx_delegate.so ! tensor_decoder mode=bounding_boxes option1=tf-ssd option2=${LABELS} option3=0:1:2:3,50 option4=640:480 option5=300:300 ! mix. t. ! queue max-size-buffers=2 ! imxcompositor_g2d name=mix sink_0::zorder=2 sink_1::zorder=1 ! waylandsink
** Message: 13:41:07.732: accl = cpu
Caught SIGSEGV
#0 0x0000ffffa6d08f58 in __GI___wait4 (pid=<optimized out>,
#1 0x0000ffffa6ea4588 in g_on_error_stack_trace ()
#2 0x0000aaaab04e6380 in ?? ()
#3 <signal handler called>
#4 0x0000ffff8d4ec1c0 in tflite::FlatBufferModel::ReadAllMetadata[abi:cxx11]() const () from /usr/lib/libtensorflow-lite.so.2.8.0
#5 0x0000ffff8d4e689c in tflite::InterpreterBuilder::InterpreterBuilder(tflite::FlatBufferModel const&, tflite::OpResolver const&) ()
#6 0x0000ffff8d634338 in TFLiteInterpreter::loadModel(int, tflite_delegate_e)
#7 0x0000ffff8d634f2c in TFLiteCore::loadModel() ()
#8 0x0000ffff8d635b8c in TFLiteCore::init(tflite_option_s*) ()
#9 0x0000ffff8d635dfc in ?? ()
#10 0x0000ffff8e7ad7c0 in gst_tensor_filter_common_open_fw ()
#11 0x0000ffff8e7adc5c in gst_tensor_filter_load_tensor_info ()
#12 0x0000ffff8e80bd54 in ?? () from /usr/lib/gstreamer-1.0/libnnstreamer.so
#13 0x0000ffffa6a3864c in ?? () from /usr/lib/libgstbase-1.0.so.0
#14 0x0000ffffa6a3c208 in ?? () from /usr/lib/libgstbase-1.0.so.0
#15 0x0000ffffa707837c in gst_pad_query () from /usr/lib/libgstreamer-1.0.so.0
#16 0x0000ffffa70bc894 in gst_pad_query_caps ()
#17 0x0000ffffa70bd560 in gst_element_get_compatible_pad ()
#18 0x0000ffffa70be49c in gst_element_link_pads_full ()
#19 0x0000ffffa70beaf4 in gst_element_link_pads_filtered ()
#20 0x0000ffffa701a14c in ?? () from /usr/lib/libgstreamer-1.0.so.0
#21 0x0000ffffa70d2a84 in gst_parse_launch_full ()
#22 0x0000ffffa70d2d24 in gst_parse_launchv_full ()
#23 0x0000aaaab04e3d7c in ?? ()
#24 0x0000ffffa6c7b230 in __libc_start_call_main (
#25 0x0000ffffa6c7b30c in __libc_start_main_impl (main=0xaaaab04e3a80,
#26 0x0000aaaab04e4370 in _start ()
Spinning. Please run 'gdb gst-launch-1.0 1620' to continue debugging, Ctrl-C to quit, or Ctrl-\ to dump core.
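Since the segfault occurs inside tflite::FlatBufferModel while the model file is being read, a first check independent of GStreamer is to load the converted .tflite file directly with the TensorFlow Lite Python interpreter; if this also fails, the problem is in the converted file rather than in the NNStreamer pipeline. Below is a minimal sketch of that check; the small Conv2D model is only a stand-in for the real YOLOv3 network, whose .h5 file is not available here:

```python
import tensorflow as tf

# Small stand-in for the real YOLOv3 Keras model (the original
# Yolo_v3_Tf.h5 is not available here); in practice the model would be
# loaded with load_model() as in the conversion script below.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 3, padding="same", input_shape=(416, 416, 3)),
])

# Convert and write the model the same way the conversion script does.
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()
with open("my_model.tflite", "wb") as f:
    f.write(tflite_model)

# Load the file back with the TFLite interpreter; a failure here points
# at the conversion step rather than at NNStreamer.
interpreter = tf.lite.Interpreter(model_path="my_model.tflite")
interpreter.allocate_tensors()
for d in interpreter.get_input_details():
    print("input:", d["shape"], d["dtype"])
for d in interpreter.get_output_details():
    print("output:", d["shape"], d["dtype"])
```

If the interpreter loads the file and reports fixed input/output shapes, the .tflite file itself is at least structurally valid.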

 

Here is how I converted my_model.h5 to TFLite:

import tensorflow as tf
from keras.models import load_model
import os

 

# convert keras model to tflite
def get_file_size(file_path):
    size = os.path.getsize(file_path)
    return size

def convert_bytes(size, unit=None):
    if unit == "KB":
        print('File size: ' + str(round(size / 1024, 3)) + ' Kilobytes')
    elif unit == "MB":
        print('File size: ' + str(round(size / (1024 * 1024), 3)) + ' Megabytes')
    else:
        print('File size: ' + str(size) + ' bytes')

 

model = load_model("Yolo_v3_Tf.h5")

TF_LITE_MODEL_FILE_NAME = "my_model.tflite"
tf_lite_converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = tf_lite_converter.convert()
tflite_model_name = TF_LITE_MODEL_FILE_NAME
open(tflite_model_name, "wb").write(tflite_model)
convert_bytes(get_file_size(TF_LITE_MODEL_FILE_NAME), "KB")

# Convert the model.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
# or using another method

# Save the model.
with open('my_model.tflite', 'wb') as f:
    f.write(tflite_model)
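One thing worth checking in the conversion above: YOLOv3 Keras models are often built with a dynamic spatial input shape such as (None, None, 3), and converting such a model can produce a .tflite file with dynamic tensors that an embedded loader may not handle. A hedged sketch of pinning the input to 416x416 before conversion (the small model here is again only a stand-in for Yolo_v3_Tf.h5, and my_model_fixed.tflite is a hypothetical output name):

```python
import tensorflow as tf

# Stand-in for a YOLOv3-style model with a dynamic spatial input shape;
# replace with load_model("Yolo_v3_Tf.h5") in practice.
inp = tf.keras.Input(shape=(None, None, 3))
out = tf.keras.layers.Conv2D(8, 3, padding="same")(inp)
dynamic_model = tf.keras.Model(inp, out)

# Rebuild the graph with a concrete 416x416 input so the converted
# .tflite file carries fixed tensor shapes.
fixed_input = tf.keras.Input(shape=(416, 416, 3))
fixed_model = tf.keras.Model(fixed_input, dynamic_model(fixed_input))

converter = tf.lite.TFLiteConverter.from_keras_model(fixed_model)
tflite_model = converter.convert()
with open("my_model_fixed.tflite", "wb") as f:
    f.write(tflite_model)

# Confirm the converted model now reports a fixed input shape.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
print(interpreter.get_input_details()[0]["shape"])
```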

 


araja
Contributor III

Hi @Dhruvit,

I am doing well and hope the same for you.

Thank you very much for the response.

I will check NNStreamer with the YOLOv3 model and update you soon.

Regards,

Arief Raja


Dhruvit
NXP TechSupport

Hi @ariefraja,

I hope you are doing well.
 
1. Does NNStreamer only work with YOLOv5 models?
=> No.
 

2. Is there an NNStreamer example that can run a YOLOv3 model?

=> You may check this NNStreamer example, which uses the Vivante YOLOv3 model:

https://github.com/nnstreamer/nnstreamer-example/tree/main/Tizen.platform/Vivante_pipeline_NonGUI_yo...

Thanks & Regards,
Dhruvit Vasavada