Hello,
I am using the Boundary Devices evaluation board for the i.MX 8M Plus:
https://boundarydevices.com/product/universal-smarc-carrier-board/
I am testing object-detection capability following this manual (page 55):
https://www.nxp.com/docs/en/user-guide/IMX-MACHINE-LEARNING-UG.pdf
I am trying to modify the following pipeline to stream over UDP instead of rendering to a display:
gst-launch-1.0 --no-position v4l2src device=/dev/video3 ! \
video/x-raw,width=640,height=480,framerate=30/1 ! \
tee name=t t. ! queue max-size-buffers=2 leaky=2 ! \
imxvideoconvert_g2d ! \
video/x-raw,width=300,height=300,format=RGBA ! \
videoconvert ! video/x-raw,format=RGB ! \
tensor_converter ! \
tensor_filter framework=tensorflow-lite model=${MODEL} \
custom=Delegate:External,ExtDelegateLib:libvx_delegate.so ! \
tensor_decoder mode=bounding_boxes option1=mobilenet-ssd-postprocess option2=${LABELS} \
option3=0:1:2:3,50 option4=640:480 option5=300:300 ! \
mix. t. ! queue max-size-buffers=2 ! \
imxcompositor_g2d name=mix latency=30000000 min-upstream-latency=30000000 \
sink_0::zorder=2 sink_1::zorder=1 ! waylandsink
I cannot run the above because my display is not configured and is not needed.
I have successfully tested streaming out with the following pipeline, with no issues:
$GSTL -v v4l2src device=$USB_CAM_1 ! imxvideoconvert_g2d ! "video/x-raw, width=1920, height=1080, framerate=30/1" ! vpuenc_h264 ! rtph264pay pt=96 ! rtpstreampay ! udpsink host=$HOST_IP port=$HOST_PORT_1 sync=false
When I modify the desired object-detection pipeline to stream out (as shown below), it appears to run without errors. However, I do not get any frames (with or without overlay) on the client side.
I have used Wireshark to monitor the incoming port on the PC and see no packets arriving. I suspect the issue is in the imxcompositor_g2d to imxvideoconvert_g2d/vpuenc_h264 link.
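To isolate that link, one possibility is a sketch like the following, which replaces the camera and NNStreamer branches with videotestsrc and feeds the compositor output straight into the encoder. Note the assumptions: videotestsrc stands in for the real sources, and NV12 is used instead of RGB16 on the encoder input, since RGB16 may not be a format vpuenc_h264 negotiates. $HOST_IP_1/$HOST_PORT_1 are the same variables as below.

```shell
# Hypothetical isolation test: if this also produces no packets, the
# problem is in the compositor -> convert -> encoder section rather than
# in the detection branches.
gst-launch-1.0 -v videotestsrc is-live=true ! \
  video/x-raw,width=640,height=480,framerate=30/1 ! \
  imxcompositor_g2d name=mix sink_0::zorder=1 ! \
  imxvideoconvert_g2d ! video/x-raw,format=NV12 ! \
  vpuenc_h264 ! rtph264pay config-interval=1 pt=96 ! rtpstreampay ! \
  udpsink host=$HOST_IP_1 port=$HOST_PORT_1 sync=false
```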
Any suggestions or help is much appreciated.
Thank you.
# DOES NOT ERROR BUT CANNOT SEE THE STREAM
$GSTL -vvv v4l2src device=$USB_CAM_1 ! video/x-raw, width=640, height=480, framerate=30/1 \
! tee name=t t. ! queue leaky=2 max-size-buffers=2 ! imxvideoconvert_g2d ! video/x-raw,width=300,height=300,format=RGBA ! \
videoconvert ! video/x-raw,format=RGB ! \
tensor_converter ! \
tensor_filter framework=tensorflow-lite model=${MODEL} \
custom=Delegate:External,ExtDelegateLib:libvx_delegate.so ! \
tensor_decoder mode=bounding_boxes option1=mobilenet-ssd-postprocess option2=${LABELS} \
option3=0:1:2:3,50 option4=640:480 option5=300:300 ! \
mix. t. ! queue max-size-buffers=2 ! \
imxcompositor_g2d name=mix latency=30000000 min-upstream-latency=30000000 \
sink_0::zorder=2 sink_1::zorder=1 ! \
imxvideoconvert_g2d ! "video/x-raw, width=640, height=480, framerate=30/1,format=RGB16" \
! vpuenc_h264 ! rtph264pay config-interval=1 pt=96 ! rtpstreampay ! udpsink host=$HOST_IP_1 port=$HOST_PORT_1 async=false sync=false
Output of the pipeline: note that the counters are incrementing, which makes me believe the pipeline is running. However, no packets are received on the client side.
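One way to confirm whether packets are actually leaving the board (rather than being dropped on the network or PC side) is to capture on the board itself. This is only a sketch; the interface name and port are assumptions to be replaced with the actual values.

```shell
# Hypothetical check on the i.MX board: capture a few outgoing UDP
# packets on the assumed interface/port while the pipeline runs.
# If nothing is captured, the pipeline is not pushing buffers to udpsink.
tcpdump -i eth0 -c 10 udp port 5000
```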
Could you reproduce this issue on an NXP board?
Sorry for missing this message. What is the status? If it is still failing, let me reproduce this on my NXP EVK board.
Yes, it is still happening.
I'm on vacation now; I will reproduce this issue when I return to the office. How have you set up the client side? You can also use nnshark to check this.
Yes.
The pasted print statements are from an SSH session on the NXP SoM. Did I understand your question correctly?
On this eval board I have tested the following successfully:
1. I can stream out from the cameras using GStreamer and receive the UDP stream with a GStreamer client on a PC.
2. The model runs from the command line and produces the correct bounding boxes and labels.
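For reference, since the sender uses rtpstreampay (which adds a length-prefix framing intended for stream transports), the receiving pipeline on the PC needs a matching rtpstreamdepay. A sketch of such a client, assuming port 5000 and the H.264 payload used above:

```shell
# Hypothetical PC-side receiver matching the rtpstreampay framing.
# The explicit RTP caps after rtpstreamdepay are needed so rtph264depay
# can negotiate; 5000 is an assumed port.
gst-launch-1.0 -v udpsrc port=5000 caps="application/x-rtp-stream" ! \
  rtpstreamdepay ! \
  application/x-rtp,media=video,clock-rate=90000,encoding-name=H264 ! \
  rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! \
  autovideosink sync=false
```

If the client instead expects plain RTP (udpsrc ! rtph264depay with no rtpstreamdepay), it will silently discard the framed packets, which would match the "no frames on the client" symptom.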
So I think the issue is the GStreamer pipeline provided above. Do you see anything wrong with it?