I've had a really tough time trying to push my camera up from 720p to 1080p... see my earlier thread (https://community.nxp.com/thread/508198). Long story short, converting UYVY to YUV420 frames resulted in a green bar on the right side of the screen. Using gstreamer as a guide, I discovered that the NV12 format resulted in perfect output from the IPU.
Now... on to the video encoding. I've changed all my code to use NV12, and I'm trying to do h264 encoding. I modified the encode-example in libimxvpuapi (https://github.com/Freescale/libimxvpuapi/blob/v1/example/encode-example.c) to encode 1080p video, and use the raw NV12 stream recorded previously.
I made sure to set the vpu_open_param.chroma_interleave=1 so that the VPU knows it is an NV12 stream.
I also modified the call to imx_vpu_calc_framebuffer_sizes so that chroma_interleave is 1.
1 frame from input:
1 frame from output:
You'll notice there's another green bar at the bottom of the screen this time, and something is messed up in the overlay text.
As with my color conversion problems, I tried using gstreamer to convert the raw video file and gstreamer worked perfectly.
$gst-launch-1.0 filesrc location="1080_overlay.nv12" ! rawvideoparse use-sink-caps=false width=1920 height=1080 format=nv12 ! vpuenc_h264 ! queue ! flvmux ! filesink location="gst_overlay_1080.mp4"
I figured out how to get this working!
I stole a few macros from libimxvpuapi2...
#define VPU_FRAME_LENGTH_ALIGNMENT (16)
#define IMX_VPU_API_ALIGN_VAL_TO(LENGTH, ALIGN_SIZE) ( ((uintptr_t)(((uint8_t*)(LENGTH)) + (ALIGN_SIZE) - 1) / (ALIGN_SIZE)) * (ALIGN_SIZE) )
then when I set up my ipu output task...
ipu_task.output.width = IMX_VPU_API_ALIGN_VAL_TO(out_format->fmt.pix.width, VPU_FRAME_LENGTH_ALIGNMENT);
ipu_task.output.height = IMX_VPU_API_ALIGN_VAL_TO(out_format->fmt.pix.height, VPU_FRAME_LENGTH_ALIGNMENT);
ipu_task.output.crop.pos.x = 0;
ipu_task.output.crop.pos.y = 0;
ipu_task.output.crop.w = out_format->fmt.pix.width;
ipu_task.output.crop.h = out_format->fmt.pix.height;
You'll lose a few pixels on the edge, but this aligns the IPU buffers in such a way that it's compatible with the VPU.
When you allocate your IPU output buffer, you need to make sure you compute the size based on the *actual* output width/height instead of the cropped width/height.
Okay, the issue is the alignment and offsets of the Y and UV channels. I switched to the libimxvpuapi2 library for this.
In the encode-example.c code there is a call to fb_metrics = &(ctx->stream_info.frame_encoding_framebuffer_metrics); which returns the y offset, u offset, and v offset that the encoder wants.
My YUV buffer is straight from the IPU, so I assume all of the bytes in the frame are sequential, but the fb_metrics struct is telling me that the VPU needs padding between them...
For example, with a 1080p YUV420 8bit frame...
y_offset = 0; y_size = 1920x1080 = 2073600. Since the U bytes start right after the Y bytes, u_offset = 2073600.
According to the fb_metrics struct, the encoder *wants* the u offset to be 2088960.
The encoder-example in libimxvpuapi2 uses the y, u, and v offsets in fb_metrics to populate the input dma buffer, but it's just encoding a file, not a live camera feed.
Now...these conversions all work fine at 720p, and when I run the encode-example at 720p, the u_offset/v_offset are exactly as I would expect them to be...
Turns out libimxvpuapi calculates the offsets and strides based on the *aligned* frame width/height, not the actual frame width/height. After some searching through the alignment rules, I found that 720p aligns perfectly, but 1080p gets padded to 1920x1088. This is the root of the problem.
Hi Chris Roed,
Can you please provide us below information:
1. Which NXP board/SoC are you using?
2. Which yocto/linux version are you using?
3. What is your gstreamer version?
4. Steps to reproduce the issue?
Regards,
Karan Gajjar
I'm using a custom board with an IMX6DL processor.
I'm on Yocto rocko with kernel 4.9.88
Gstreamer (which encodes correctly) is 1.12.2
libimxvpuapi from rocko is v0.10.3
To reproduce this issue...
Capture Raw Frames and convert to NV12
I captured raw camera frames at 1080p UYVY, then added an overlay and converted the color to NV12 format. I can't share the project I used that added the overlay, but capturing raw camera frames can be done with gstreamer to the same effect.
$gst-launch-1.0 imxv4l2src num-buffers=30 ! video/x-raw,width=1920,height=1080,format=UYVY ! imxvideoconvert_ipu ! video/x-raw,width=1920,height=1080,format=NV12 ! filesink location="gst_1080p.NV12"
Modify example in libimxvpuapi
$git clone git://github.com/Freescale/libimxvpuapi.git
$cd libimxvpuapi
$git checkout v0.10.3
copy in the attached encode-example.c (just changed resolution to 1920x1080, added chroma_interleave=1 to open_params and to imx_vpu_calc_framebuffer_sizes)
followed the README.md instructions to compile the code, then installed it on my i.MX device
Convert the video
$encode-example -i gst_1080p.NV12 -o convert_test.h264
I copied the convert_test.h264 back to my PC for viewing with ffplay
What DOES work
*encode-example works great with 720p, just not the 1080p videos.
*gstreamer can convert 1080p videos
$gst-launch-1.0 filesrc location="1080_overlay.nv12" ! rawvideoparse use-sink-caps=false width=1920 height=1080 format=nv12 ! vpuenc_h264 ! queue ! flvmux ! filesink location="gst_overlay_1080.mp4"
I took the liberty of sharing my 1080_overlay.nv12 and 720_overlay.nv12 files in case you don't want to capture your own raw frames.