NXP IMX8MM Android 14 BSP doesn't support RAW image capture

santhana_kumar
Contributor II

Hi,

We are working with the IMX298 camera, which supports RAW8/RAW10 output formats, and have integrated it into the Android BSP on our platform.

At the kernel level, we successfully capture raw images using the following command, both with and without services like Zygote, SurfaceFlinger, and BootAnimation running:

   v4l2-ctl --device=/dev/video0 --set-fmt-video=width=1920,height=1080,pixelformat=RG10 --stream-mmap --stream-count=1 --stream-to=output.raw

Issue:

  • With Android services stopped, the command executes successfully and images are captured without any crashes.
  • However, when the command is run while Android services are active, the OS crashes whenever a frame is received; specifically, services like Zygote, SurfaceFlinger, and BootAnimation terminate unexpectedly.

Analysis:
Upon further investigation, we suspect that RAW data corruption is occurring due to issues in the RenderThread handling. The logs indicate the following errors:

 

OpenGLRenderer: Device claims wide gamut support, cannot find matching config, error = EGL_SUCCESS
OpenGLRenderer: Failed to initialize 101010-2 format, error = EGL_SUCCESS

 

We followed the patch provided in the NXP knowledge base for supporting RAW format on Android:
IMX8MP Support for RAW Format on Android

Request for Support:
We need guidance on displaying RAW image previews in Android OS (Android layer). If the IMX298 camera is insufficient for this setup, we also have access to the OV5640 camera and the NXP IMX8MM-EVK board. Kindly share the necessary patches and assist us in resolving this issue.

Attached are the logs for further debugging. We appreciate your support and look forward to your guidance.

Regards,
SanthanaKumarS

joanxie
NXP TechSupport

Based on your description, you can capture raw data successfully under Linux, right? How do you capture raw data on Android, with your own app? Is your issue related to the capture stage or the display stage?

santhana_kumar
Contributor II

Hi Joan,

Thank you for getting back to us. We are able to capture raw data successfully, but not within the Android layer.

We are capturing raw images in Android using the application provided by NXP, as described in the following link:
https://community.nxp.com/t5/i-MX-Processors-Knowledge-Base/i-MX8MP-support-raw-format-on-Android/ta...

However, if the camera produces RAW10 frames, services like SurfaceFlinger and Zygote crash, even when using the v4l2-ctl command. Whenever these services are running, executing v4l2-ctl causes a crash; when they are not running, v4l2-ctl works fine.

The issue occurs in the RenderThread when the camera produces frames.

joanxie
NXP TechSupport

The link you refer to is for the i.MX8MP, which uses the ISI. Based on your description, the issue is related to the GPU; referring to section 3.5 (Camera HAL modification), the GPU doesn't support raw data, so you may need to trace this. I don't have a sample Android patch for the i.MX8MM.

santhana_kumar
Contributor II

Hi,

We have identified that RAW data is not being displayed correctly on the NXP Android UI. Based on the patches provided in the link shared earlier, modifications were made at the HAL layer, enabling RAW data streaming on the display.

Services like SurfaceFlinger and Zygote crashed because the camera delivers a low frame rate (i.e., 1 fps) and a timeout occurs in the HAL layer. After increasing the timeout, there is no crash and the camera continuously provides frames at 1 fps.


Path: vendor/nxp-opensource/imx/camera/VideoStream.cpp

#define MAX_RECOVER_COUNT 8        /* increased from 1 */
#define SELECT_TIMEOUT_SECONDS 30  /* increased from 3 */
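
For context, this is roughly the pattern the timeout applies to: a minimal sketch of a select()-based wait before dequeuing a V4L2 buffer (illustrative only, not the actual VideoStream.cpp code; the function name and retry loop are our assumptions). With the sensor delivering about 1 fps, a 3-second window with a single recovery attempt can expire before a frame arrives, which is why increasing both values avoids the crash.

/* Illustrative only: a select()-based wait before VIDIOC_DQBUF,
 * mirroring the SELECT_TIMEOUT_SECONDS / MAX_RECOVER_COUNT idea.
 * Not the actual VideoStream.cpp implementation. */
#include <sys/select.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>
#include <string.h>
#include <stdio.h>

#define SELECT_TIMEOUT_SECONDS 30  /* was 3: too short for a ~1 fps sensor */
#define MAX_RECOVER_COUNT      8   /* was 1: allow more retries before giving up */

static int wait_and_dequeue(int fd, struct v4l2_buffer *buf)
{
    for (int attempt = 0; attempt < MAX_RECOVER_COUNT; attempt++) {
        fd_set fds;
        FD_ZERO(&fds);
        FD_SET(fd, &fds);

        struct timeval tv = { .tv_sec = SELECT_TIMEOUT_SECONDS, .tv_usec = 0 };
        int r = select(fd + 1, &fds, NULL, NULL, &tv);
        if (r > 0) {
            memset(buf, 0, sizeof(*buf));
            buf->type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
            buf->memory = V4L2_MEMORY_MMAP;
            return ioctl(fd, VIDIOC_DQBUF, buf);  /* 0 on success */
        }
        fprintf(stderr, "frame wait timed out, retry %d\n", attempt + 1);
    }
    return -1;  /* stream treated as dead after MAX_RECOVER_COUNT timeouts */
}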


When running either of the following commands, we only achieve about 1 FPS:

gst-launch-1.0 -v v4l2src device=/dev/video0 ! bayer2rgb ! videoconvert ! fpsdisplaysink video-sink=autovideosink text-overlay=true

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
[ 530.044980] Zumi mipi_csis_enum_frameintervals
[ 530.052416] Zumi Frame Interval: 1/120 sec
/GstPipeline:pipeline0/GstFPSDisp[ 530.057797] Zumi mipi_csis_enum_frameintervals
laySink:fpsdisplaysink0/GstAutoVideoSink:autovideosink0/GstWaylandSink:autovideosink0-actual-sink-wayland: sync = true
Settin[ 530.073698] set to pixelformat 'RAWRGB'
[ 530.080032] type 1
g pipeline to PLAYING ...
New cl[ 530.082043] width 480
ock: GstSystemClock
/GstPipeline[ 530.087200] height 320
:pipeline0/GstV4l2Src:v4l2src0.Gs[ 530.092420] Zumi In function: mx6s_configure_csi - pixelformat 0x30314752 width = 480
tPad:src: caps = video/x-bayer, width=(int)480, height=(int)320, framerate=(fraction)120/1, format=(string)rggb, pixel-aspect-ratio=(fraction)1/1, colore
/GstPipeline:pipeline0/GstBayer2RGB:ba[ 530.122705] Zumi __mipi_csis_set_format fmt: 0x300f, 480 x 320
yer2rgb0.GstPad:src: caps = video/x-raw, width=(int)480, height=(int)320, framerate=(fraction)120/1, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(s0
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = video/x-raw, width=(int)480, height=(int)320, framerate=(fraction)120/1, pixel-a0
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw, width=(int)480, height=(int)320, fr0
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstTextOverlay:fps-display-text-overlay.GstPad:src: caps = video/x-raw, width=(int)480, height=0
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink.GstProxyPad:proxypad1: caps = video/x-raw, wid0
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstAutoVideoSink:autovideosink0/GstWaylandSink:autovideosink0-actual-sink-wayland.GstPad:sink: 0
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink: caps = video/x-raw, width=(int)480, height=(i0
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstTextOverlay:fps-display-text-overlay.GstPad:video_sink: caps = video/x-raw, width=(int)480, 0
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink: caps = video/x-raw, width=(int)480, height=(int)320, framerate=(fraction)120/0
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = video/x-raw, width=(int)480, height=(int)320, framerate=(fraction)120/1, pixel-0
/GstPipeline:pipeline0/GstBayer2RGB:bayer2rgb0.GstPad:sink: caps = video/x-bayer, width=(int)480, height=(int)320, framerate=(fraction)120/1, format=(ste
[ 531.536801] Zumi Setting Exposure: 0x3e80 (High Byte: 0x3e, Low Byte: 0x80)
[ 531.545965] Zumi Exposure successfully set to 0x3e80
[ 531.554928] IMX298: stream on
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstAutoVideoSink:autovideosink0/GstWaylandSink:autovideosink0-actual-sink-wayland: sync = true
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstTextOverlay:fps-display-text-overlay: text = rendered: 2, dropped: 0, current: 1.76, average6
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 2, dropped: 0, current: 1.76, average: 1.76
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstTextOverlay:fps-display-text-overlay: text = rendered: 3, dropped: 0, current: 0.84, average9
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 3, dropped: 0, current: 0.84, average: 1.29

 

root@imx8mmevk:/# v4l2-ctl --device=/dev/video0 --set-fmt-video=width=480,height=320,pixelformat=RG10 --stream-mmap --stream-count=100
[ 48.061928] set to pixelformat 'RAWRGB'
[ 48.065809] type 1
[ 48.067852] width 480
[ 48.070123] height 320
[ 48.072511] Zumi In function: mx6s_configure_csi - pixelformat 0x30314752 width = 480
[ 48.096054] Zumi __mipi_csis_set_format fmt: 0x300f, 480 x 320
[ 49.596468] Zumi Setting Exposure: 0x3e80 (High Byte: 0x3e, Low Byte: 0x80)
[ 49.605997] Zumi Exposure successfully set to 0x3e80
[ 49.613550] IMX298: stream on
<<<<< 0.82 fps
< 0.83 fps
< 0.84 fps
< 0.84 fps
< 0.85 fps
< 0.85 fps

We suspect that either bayer2rgb conversion is causing a delay or another factor is limiting the performance.

Issue: Low FPS (1 FPS) from the IMX298 Camera Sensor

1. MIPI CSI Configuration

Our current device tree configuration is as follows:

link-frequencies = /bits/ 64 <480000000>;
#define IMX298_DEFAULT_CLK_FREQ (24000000)
#define IMX298_DEFAULT_LINK_FREQ (480000000)
#define IMX298_DEFAULT_PIXEL_RATE ((IMX298_DEFAULT_LINK_FREQ * 8LL) / 10)

Is this configuration sufficient to achieve 30 FPS, or can we increase the MIPI clock to improve the frame rate?
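
As a rough sanity check of the link budget (assuming the * 8 / 10 factor in IMX298_DEFAULT_PIXEL_RATE corresponds to four data lanes with DDR clocking and RAW10, and using illustrative blanking overheads rather than datasheet values), we did the following calculation:

/* Back-of-the-envelope link budget; blanking figures are illustrative,
 * the real limits come from the sensor's line/frame length registers. */
#include <stdio.h>

int main(void)
{
    const long long link_freq  = 480000000LL;               /* link-frequencies from the DT */
    const long long pixel_rate = (link_freq * 8LL) / 10LL;   /* = 384 Mpix/s for 4-lane RAW10 */

    const long long width = 1920, height = 1080;
    const long long line_len  = width * 120 / 100;   /* assume ~20% horizontal blanking */
    const long long frame_len = height * 105 / 100;  /* assume ~5% vertical blanking */

    double max_fps = (double)pixel_rate / (double)(line_len * frame_len);
    printf("pixel rate: %lld pix/s, theoretical max at 1920x1080: %.1f fps\n",
           pixel_rate, max_fps);  /* roughly 147 fps on these assumptions */
    return 0;
}

If this reasoning holds, the 480 MHz link itself should leave headroom well above 30 FPS even at 1920x1080; please correct us if the assumed lane count or blanking figures are wrong.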

2. Effect of Exposure and Gain on FPS

By varying the exposure value, we were able to achieve up to 9 FPS:

imx298->exposure = v4l2_ctrl_new_std(&imx298->ctrls, &imx298_ctrl_ops,
                                     V4L2_CID_EXPOSURE,
                                     16, 33333, 16, 16000);

imx298->gain = v4l2_ctrl_new_std(&imx298->ctrls, &imx298_ctrl_ops,
                                 V4L2_CID_GAIN,
                                 128, 8091, 128, 2048);

How can we determine the optimal exposure and gain values to achieve the best possible FPS for our sensor?
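
Our working model of the relationship is sketched below (the register names, the 10-line margin, and the timing values are assumptions for illustration, not IMX298 datasheet figures); please correct it if it is wrong:

/* Illustrative model: on a typical rolling-shutter sensor the frame is
 * stretched when the exposure (in lines) exceeds the frame length, so a
 * long exposure directly lowers the frame rate. All values are assumed. */
#include <stdio.h>

int main(void)
{
    const double pixel_rate         = 384000000.0;  /* from the link budget above */
    const double line_length_pck    = 2304.0;       /* assumed */
    double       frame_length_lines = 1134.0;       /* assumed */
    const double exposure_lines     = 16000.0;      /* default of V4L2_CID_EXPOSURE above */
    const double margin             = 10.0;         /* assumed blank-line margin */

    /* If exposure exceeds frame_length - margin, the frame is stretched. */
    if (exposure_lines > frame_length_lines - margin)
        frame_length_lines = exposure_lines + margin;

    double fps = pixel_rate / (line_length_pck * frame_length_lines);
    printf("effective frame length: %.0f lines -> %.2f fps\n",
           frame_length_lines, fps);
    return 0;
}

If this model is correct, keeping V4L2_CID_EXPOSURE below frame_length_lines minus the margin is what preserves the nominal frame rate, and the sensor vendor could confirm the exact margin and register values.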

Any guidance on improving the MIPI clock settings or reducing processing delays in the pipeline would be greatly appreciated.


Regards,
SanthanaKumarS

joanxie
NXP TechSupport

1) Is this configuration sufficient to achieve 30 FPS, or can we increase the MIPI clock to improve the frame rate?

> It depends on what clock you set in the kernel, but if you just need 480x320@30fps, I think the default BSP settings are enough for you. I suggest that you test on Linux first to check whether the kernel already supports this, and then debug your Android code.

2) For gain and exposure settings, I suggest that you contact the sensor vendor to get the proper parameters.

santhana_kumar
Contributor II

Joan,

Thanks for your reply. We debugged the issue in Linux and increased the frame length, MIPI clock, and exposure settings to get 23 fps working.

The issue is that when running the Android application, the camera provides a few frames, and we can see them on the display, but the application crashes sometimes.

Could you please answer the following questions?

  1. Does the following link work with the camera preview on the display or not?

    i.MX8MP support raw format on Android

  2. In the link, the following point is suggested:

    "Modifying the GPU code is not recommended."

    "When the preview stream is active, its pixel format is fixed to HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED. It needs YUYV format. In this patch, we don't convert raw12 to YUYV, we just copy the buffer from input to output, so the preview stream is actually raw12."

    The same patch has been applied in the i.MX8MM Android BSP, and we can preview the camera frames on the display for a while. However, after a few frames, the application crashes.

  3. From the HAL layer, when you "copy the buffer from input to output," the processing of the frame buffer will take some time, right? Would it help to increase the timeout for acquiring the frame to allow proper frame display? We have analyzed the issue and found that the dequeue operation is not being done properly.
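
To gauge whether the copy itself could explain the timeout, we did a quick, purely illustrative timing of a plain memcpy of one frame (the buffer size is assumed, not taken from the HAL code):

/* Illustrative only: time a plain memcpy of one RAW10 frame (packed into
 * 16-bit words) to see whether the copy can explain a HAL-side timeout. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

int main(void)
{
    const size_t frame_bytes = 1920 * 1080 * 2;  /* assumed frame size */
    unsigned char *src = malloc(frame_bytes);
    unsigned char *dst = malloc(frame_bytes);
    if (!src || !dst)
        return 1;
    memset(src, 0xAA, frame_bytes);

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    memcpy(dst, src, frame_bytes);
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double ms = (t1.tv_sec - t0.tv_sec) * 1e3 + (t1.tv_nsec - t0.tv_nsec) / 1e6;
    printf("copied %zu bytes in %.2f ms\n", frame_bytes, ms);

    free(src);
    free(dst);
    return 0;
}

On typical hardware a copy of this size takes on the order of a few milliseconds, which is negligible next to a one-second frame interval, so the dequeue timeout rather than the copy looks like the more likely culprit to us.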

Regards,
SanthanaKumarS

joanxie
NXP TechSupport

1) Referring to the link, the owner uses the 7yuv tool to check the RAW12 format.

2) I don't know what display you use. As far as I know, if you need to display raw data, you have to convert it to YUV with the GPU or your own SW.

3) Copying does cost time; if you copy with your own SW, the performance isn't good enough, as other customers have found in earlier tests.

santhana_kumar
Contributor II
Hi,
 
 
1) The owner uses the 7yuv tool to check the RAW12 format. However, this is only for the capture side, not for the preview.
 
    Explanation of the Reference Link:
    
  RAW12 patches have been added at both the kernel and HAL layers.
 
      Two different formats are used for previewing and capturing RAW data. (Modifying the GPU code is not recommended.)
 
      The preview stream is set to HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED.
 
          When using the preview stream, its pixel format is fixed to HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED. This format requires YUYV, but in this patch, RAW12 is not converted to YUYV. Instead, the buffer is simply copied from input to output. As a result, the preview stream is still RAW12.
 
          For the capture stream, the Blob format is used, which is typically associated with JPEG. When the application specifies the Blob format, the Camera HAL copies the buffer from input to output directly. You can find more details in the ProcessCapturedBuffer() function.
 
2) On your point that RAW data needs to be converted to YUV for proper display, either via GPU acceleration or custom software (SW):
 
     The i.MX8MM EVK uses an HDMI display. Could you provide the reference link you mentioned, or share any steps/ideas on how to display RAW or YUV data? (A minimal sketch of such a software conversion is included after this list for illustration.)
 
3) On your point that copying buffers takes time and that, if the copy is handled by custom software, performance may suffer (as previous customer tests have shown):
 
      We agree with this observation.
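
For illustration, this is the kind of custom software conversion we have in mind: a naive nearest-neighbour demosaic of an RGGB frame. It is purely a sketch (it assumes the RAW10 data has already been unpacked and scaled to 8 bits per pixel and that width and height are even), not production or NXP-provided code:

/* Minimal nearest-neighbour demosaic of an 8-bit RGGB Bayer frame into
 * interleaved RGB24. Purely illustrative; a GPU or ISP path would be
 * far faster than doing this per frame on the CPU. */
#include <stdint.h>
#include <stddef.h>

void bayer_rggb8_to_rgb24(const uint8_t *bayer, uint8_t *rgb,
                          size_t width, size_t height)
{
    for (size_t y = 0; y + 1 < height; y += 2) {
        for (size_t x = 0; x + 1 < width; x += 2) {
            /* One 2x2 RGGB cell: R G / G B */
            uint8_t r  = bayer[y * width + x];
            uint8_t g1 = bayer[y * width + x + 1];
            uint8_t g2 = bayer[(y + 1) * width + x];
            uint8_t b  = bayer[(y + 1) * width + x + 1];
            uint8_t g  = (uint8_t)((g1 + g2) / 2);

            /* Replicate the same colour into all four output pixels. */
            for (size_t dy = 0; dy < 2; dy++) {
                for (size_t dx = 0; dx < 2; dx++) {
                    uint8_t *p = &rgb[((y + dy) * width + (x + dx)) * 3];
                    p[0] = r;
                    p[1] = g;
                    p[2] = b;
                }
            }
        }
    }
}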
  
Regards,
SanthanaKumarS

joanxie
NXP TechSupport

1) As I mentioned before, the patch and the link are for capturing raw data on Android, not for display, because the i.MX8MP couldn't support raw data display directly.

2) The i.MX8MM EVK uses an HDMI display. Could you provide the reference link you mentioned, or share any steps/ideas on how to display RAW or YUV data?

> There is no such reference link or code to share; we don't have such a demo for customers. For the i.MX8MQ or i.MX8QM, we can use a soft ISP (OpenCL or OpenVX) to support this conversion, but the i.MX8MM doesn't support this. On Linux, you can use bayer2rgb to capture raw data and convert it to RGB. Normally, for raw data capture and display on the i.MX8MM, we suggest customers use an external ISP chip; otherwise the performance isn't good enough and you need to convert it with your own SW.

santhana_kumar
Contributor II

1) As I mentioned before, the patch and the link are for capturing raw data on Android, not for display, because the i.MX8MP couldn't support raw data display directly.
  
We have attached the log file; you can see more than 200 frames displayed on the EVK8MM board. With the patch we are able to stream the camera using the IMX8MM EVK board, and the camera streams images continuously, but sometimes the RenderThread crashes.

To reiterate, the RAW image is not displayed directly on the EVK board; it is changed from RAW10 to YUV format.

"When preview stream, it's pixel format is fixed to HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED, it needs YUYV format, in this patch, we don't convert raw12 to yuyv, just copy the buffer from input to output, so the preview stream is raw12 actually."


joanxie
NXP TechSupport

you can see more than 200 frames displayed on the EVK8MM board. With the patch we are able to stream the camera using the IMX8MM EVK board

> Could you see the correct video on the display with the raw data format?

so the preview stream is raw12 actually.

> This means that the demo only shows capture, not how to display. The demo only copies the data into a different buffer and does not convert the format, so the preview stream is RAW12; it does not mean that the i.MX8MM can support RAW12 preview directly.

santhana_kumar
Contributor II

We have attached the logs.

Could you see the correct video on the display with the raw data format?

Yes. The HAL layer provided the frame buffer data in YUYV format.

From the application, the format is set to HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED with the YUYV format. With the NXP-provided patch, RAW10 should be selected for accessing the RAW format in the kernel layer.

Could you please review my attached logs? You can see the format selection there.
