I have a custom MIPI camera that I've written a driver for. I've integrated this camera into Linux with the Yocto build and can successfully stream and display camera images.
The driver is the same for Android 8. (Similar kernel version and camera module; the DTS is the same.) In addition, I've followed the guide to customize init.rc, setting the back_camera_name property to my custom camera, and I've created a libcamera3 Camera implementation for my camera in the vendor library.
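For reference, the init.rc addition is just a couple of property lines. This is a sketch: "mycam" is a placeholder for the name our V4L2 driver registers (the HAL matches it when probing the /dev/video* nodes), and the orient property follows the i.MX guide:

    # init.rc snippet -- "mycam" is a placeholder for our driver's V4L2 name
    setprop back_camera_name mycam
    setprop back_camera_orient 0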
In Android, when the camera app is launched, I see all the correct dmesg output suggesting our driver and library are successfully called, but the actual data fails to stream:
ERROR: v4l2 capture: mxc_v4l_dqueue timeout enc_counter 0
I've checked the MIPI PHY status and ERR1/ERR2 registers, and they suggest MIPI is streaming as expected. (Again, on Linux this same hardware and driver stream images fine.)
Can anyone suggest how I might debug further why it fails to stream on Android?
Our data format is YUYV (YUV422). On Linux, GStreamer can take this format directly to the IPU, and from there we can convert to a format available for display. I don't know whether the fact that we can't output NV12 is a problem on Android, or whether that would cause the timeout we're seeing.
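For reference, the working Linux pipeline is along these lines (a sketch only: the element names are from the gstreamer-imx plugin set and vary by BSP/GStreamer version; /dev/video0 is assumed):

    # Capture YUYV (YUY2) and let the IPU convert/scale it for display
    gst-launch-1.0 imxv4l2videosrc device=/dev/video0 ! \
        video/x-raw,format=YUY2,width=1920,height=1080 ! \
        imxipuvideotransform ! imxipuvideosink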
Here is a snippet of the Android logcat output for the FslCameraHAL layer:
01-01 00:00:15.481 246 246 I FslCameraHAL: Camera fourcc to format code is: 0x14 ->(HAL_PIXEL_FORMAT_YCBCR_422_I)
01-01 00:00:15.481 246 246 I FslCameraHAL: enum frame size w:1920, h:1080
01-01 00:00:15.481 246 246 I FslCameraHAL: SupportedPictureSizes: 1920 x 1080
01-01 00:00:15.481 246 246 I FslCameraHAL: SupportedPreviewSizes: 1920 x 1080
01-01 00:00:15.481 246 246 I FslCameraHAL: FrameDuration is 33331760, 30000000000
01-01 00:00:15.481 246 246 I FslCameraHAL: mMaxWidth:1920, mMaxHeight:1080
01-01 00:00:15.481 246 246 I FslCameraHAL: Thine, mFocalLength:3.370000, mPhysicalWidth:2.772000, mPhysicalHeight 1.512000
01-01 00:05:03.771 246 352 I FslCameraHAL: openDev:0: Opening camera device
01-01 00:05:03.771 246 352 I FslCameraHAL: openDev
01-01 00:05:04.409 246 246 I FslCameraHAL: configureStreams:0: stream_config 0xbeff0430, num 2, streams 0xb3f191d8, mode 0
01-01 00:05:04.409 246 246 I FslCameraHAL: config 0, type 0, res 1920x1080, fmt 0x21, usage 0x3, maxbufs 0, priv 0x0, rotation 0
01-01 00:05:04.409 246 246 I FslCameraHAL: config 1, type 0, res 1920x1080, fmt 0x22, usage 0x100, maxbufs 0, priv 0x0, rotation 0
01-01 00:05:04.409 246 246 I FslCameraHAL: Stream create capture stream
01-01 00:05:04.409 246 246 I FslCameraHAL: stream: w:1920, h:1080, format:0x21, usage:0x20303, buffers:2
01-01 00:05:04.409 246 246 I FslCameraHAL: Stream create preview stream
01-01 00:05:04.409 246 246 I FslCameraHAL: stream: w:1920, h:1080, format:0x0, usage:0x20302, buffers:3
01-01 00:05:04.578 246 352 I FslCameraHAL: constructDefaultRequestSettings:0: type=1
01-01 00:05:04.672 246 246 E FslCameraHAL: configure: invalid stream parameters
01-01 00:05:04.673 246 309 E FslCameraHAL: invalid state:0x201 go into start state
01-01 00:05:14.716 246 309 E FslCameraHAL: onFrameAcquireLocked: VIDIOC_DQBUF Failed
Hello,
Can you verify if you have access to the camera at kernel level?
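For example, something along these lines can confirm that frames are dequeued without Android in the path (v4l2-ctl is from v4l-utils; /dev/video0 is an assumption for your capture node):

    # Inspect the device, then capture 10 raw frames via mmap
    v4l2-ctl -d /dev/video0 --all
    v4l2-ctl -d /dev/video0 \
        --set-fmt-video=width=1920,height=1080,pixelformat=YUYV \
        --stream-mmap --stream-count=10 --stream-to=/tmp/frames.raw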
Otherwise, it may be that the application is not recognizing your new hardware. Please see the guide below; it may help you:
https://developer.android.com/guide/topics/media/camera
Best regards,
Diego.
Actually, yes, I did verify that. I was able to get streaming to work by changing the media formats in the libcamera addition for my camera. I originally put YUYV as the format because that is what our camera can produce. I have since learned that the format list here (in the libcamera HAL) isn't necessarily just what the camera can produce; it also includes what the IPU can produce. Since the IPU can convert YUYV to NV12 et al., those are the formats that actually need to be listed, because Android requires NV12.
Anyway, at least that's my understanding of the problem. There may be parts I don't understand correctly or haven't completely implemented, but it's working for me at this time.
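For reference, the change was along these lines. This is a sketch only: the table and its name are placeholders patterned on the other sensor classes in the i.MX libcamera3 HAL, and your BSP's members may differ. The point is that NV12/NV21 (which the IPU can produce from YUYV) are listed alongside the sensor's native YUYV:

    #include <linux/videodev2.h>
    #include <stdint.h>

    // Formats our Camera subclass advertises to the HAL (placeholder name).
    // List what the IPU can deliver to Android, not just the sensor output;
    // NV12 must be present for the preview/record paths.
    static const uint32_t kAvailableFormats[] = {
        V4L2_PIX_FMT_NV12, // required by Android
        V4L2_PIX_FMT_NV21,
        V4L2_PIX_FMT_YUYV, // the sensor's native output
    };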