So far I have not found an RT chip driver for the AR1335; I also searched internally and did not find one.
You could search the internet; some third parties may provide AR1335 drivers.
On the NXP official side there is nothing so far; the RT family mainly uses the MT9M114 or OV7725 camera module.
Sorry for the inconvenience, and thanks a lot for your understanding.
I am still trying to port the AR1335 camera driver to the i.MX RT1170 MIPI CSI-2 demo code.
In the MIPI CSI-2 code I found these clocks:
Can anyone explain what these clocks are for?
I am using the AR1335 camera sensor instead of the OV5640 on the i.MX RT1170 dev board. The AR1335 sends out only 10-bit image data, while I believe the OV5640 pixel data bus is 16 bits.
How can I modify the MIPI CSI demo code that comes with the NXP SDK so that I can get an image from this sensor?
I wish there were some documentation that guides you through the process of bringing another camera sensor into this demo code.
Thank you so much in advance!
I am really sorry for the late reply.
I just got the internal AE reply, and unfortunately it is bad news: MIPI CSI-2 cannot support raw data due to a VIDEO_MUX bug, and this information will be added to the RT1170 reference manual.
So, could you please check whether your external AR1335 can be configured to output another format, e.g. RGB?
For 10-bit raw data there is really no workaround at the moment.
I am sorry for the late response, and thanks so much for your effort.
Just for clarification: does this mean using an 8-bit monochrome grayscale image sensor is impossible with the RT1176 (the OV9281 to be precise, whose data format is RAW8)? Is this a fundamental limitation of the hardware, or just an issue in the SDK that will be fixed in the future?
I already have this sensor running on an NXP i.MX 8M Mini SoC, which seems to use a similar MIPI CSI peripheral, but I need to port it to the RT1176.
According to the internal AE description, it is the chip's MIPI CSI that cannot support it; it is not an SDK issue.
The related description will be added to the reference manual in the future.
If you want to use raw data, you need to use the parallel CSI interface instead of MIPI CSI.
Sorry for the inconvenience.
I have already double-checked with our internal expert and replied to you in your new post.
For any new issues, please follow up in that new post.
So far, we still do not have a document that directly describes bringing up another camera sensor.
As far as I know, camera sensors usually come with configuration tools that can do the configuration and generate code. Does your AR1335 already have related drivers from its own manufacturer?
I also checked internally and have not found any AR1335 RT-related drivers so far.
Below is a summary of the image parameters that I am interested in:
; Target Vt Pixel Frequency: 220 MHz
; Input Clock Frequency: 24 MHz
; Actual Vt Pixel Clock: 220 MHz
; Actual Op Pixel Clock: 110 MHz
; pll_multiplier (M value) = 55
; pre_pll_clk_div (N value) = 2
; pll_multiplier2 (M2 value) = 55
; pre_pll_clk_div2 (N2 value) = 2
; Fpfd = 12 MHz
; Fvco = 660 MHz
; Fvco2 = 660 MHz
; Vt Sys Divider = 1
; Vt Pix Divider = 3
; Op Sys Divider = 1
; Op Pix Divider = 6
; [IMAGE PARAMETERS]
; Requested Frames Per Second: 30
; Output Columns: 640
; Output Rows: 480
; Use Y Summing: Unchecked
; X-only Binning: Unchecked
; Allow Skipping: Checked
; Blanking Computation: HB Max then VB
; Max Frame Time: 33.3333 msec
; Max Frame Clocks: 7333333.3 clocks
; Readout Mode: 1, YSum: No, XBin: No
; Horiz clks: 640 active + 1688 blank = 2328 total
; Vert rows: 480 active + 2674 blank = 3154 total
; Output Cols: 640
; Output Rows: 480
; FOV Cols: 640
; FOV Rows: 480
; Actual Frame Clocks: 7342512 clocks
; Row Time: 10.582 usec / 2328 clocks
; Integration Time: 33 msec.
; Frame time: 33.375055 msec
; Maximum Frame Rate allowed: 191.739fps
; Frames per Sec: 29.962 fps
I am using the MIPI CSI-2 demo code on the i.MX RT1170 to get a 640x480 image from the AR1335.
I found that the output format of this camera sensor is 10-bit. What changes do I need to make in the CSI driver so that I can capture the image I need?
I used a manufacturer tool to generate the configuration file, yet I am far from getting an image due to my incorrectly configured CSI driver.
Could you please advise on what needs to be modified in the demo code's CSI driver to capture an image from the AR1335?
I verified with my scope that the camera sensor is streaming data, and I was able to capture the CSI registers while debugging my code (please see the attached screenshot). However, I am not able to get a full-frame-buffer CSI interrupt. Again, this has to do with the way I initialized the MIPI CSI. Could you or someone else help me set this up properly, based on the image requirements I stated in my previous message?
Thanks for your information.
Please be patient; I will check your code in more detail and consult our internal CSI expert.
As soon as I get any valuable information, I will let you know.
Please give me more time, thanks so much!
I hope you are doing well! I just want to follow up with you and see if you have any updates on my issue.
Again, I need help configuring the CSI driver so that I can capture the 10 bits of raw data for each output pixel. After capturing the data I can use my own software to convert it to RGB888 format.
One more thing I want to let you know: the output format of the camera sensor is RAW10 (10 bits), and in the current NXP CSI/camera driver I don't see this format supported. I see only these:
/*! @brief Pixel format definition. */
typedef enum _video_pixel_format
{
    /* RGB */
    kVIDEO_PixelFormatXRGB8888 = FSL_VIDEO_FOURCC('X', 'R', '2', '4'), /*!< 32-bit XRGB8888. */
    kVIDEO_PixelFormatRGBX8888 = FSL_VIDEO_FOURCC('R', 'X', '2', '4'), /*!< 32-bit RGBX8888. */
    kVIDEO_PixelFormatXBGR8888 = FSL_VIDEO_FOURCC('X', 'B', '2', '4'), /*!< 32-bit XBGR8888. */
    kVIDEO_PixelFormatBGRX8888 = FSL_VIDEO_FOURCC('B', 'X', '2', '4'), /*!< 32-bit BGRX8888. */
    kVIDEO_PixelFormatRGB888 = FSL_VIDEO_FOURCC('R', 'G', '2', '4'), /*!< 24-bit RGB888. */
    kVIDEO_PixelFormatBGR888 = FSL_VIDEO_FOURCC('B', 'G', '2', '4'), /*!< 24-bit BGR888. */
    kVIDEO_PixelFormatRGB565 = FSL_VIDEO_FOURCC('R', 'G', '1', '6'), /*!< 16-bit RGB565. */
    kVIDEO_PixelFormatBGR565 = FSL_VIDEO_FOURCC('B', 'G', '1', '6'), /*!< 16-bit BGR565. */
    kVIDEO_PixelFormatXRGB1555 = FSL_VIDEO_FOURCC('X', 'R', '1', '5'), /*!< 16-bit XRGB1555. */
    kVIDEO_PixelFormatRGBX5551 = FSL_VIDEO_FOURCC('R', 'X', '1', '5'), /*!< 16-bit RGBX5551. */
    kVIDEO_PixelFormatXBGR1555 = FSL_VIDEO_FOURCC('X', 'B', '1', '5'), /*!< 16-bit XBGR1555. */
    kVIDEO_PixelFormatBGRX5551 = FSL_VIDEO_FOURCC('B', 'X', '1', '5'), /*!< 16-bit BGRX5551. */
    kVIDEO_PixelFormatXRGB4444 = FSL_VIDEO_FOURCC('X', 'R', '1', '2'), /*!< 16-bit XRGB4444. */
    kVIDEO_PixelFormatRGBX4444 = FSL_VIDEO_FOURCC('R', 'X', '1', '2'), /*!< 16-bit RGBX4444. */
    kVIDEO_PixelFormatXBGR4444 = FSL_VIDEO_FOURCC('X', 'B', '1', '2'), /*!< 16-bit XBGR4444. */
    kVIDEO_PixelFormatBGRX4444 = FSL_VIDEO_FOURCC('B', 'X', '1', '2'), /*!< 16-bit BGRX4444. */
    /* YUV. */
    kVIDEO_PixelFormatYUYV = FSL_VIDEO_FOURCC('Y', 'U', 'Y', 'V'), /*!< YUV422, Y-U-Y-V. */
    kVIDEO_PixelFormatYVYU = FSL_VIDEO_FOURCC('Y', 'V', 'Y', 'U'), /*!< YUV422, Y-V-Y-U. */
    kVIDEO_PixelFormatUYVY = FSL_VIDEO_FOURCC('U', 'Y', 'V', 'Y'), /*!< YUV422, U-Y-V-Y. */
    kVIDEO_PixelFormatVYUY = FSL_VIDEO_FOURCC('V', 'Y', 'U', 'Y'), /*!< YUV422, V-Y-U-Y. */
    kVIDEO_PixelFormatXYUV = FSL_VIDEO_FOURCC('X', 'Y', 'U', 'V'), /*!< YUV444, X-Y-U-V. */
    kVIDEO_PixelFormatXYVU = FSL_VIDEO_FOURCC('X', 'Y', 'V', 'U'), /*!< YUV444, X-Y-V-U. */
} video_pixel_format_t;
Can RAW10 be converted to RGB888 in the application code?
I attached some data sheets for your reference.