We are using the CSI on our custom RT1052 board. Our code is based on AN12110SW, but since we don't have an LCD and our camera is monochrome, it has been adapted to our hardware.
I can confirm that I'm now able to write and update CSI registers on the RT1052 (so I'm past the issue here: RT1050 CSI Inactive). You can see the schematics in that ticket too; everything's connected.
I can also confirm that CSI_MCLK outputs a 24 MHz signal, but I'm still stuck waiting at:
while (kStatus_Success != CAMERA_RECEIVER_GetFullBuffer(&cameraReceiver, (uint32_t *) &(csiFrameBuf[i]) )) { }
All the other RT1052 interfaces are working fine (I2C, SPI, SDRAM at 16 bits at least, FlexSPI, JTAG, GPIO).
Questions:
1. Is anything missing from the code below to kick-start the CSI? Why wouldn't I be getting any buffers?
uint8_t isc_status;

/* CSI clock root: OSC 24 MHz (mux 0), divide by 1. */
CLOCK_SetMux(kCLOCK_CsiMux, 0);
CLOCK_SetDiv(kCLOCK_CsiDiv, 0);

memset(csiFrameBuf, 0, sizeof(csiFrameBuf));

const camera_config_t cameraConfig = {
    .pixelFormat                = kVIDEO_PixelFormatRGB565,
    .bytesPerPixel              = BYTE_PER_PIXEL,
    .resolution                 = FSL_VIDEO_RESOLUTION(CAMERA_WIDTH, CAMERA_HEIGHT),
    .frameBufferLinePitch_Bytes = CAMERA_WIDTH * BYTE_PER_PIXEL,
    .interface                  = kCAMERA_InterfaceGatedClock,
    .controlFlags               = CAMERA_CONTROL_FLAGS,
    .framePerSec                = 30,
};

CAMERA_RECEIVER_Init(&cameraReceiver, &cameraConfig, NULL, NULL);

imgSensorPowerOn();
do {
    isc_status = initImgSensor(true);
} while (isc_status != 0);

/* Queue all empty frame buffers before starting the receiver. */
for (uint32_t i = 0; i < CAMERA_FRAME_BUFFER_COUNT; i++) {
    CAMERA_RECEIVER_SubmitEmptyBuffer(&cameraReceiver, (uint32_t)(csiFrameBuf[i]));
}

startImgSensorStreaming();
CAMERA_RECEIVER_Start(&cameraReceiver);

/* Execution never gets past this wait. */
for (uint32_t i = 0; i < CAMERA_FRAME_BUFFER_COUNT; i++) {
    while (kStatus_Success != CAMERA_RECEIVER_GetFullBuffer(&cameraReceiver, (uint32_t *)&(csiFrameBuf[i]))) {
    }
}
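For reference, one check that might help narrow down where the wait hangs is whether the CSI ever latches a start-of-frame from the sensor. Below is a minimal sketch of that check, assuming the MIMXRT1052 device header's CSI->CSISR status register; the SOF_INT and RF_OR_INT bit positions are duplicated locally (from the RT1050 reference manual) so the helpers compile stand-alone; verify them against your header's CSI_CSISR_*_MASK definitions.

```c
#include <stdbool.h>
#include <stdint.h>

/* Assumed CSISR bit positions, duplicated from the reference manual
 * so this sketch is self-contained; verify against the device header. */
#define SOF_INT_MASK   (1UL << 16) /* start of frame detected */
#define RF_OR_INT_MASK (1UL << 24) /* RxFIFO overrun          */

/* True when a CSISR snapshot shows a start of frame was latched,
 * i.e. VSYNC/PIXCLK are actually reaching the controller. */
bool csiSawFrameStart(uint32_t csisr) {
    return (csisr & SOF_INT_MASK) != 0U;
}

/* True when the RxFIFO overran, which points at the DMA/buffer
 * side rather than the sensor-to-CSI wiring. */
bool csiRxFifoOverran(uint32_t csisr) {
    return (csisr & RF_OR_INT_MASK) != 0U;
}

/* On target one would poll the live register, e.g.:
 *   uint32_t status = CSI->CSISR;
 *   if (!csiSawFrameStart(status)) { ... no SOF: check PIXCLK/VSYNC ... }
 */
```

If SOF never sets, the problem is upstream of the CSI DMA (PIXCLK/VSYNC polarity or wiring); if SOF sets but buffers never fill, the DMA/buffer setup is the more likely suspect.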
2. Is it required to run the image sensor from CSI_MCLK, or should the CSI work just fine with the image sensor's PXCLK derived from a different clock?
3. In copying the app note, the pinmux currently sets up the CSI like this. Note that they configure the CSI for 8 data pins yet use 2 bytes per pixel; why isn't the camera adapter set up for 1 byte per pixel if it's using 8-bit parallel mode?
void BOARD_InitCSIPins(void) {
    CLOCK_EnableClock(kCLOCK_Iomuxc);

    /* Clock and sync signals. */
    IOMUXC_SetPinMux(IOMUXC_GPIO_B1_12_CSI_PIXCLK, 0U);
    IOMUXC_SetPinMux(IOMUXC_GPIO_B1_15_CSI_MCLK, 0U);
    IOMUXC_SetPinMux(IOMUXC_GPIO_B1_13_CSI_VSYNC, 0U);
    IOMUXC_SetPinMux(IOMUXC_GPIO_B1_14_CSI_HSYNC, 0U);

    /* 8-bit parallel data on CSI_DATA02..CSI_DATA09. */
    IOMUXC_SetPinMux(IOMUXC_GPIO_AD_B1_08_CSI_DATA09, 0U);
    IOMUXC_SetPinMux(IOMUXC_GPIO_AD_B1_09_CSI_DATA08, 0U);
    IOMUXC_SetPinMux(IOMUXC_GPIO_AD_B1_10_CSI_DATA07, 0U);
    IOMUXC_SetPinMux(IOMUXC_GPIO_AD_B1_11_CSI_DATA06, 0U);
    IOMUXC_SetPinMux(IOMUXC_GPIO_AD_B1_12_CSI_DATA05, 0U);
    IOMUXC_SetPinMux(IOMUXC_GPIO_AD_B1_13_CSI_DATA04, 0U);
    IOMUXC_SetPinMux(IOMUXC_GPIO_AD_B1_14_CSI_DATA03, 0U);
    IOMUXC_SetPinMux(IOMUXC_GPIO_AD_B1_15_CSI_DATA02, 0U);
}
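For completeness: the SDK's generated pin files usually pair each IOMUXC_SetPinMux call with an IOMUXC_SetPinConfig for the same pad, so if a port keeps only the mux calls, the CSI pads run with their reset-default electricals. A hedged sketch of what that pairing might look like; the 0x10B0U pad value is the one commonly seen in EVK examples and is an assumption here, as is the function name, so check it against this board's requirements:

```c
/* Hypothetical companion to BOARD_InitCSIPins(): configure the pad
 * electricals as well as the mux. 0x10B0U is the pad value commonly
 * used in EVK-generated pin files; verify it suits this board. */
void BOARD_InitCSIPadConfigs(void) {
    IOMUXC_SetPinConfig(IOMUXC_GPIO_B1_12_CSI_PIXCLK, 0x10B0U);
    IOMUXC_SetPinConfig(IOMUXC_GPIO_B1_13_CSI_VSYNC, 0x10B0U);
    IOMUXC_SetPinConfig(IOMUXC_GPIO_B1_14_CSI_HSYNC, 0x10B0U);
    IOMUXC_SetPinConfig(IOMUXC_GPIO_AD_B1_08_CSI_DATA09, 0x10B0U);
    /* ...and likewise for CSI_DATA02..CSI_DATA08 and CSI_MCLK. */
}
```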