We have a custom hardware board based on the i.MX8QXP SoC. Connected to it is a Sony imx215 camera, which outputs images in RAW10 format.
We have a few questions, as below:
(1)
When these pixel values are received from the ISI as 16-bit words, the 10-bit value from the RAW10 format is placed in each word as 00XX XXXX XXXX 0000, where each X is one bit of the color value. Hence, in the application, the received 16-bit value is shifted right by 4. Also, since the application treats the resulting value as a 16-bit color sample, it is scaled from 10-bit to 16-bit range (i.e., multiplied by 65535.0/1023).
As a sample, below is the output captured from the ISI when the test pattern is enabled in the imx215.
Refer to "Without code changes in application RAW.png" and "With code changes in application for RAW.png".
In the "Without code changes in application RAW.png" case above, bits are lost in the color values captured from the ISI. The "With code changes in application for RAW.png" case shows correct results after the application code was changed so that no bits are lost.
We would like to know if there is a way to avoid shifting the bits in software, or to do it in hardware or on the GPU (to avoid the overhead this causes in the application).
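For reference, the per-pixel conversion described above can be sketched as follows. This is a minimal illustration of the shift-and-rescale step; the function name `raw10_to_16bit` and the sample value are our own for illustration, not part of any NXP API:

```python
def raw10_to_16bit(word: int) -> int:
    """Convert one 16-bit ISI output word (00XX XXXX XXXX 0000)
    to a full-range 16-bit color sample."""
    raw10 = (word >> 4) & 0x3FF           # drop the 4 padding LSBs, keep 10 bits
    return round(raw10 * 65535.0 / 1023)  # rescale 0..1023 -> 0..65535

# Example: a mid-scale 10-bit value 0x200 as packed by the ISI
packed = 0x200 << 4                       # 0x2000
print(hex(raw10_to_16bit(packed)))        # prints 0x8020
```

Doing this per pixel on the CPU for every frame is the overhead we would like to eliminate.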
(2)
The output image from the camera is very low in brightness and overall appearance, so it requires gamma correction and/or white balance to get a good picture.
A sample is below.
Refer to "Without code changes in application.png" and "With code changes in application, brightness and color values enhanced in windows photo application.png".
Is there any hardware support for performing such operations?
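To illustrate the kind of correction we currently have to apply offline, here is a simple per-sample gamma adjustment. This is only a sketch of the operation in question; the gamma value of 2.2 and the function name `apply_gamma` are our own assumptions, not a description of any hardware block:

```python
def apply_gamma(sample: int, gamma: float = 2.2, max_val: int = 65535) -> int:
    """Brighten a linear 16-bit sample with a standard power-law gamma curve."""
    normalized = sample / max_val               # map to 0.0..1.0
    return round((normalized ** (1.0 / gamma)) * max_val)

# A dark linear value (~10% of full scale) is lifted substantially,
# which matches the brightening we did manually in the photo application.
print(apply_gamma(6553))
```

If the ISI, GPU, or another block in the pipeline can apply this kind of curve (and white-balance gains) in hardware, that would remove the need for this per-pixel work in the application.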
Can someone please respond to these questions?