
Memory corruption using CSI

Question asked by Matthias Weisser on Apr 13, 2016
Latest reply on Apr 14, 2016 by Matthias Weisser

We have a custom board that uses a custom driver for the CSI modules. CSI1 is connected to an external video decoder (TW9990) in CCIR mode, and CSI2 is connected to the internal VADC. On both channels I see memory corruption (writes past the end of the allocated buffer) when the video signal is disturbed (cable loose, camera loses power). I see this behavior even when only one channel is active. Does anyone have an idea what could be wrong? I have tried many combinations of CSI settings and I handle every interrupt source of the CSI module (using the Linux driver as a reference), but I can't see that I am doing anything wrong.


I get a perfect image as long as the camera signals are stable. As soon as a signal is disturbed, I get these annoying errors.

Any help is greatly appreciated.