I'm trying to debug a strange error. I have a program that uses ADC1, and I am working in C++. The program works perfectly with optimization off. I have a fairly complicated class that wraps the ADC for configuration and access. Again, no problems when optimization is set to -O0. When I enable any level of optimization, however, all of the ADC channels immediately begin reading 0. Here's what I know.
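First, for context, the wrapper's access pattern is roughly this. It is a heavily simplified sketch; the identifiers and register layout here are illustrative stand-ins, not my actual code, and the real layout comes from the vendor CMSIS header:

#include <cstdint>

// Illustrative subset of the ADC register block.  Offsets follow the
// user manual's register map, but this is a simplified stand-in for
// the vendor header's definition.
struct AdcRegs {
    std::uint32_t CTRL;        // 0x00: clock divider and mode bits
    std::uint32_t INSEL;       // 0x04: input select
    std::uint32_t SEQA_CTRL;   // 0x08: channels, trigger, BURST, SEQA_ENA
    std::uint32_t SEQB_CTRL;   // 0x0C: (unused here)
    std::uint32_t SEQA_GDAT;   // 0x10: sequence A global data register
};

// Rough shape of my wrapper class; the real one is considerably more
// involved, but the access pattern is the same.
class Adc {
public:
    explicit Adc(std::uintptr_t base)
        : regs_{reinterpret_cast<AdcRegs*>(base)} {}

    void startBurst(std::uint32_t channelMask) {
        regs_->SEQA_CTRL = (channelMask & 0xFFFu)  // CHANNELS, bits [11:0]
                         | (1u << 27)              // BURST
                         | (1u << 31);             // SEQA_ENA
    }

    std::uint32_t readGlobalData() const { return regs_->SEQA_GDAT; }

private:
    AdcRegs* regs_;  // note: the struct members are not volatile-qualified
};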
I have checked all of the hardware registers I can think of through the SWD debugger, and the configuration appears to be identical with or without optimization (as expected). I have checked SEQA_CTRL in particular; notably, the BURST bit reads as active.
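For reference, this is roughly how my configuration code builds that value. The bit positions are from my reading of the user manual, and the constants are my own rather than the vendor header's:

#include <cstdint>

// SEQA_CTRL bit positions as I read them from the user manual:
constexpr std::uint32_t kSeqTrigPol = 1u << 18;  // 1 = positive edge
constexpr std::uint32_t kSeqBurst   = 1u << 27;  // free-running conversions
constexpr std::uint32_t kSeqEnable  = 1u << 31;  // SEQA_ENA

constexpr std::uint32_t seqaCtrlValue(std::uint32_t channelMask) {
    // TRIGPOL (kSeqTrigPol) is left clear, i.e. negative/falling edge,
    // which matches what the debugger shows; with BURST set the
    // hardware should free-run regardless of the trigger.
    return (channelMask & 0xFFFu) | kSeqBurst | kSeqEnable;
}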
The ADC1->CTRL register likewise shows the clock divisor correctly set. Additionally, the clock is enabled in SYSCON->SYSAHBCLKCTRL0 and the SWM->PINENABLE0 register has the correct input pins enabled.
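That setup is done along these lines. This is only a sketch: the two register references and the bit numbers come from the vendor header in the real code, so they are passed in here as placeholders, and the active-low PINENABLE0 polarity is my reading of the manual:

#include <cstdint>

// Sketch of the clock and pin enable step.  In the real code the two
// registers are SYSCON->SYSAHBCLKCTRL0 and SWM->PINENABLE0; the bit
// positions depend on the part and channel, so they are parameters.
void enableAdc1Support(volatile std::uint32_t& sysAhbClkCtrl0,
                       volatile std::uint32_t& pinEnable0,
                       unsigned adcClockBit,
                       unsigned adcInputBit) {
    sysAhbClkCtrl0 |= (1u << adcClockBit);  // gate the AHB clock to ADC1
    pinEnable0 &= ~(1u << adcInputBit);     // fixed-function analog inputs
                                            // are enabled by clearing bits
}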
Both the SEQA_GDAT and the per-channel DAT registers show no valid data (all bits are clear). I am at a loss.
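I'm checking for data like this (simplified; DATAVALID should be bit 31 of the data registers and the 12-bit result in bits [15:4], per the user manual):

#include <cstdint>
#include <optional>

// Read one data register; returns the 12-bit result only if the
// DATAVALID flag (bit 31) is set.
std::optional<std::uint16_t> readResult(volatile std::uint32_t& datReg) {
    const std::uint32_t raw = datReg;
    if ((raw & (1u << 31)) == 0) {
        return std::nullopt;                // no new conversion available
    }
    return static_cast<std::uint16_t>((raw >> 4) & 0xFFFu);  // RESULT field
}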
I rewrote just the ADC code without the C++ class, and that version gives the results I would expect. The configuration is, again, the same so far as I can tell (TRIGPOL is set to NEGATIVE_EDGE), but that shouldn't matter with BURST active, should it? Any further ideas what I could look into to try to solve this problem?
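For comparison, the working rewrite is shaped roughly like this. It is simplified, and the two externs stand in for the volatile register accesses that the vendor header provides:

#include <cstdint>

// Shape of the no-class rewrite that reads correctly under optimization.
// Bit positions are the same as above.
extern volatile std::uint32_t ADC1_SEQA_CTRL;
extern volatile std::uint32_t ADC1_SEQA_GDAT;

std::uint16_t sampleOnce(std::uint32_t channelMask) {
    ADC1_SEQA_CTRL = (channelMask & 0xFFFu) | (1u << 27) | (1u << 31);
    std::uint32_t raw;
    do {
        raw = ADC1_SEQA_GDAT;               // a fresh volatile read each pass
    } while ((raw & (1u << 31)) == 0);      // spin until DATAVALID
    return static_cast<std::uint16_t>((raw >> 4) & 0xFFFu);
}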
Martin Jay McKee