Hi all,
I'm finding that if I compile & optimise my firmware with anything other than -O0 (using MCUXpresso), the ADC readings are massively different... out by roughly a factor of 10.
The ADC is polled, so it shouldn't be a case of interrupt-shared memory being read/written out of order.
I've confirmed on a DSO, by toggling a spare GPIO, that neither the timing between ADC conversions nor the conversion time itself changes with optimisation.
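For reference, I measured it roughly like this (the port/pin is a placeholder for whatever spare pin I used, and the exact GPIO call names depend on the SDK version):

    //marker pulse around the conversion for the DSO -- GPIOC pin 5 is a placeholder
    GPIO_PortSet(GPIOC, 1U << 5U);
    (void)ADC_DoConversion(MY_CHANNEL); //MY_CHANNEL is a placeholder
    GPIO_PortClear(GPIOC, 1U << 5U);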
I've tried "#pragma GCC optimize" in the source file handling the ADC conversion, but it makes no difference.
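In case it matters, this is roughly what I tried (level and exact placement from memory):

    //at the top of the ADC source file
    #pragma GCC optimize ("O0")

    //and the per-function GCC variant
    __attribute__((optimize("O0"))) uint32_t ADC_DoConversion(uint8_t Channel);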
I'm running out of ideas.
The relevant source is below... it's pretty simple.
#include "fsl_adc16.h" //SDK ADC16 driver

#define ADC_CHANNEL_GROUP 0U //conversion group A (SC1[0]/R[0]) -- defined elsewhere in my project

void ADC_Init(void)
{
    adc16_config_t ADCInit;

    //init with SDK defaults
    ADC16_GetDefaultConfig(&ADCInit);

    //ADC clock needs to run between 1 and 18 MHz.
    //options below give a sample time of ~30 us
    ADCInit.clockSource = kADC16_ClockSourceAlt1;       //bus clock (12 MHz) div by 2 = 6 MHz
    ADCInit.clockDivider = kADC16_ClockDivider4;        //div by 4 = 1.5 MHz
    ADCInit.longSampleMode = kADC16_LongSampleCycle24;  //long sample

    //other
    ADCInit.referenceVoltageSource = kADC16_ReferenceVoltageSourceVref;
    ADCInit.resolution = kADC16_ResolutionSE12Bit;      //12 bit

    ADC16_Init(ADC0, &ADCInit);

    //software trigger (hardware trigger disabled)
    ADC16_EnableHardwareTrigger(ADC0, false);

    //self calibrate
    ADC16_DoAutoCalibration(ADC0);
}
static adc16_channel_config_t ADCChannelConfig; //channel config, file scope

uint32_t ADC_DoConversion(uint8_t Channel)
{
    //init
    ADCChannelConfig.channelNumber = Channel;
    ADCChannelConfig.enableInterruptOnConversionCompleted = false;

    //start conversion (writing SC1 with software trigger enabled starts it)
    ADC16_SetChannelConfig(ADC0, ADC_CHANNEL_GROUP, &ADCChannelConfig);

    //wait for conversion to complete
    while (!(kADC16_ChannelConversionDoneFlag & ADC16_GetChannelStatusFlags(ADC0, ADC_CHANNEL_GROUP)))
    {
        __NOP();
    }

    //return result (reading the result register clears the COCO flag)
    return ADC16_GetChannelConversionValue(ADC0, ADC_CHANNEL_GROUP);
}
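For completeness, a typical caller looks like this (the channel number and the 3.3 V reference are placeholders):

    ADC_Init();
    uint32_t raw = ADC_DoConversion(12U);          //channel 12 is a placeholder
    uint32_t millivolts = (raw * 3300U) / 4095U;   //scale 12-bit result against a 3.3 V Vref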
Thanks for any help!