I'm currently trying to debug some unexpected behavior, namely that the ADC measurements of a power detector are very temperature dependent. We're using the MK22FN256VLL12 with the ADC in 16-bit mode, the asynchronous ADC clock (typically 5.2 MHz), and a hardware averaging level of 4. We also use an external 2.5 V voltage reference with the ADC.
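For reference, the register values corresponding to the configuration above would look roughly like this. The bit-field macros below are redefined locally so the snippet is self-contained; they are meant to mirror the Kinetis CMSIS device header, and the exact field choices (ADACK as clock source, AVGS = 0 for 4 samples) are my assumptions based on the description, not code from our project:

```c
#include <stdint.h>

/* Bit-field helpers mirroring the Kinetis CMSIS device header
   (redefined here so this snippet stands alone). */
#define ADC_CFG1_MODE(x)    (((uint32_t)(x) << 2) & 0x0Cu)  /* 3 = 16-bit mode */
#define ADC_CFG1_ADICLK(x)  ((uint32_t)(x) & 0x03u)         /* 3 = async clock (ADACK) */
#define ADC_CFG2_ADACKEN    (1u << 3)                       /* keep ADACK running */
#define ADC_SC3_AVGE        (1u << 2)                       /* enable hardware averaging */
#define ADC_SC3_AVGS(x)     ((uint32_t)(x) & 0x03u)         /* 0 = 4-sample average */

/* 16-bit mode, asynchronous ADC clock (typ. 5.2 MHz fADACK per the data sheet). */
static uint32_t adc_cfg1_value(void) { return ADC_CFG1_MODE(3) | ADC_CFG1_ADICLK(3); }
static uint32_t adc_cfg2_value(void) { return ADC_CFG2_ADACKEN; }
/* Hardware averaging enabled, 4 samples per conversion. */
static uint32_t adc_sc3_value(void)  { return ADC_SC3_AVGE | ADC_SC3_AVGS(0); }
```

In real firmware these values would be written to ADC0->CFG1, ADC0->CFG2, and ADC0->SC3 respectively.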
I have read https://www.nxp.com/docs/en/application-note/AN5250.pdf and noted that it mentions that temperature has a rather strong effect on the ADC measurement.
My question number one is:
- How large is this effect? I don't see it specified anywhere in any of the K22 data sheets, so what magnitude are we talking about?
What we do now is more or less according to the "16-bit SAR ADC calibration" procedure: we store the plus-side and minus-side gain values and the offset in non-volatile memory and have the MCU load them into the corresponding ADC registers at initialization. No further calibration is done, but I'm doubting whether this is really a good procedure, given that AN5250 states that temperature drives offset and gain error drift. So what is the recommended approach to have the ADC perform equally well at -25 and +85 degrees C? Recalibrate every time the temperature has changed by more than 10 degrees since the last calibration?
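To make the question concrete, here is a minimal sketch of the two pieces I'm asking about: combining the six plus-side (or minus-side) calibration sub-results into the PG/MG gain word as the K22 reference manual describes (sum, halve, set the MSB), and a trivial "recalibrate when the die temperature has drifted" check. The 10 °C threshold is only a guess on my part, not anything from NXP documentation:

```c
#include <stdint.h>
#include <stdbool.h>

/* Combine the six plus-side (CLP0..CLP4, CLPS) or minus-side (CLM0..CLM4,
   CLMS) calibration sub-results into the PG or MG gain word, per the
   K22 reference-manual procedure: add them, divide by two, set the MSB. */
static uint16_t adc_gain_word(uint32_t r0, uint32_t r1, uint32_t r2,
                              uint32_t r3, uint32_t r4, uint32_t rs)
{
    uint32_t sum = r0 + r1 + r2 + r3 + r4 + rs;
    return (uint16_t)((sum >> 1) | 0x8000u);
}

/* Decide whether to rerun calibration based on die-temperature drift
   since the last calibration. The threshold value is a guess, not a spec. */
static bool adc_needs_recal(float temp_now_c, float temp_at_cal_c,
                            float threshold_c)
{
    float delta = temp_now_c - temp_at_cal_c;
    if (delta < 0.0f) delta = -delta;
    return delta >= threshold_c;
}
```

In firmware, the gain words would end up in ADC0->PG and ADC0->MG after a successful SC3[CAL] run, and temp_now_c could be read from the internal temperature sensor (single-ended channel 26, if I read the reference manual correctly), so no external sensor would be needed for the trigger.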
Note: I'm not saying anything is wrong with the ADC - it could be the power sensors, as we are still testing - it is just discomforting that I haven't been able to find any data on how the ADC performs over temperature. Any information is welcome. Thanks.