I've noticed that voltage readings are off by almost half a volt when reading a 0 V channel.
First the channel was set to 8 V and an ADC measurement was made. Then the channel was set to 0 V, allowed to reach a steady state, and 10 ADC conversions were made. Both the ADC readings and the oscilloscope show that the 0 V reading is off by ~500 mV on the first read and ~50 mV on the remaining reads:
Each pair of "spikes" corresponds directly to an ADC read.
It would make sense for the first reading to be off, perhaps due to internal ADC capacitance, but that doesn't explain why the subsequent reads are always off by ~40 mV. Bottom line: it never reads the true 0 V as it should.
1. The ADC calibration offset is 0, and the ADC calibration gain is ~1.022.
2. The ADC conversion is set to 1 sample in high-speed conversion mode with a conversion time of 6.25 us.
3. The 0 V reading is connected to a 100 kOhm pull-down resistor.
   - It is also connected in parallel, through a 392 kOhm resistor and a MOSFET in the off state, to the 8 V channel.
Hello Dane,
These voltage disturbances (voltage drops/spikes) at the ADC input can be reduced by choosing the correct conversion time and external RC components. I recommend checking the following application note, which provides guidelines for achieving optimal performance in these circumstances.
https://www.nxp.com/docs/en/application-note/AN4373.pdf
Best regards,
Felipe
That application note covered exactly our issue. Thanks for the help!
Increasing the conversion time solved the issue.