We are using the MPC5746C microcontroller in one of our projects, and we have observed the following issue with the ADC module:
We supply a 5 V reference voltage to our custom board, but because of the resistor tolerances in the voltage-divider circuit at the regulator output, the reference actually sits slightly higher, at approximately 5.04 V (which is unavoidable).
So, when we convert the ADC count read from the CDATA field of the ADC_0.CDR register to a voltage, the result is around 20-50 mV higher than the actual ADC input voltage.
To resolve this issue, we would appreciate suggestions on the following points:
1. We are following the ADC calibration procedure described in the MPC5746C reference manual. Apart from this, is there any other procedure to follow to ensure the ADC is properly calibrated?
2. As mentioned above, because of the slight shift in the reference voltage, the ADC output value does not match the applied input voltage.
According to the MPC5748G reference manual, in case of board-level issues, can the ADC0_OFSGNUSR register be used to cancel out this slight difference in the ADC output, or is there another recommended method to resolve it?
I'm actually also interested in this. Right now I'm using the DEVKIT-MPC5744P with an external voltage reference (an LT1790 at 4.096 V), which is quite precise (no ripple seen so far). The internal reference voltage (3V3 or 5V) is so noisy that the ADC calibration fails about 1 time in 12. There is still some HF noise on the line, but nothing too serious. (It's still a work in progress.)
The thing is that the tested ADCs show a slight gain error above 900-1000 mV. When V_in is below that value they are quite accurate. At maximum V_in (~4.096 V) the reading is off by around 10 LSB, which is a relative error of roughly 0.25%. So if anybody knows how to do a better job here, please share.