Hi,
I would like to understand how non-linearity in the ADC output is corrected during calibration.
(Reference: Figure 10, AN13413, Rev 0)
As the figure shows, how do I perform the calibration by applying different inputs across the range of the ADC? I am using the low-level drivers and have the following function:
Adc_Sar_Ip_StatusType Adc_Sar_Ip_DoCalibration(const uint32 u32Instance)
(Reference: S32K3xx RM, Rev 8)
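For reference, this is roughly how I call it today. A minimal sketch: the header name Adc_Sar_Ip.h and the status values ADC_SAR_IP_STATUS_SUCCESS / ADC_SAR_IP_STATUS_ERROR are taken from my RTD version, and the instance number and retry count are just placeholders from my setup.

#include "Adc_Sar_Ip.h"

#define ADC_INSTANCE     (0U)   /* ADC0; placeholder instance for illustration */
#define CAL_MAX_RETRIES  (3U)   /* arbitrary retry budget */

static boolean Adc_RunCalibration(void)
{
    uint32 u32Try;
    Adc_Sar_Ip_StatusType eStatus = ADC_SAR_IP_STATUS_ERROR; /* assumed enum value */

    /* Called after Adc_Sar_Ip_Init() and before any functional conversions.
     * Retry a few times in case a single calibration attempt reports an error. */
    for (u32Try = 0U; u32Try < CAL_MAX_RETRIES; u32Try++)
    {
        eStatus = Adc_Sar_Ip_DoCalibration(ADC_INSTANCE);
        if (ADC_SAR_IP_STATUS_SUCCESS == eStatus)
        {
            break;
        }
    }
    return (boolean)(ADC_SAR_IP_STATUS_SUCCESS == eStatus);
}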
What is the 'known reference voltage' that is sampled during calibration? Is it the bandgap voltage, and what exact value is used? The data sheet specifies 1.2 V, but when I read the bandgap voltage I get 1.19 V, i.e. a deviation of 0.01 V (about 0.8% of 1.2 V). Is this deviation normally acceptable?
In the values I read during functional conversions, I see an error of 0.2 V relative to the voltage that is applied, even though calibration reported success (the function above returned a success status). I am sure the power supply is not at fault, since its output matches the requested voltage; we have also confirmed this with a DMM.
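For scale, this is the conversion I use to sanity-check the readings. A sketch assuming 12-bit resolution and VREFH = 5.0 V, which match my configuration (substitute yours): 1 LSB is then 5.0 V / 4096, roughly 1.22 mV, so a 0.2 V error corresponds to roughly 164 LSB, far larger than any residual error I would expect calibration to leave behind.

#include <stdint.h>

#define ADC_RESOLUTION_BITS  (12U)    /* assumed; check your configuration */
#define VREFH_VOLTS          (5.0f)   /* assumed board reference voltage */

static float Adc_CountsToVolts(uint16_t u16Raw)
{
    /* Straight-line transfer function: Vin = raw * VREF / 2^N.
     * With 5.0 V / 4096, 1 LSB is about 1.22 mV, so a 0.2 V offset
     * is about 164 LSB. */
    return ((float)u16Raw * VREFH_VOLTS) / (float)(1UL << ADC_RESOLUTION_BITS);
}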
How do I ensure that the calibration is done correctly?
Is there anything additional I need to consider during calibration?