This question concerns using the band-gap voltage (Vbg) to improve ADC readings in the presence of variation in the reference supply (Vref or VDD). We are implementing this compensation, but we cannot spend much CPU time on the function.
The reference manual simply states that:
"In order to compensate for VDDA reference voltage variation in this case, the reference voltage
is measured during production test using the internal reference voltage VBG, which has a narrow variation
over temperature and external voltage supply. VBG is mapped to an internal channel of each ADC module."
This does not clarify which "reference voltage variations" the compensation covers.
From statements by NXP personnel and from comments in the sample code provided in "S12ZVC192-Voltage_measurement-CW106", it is clear that applying Vbg to the ADC samples can compensate for a low supply voltage causing Vref/VDD to fall below 5 volts. We do not need this; we need compensation for other variations.
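For clarity, here is a minimal sketch of the ratiometric correction we understand the manual to describe: because both the channel sample and the Vbg sample scale with the (unknown) actual Vref, Vref cancels out when the channel result is scaled by the Vbg result. The nominal band-gap value and the 12-bit full scale below are assumptions for illustration; the actual Vbg specification must come from the device data sheet or the factory-measured value.

```c
#include <stdint.h>

/* Assumed values for illustration only -- check the S12ZVM data sheet
 * (or the factory-measured calibration value) for the real Vbg figure. */
#define VBG_NOMINAL_MV  1200u   /* assumed nominal band-gap voltage, mV */

/* Ratiometric correction:
 *   V_in = N_ch * Vref / FS   and   N_bg = Vbg * FS / Vref
 *   =>    V_in = N_ch * Vbg / N_bg      (Vref and FS cancel)
 *
 * n_ch: raw ADC result for the channel of interest
 * n_bg: raw ADC result for the internal Vbg channel
 * returns: compensated input voltage in millivolts
 */
static uint32_t adc_to_mv(uint16_t n_ch, uint16_t n_bg)
{
    return ((uint32_t)n_ch * VBG_NOMINAL_MV) / n_bg;
}
```

Note the CPU cost is one multiply and one divide per compensated sample, which is part of why the "read Vbg once vs. continuously" decision below matters to us.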
So our question is: can the recommended use of Vbg provide SOME compensation for the following?
- Variation in device temperature
- Variation from one S12ZVM device to the next
We are not looking for a guaranteed amount of correction or improvement. The answer to this question will determine whether our design reads Vbg only once at each power-up or samples it continuously during application operation. (Using EEPROM, we might even read it only once per device in our ECU module factory.)