I am trying to calculate the worst-case tolerance of the S32K116 comparator (using the 1 V bandgap reference).
I found that the input offset is 25 mV over the full temperature range, which is a 2.5% tolerance (25 mV / 1 V). Together with the 3% tolerance on the bandgap, that gives a 5.5% tolerance on the comparator. Since this tolerance looks a little too big to me, I am not sure if I got something wrong in this calculation.
Do I need to consider this input offset in a worst-case calculation?
How much of this tolerance can be reduced if I calibrate at room temperature?
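To make the arithmetic behind my 5.5% figure explicit, here is a small sketch (the numbers are the ones quoted above from the datasheet; the simple linear sum is my assumption about how the terms combine):

```python
# Worst-case stack-up as described above (values quoted from the question,
# not re-derived from the datasheet): 1 V bandgap with +/-3% tolerance,
# +/-25 mV comparator input offset over the full temperature range.
VBG_NOM = 1.0      # nominal bandgap reference, V
VBG_TOL = 0.03     # +/-3% bandgap tolerance
VAIO    = 0.025    # +/-25 mV analog input offset, V

offset_tol = VAIO / VBG_NOM          # 25 mV / 1 V = 2.5%
total_tol  = VBG_TOL + offset_tol    # linear worst-case sum = 5.5%

print(f"offset contribution: {offset_tol:.1%}")
print(f"linear worst case:   {total_tol:.1%}")
```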
Hello @Sid_Zhou,
It is a bit more complicated.
The Vbg, which has its own tolerance, can be used as the reference for the internal 8-bit DAC.
The DAC divides Vin2 (the reference voltage, here Vbg):
1 LSB = Vin2 / 256
The DAC error is also specified in the datasheet.
VAIO is then the offset at the input of the CMP itself, between INP and INN.
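The DAC relationship above can be sketched like this (illustrative Python; the function name is mine, the formula is the 1 LSB = Vin2 / 256 division from the datasheet):

```python
# Threshold produced by the CMP's 8-bit DAC: Vout = VOSEL * Vin2 / 256,
# where Vin2 is the selected reference (Vbg here).
def dac_threshold(vosel: int, vin2: float = 1.0) -> float:
    """DAC output voltage for an 8-bit code VOSEL in 0..255."""
    assert 0 <= vosel <= 255
    return vosel * vin2 / 256

lsb = dac_threshold(1)  # 1 LSB with Vin2 = 1 V, about 3.9 mV
print(f"1 LSB = {lsb * 1000:.1f} mV")
```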
Regards,
Daniel
The DAC error is around 1 LSB (1 V / 256 ≈ 3.9 mV); it is small compared with the tolerances on Vbg (3%) and VAIO (25 mV).
We have an application that needs to monitor a critical voltage Vtest so that it does not drop below 0.8 V. Assume this voltage is connected to INP.
1. With Vaio = -25 mV, we need to set INN to 0.825 V.
2. With Vbg = 0.97 V, we need to set VOSEL = 218 (218 × 0.97 / 256 = 0.826 V).
So with VOSEL = 218 I can make sure the MCU (any batch, any temperature) triggers when Vtest < 0.8 V.
But with this setting, when the worst case goes in the opposite direction (Vbg = 1.03 V, Vaio = +25 mV), the trigger voltage becomes 0.90 V (218 × 1.03 V / 256 + 25 mV). Thus, the worst-case trigger window of the comparator is (0.8 V, 0.9 V), a range of around 12%.
Back to our real application: we need to monitor a +3.3 V rail so that it does not drop below 3.0 V, which is only a 9% change (0.3 V / 3.3 V).
So the CMP does not seem able to guarantee both 100% trigger when Vtest < 3.0 V and 0% false trigger when Vtest > 3.3 V.
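The worst-case window above can be reproduced numerically (illustrative sketch; the sign convention, where the effective trip point is the DAC threshold plus VAIO, follows the two cases worked out in this post):

```python
# Effective trip voltage seen at INP: DAC threshold plus input offset.
# Worst cases from the post: (Vbg = 0.97 V, Vaio = -25 mV) and
# (Vbg = 1.03 V, Vaio = +25 mV), with VOSEL fixed at 218.
def trigger_voltage(vosel: int, vbg: float, vaio: float) -> float:
    return vosel * vbg / 256 + vaio

VOSEL = 218
low  = trigger_voltage(VOSEL, 0.97, -0.025)  # about 0.801 V: still trips by 0.8 V
high = trigger_voltage(VOSEL, 1.03, +0.025)  # about 0.902 V: may trip as high as 0.9 V
spread = (high - low) / low                  # roughly 12-13% window

print(f"trip window: {low:.3f} V .. {high:.3f} V ({spread:.1%})")
```

The ~0.1 V window is wider than the 0.3 V margin of the 3.0 V / 3.3 V requirement scaled down to a 1 V reference, which is why the single uncalibrated setting cannot satisfy both conditions.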
Thanks @danielmartynek