K22: Calibrating ADC with VDDA = 3.3V, VREFH = 2.5V?

troelsoesteraa
Contributor III

Hi there,

I found one discussion about using dual 3.3V supplies for VDDA and VREFH and whether or not that was a good idea, but I have searched and couldn't find anything on my particular situation.

My device: MK22FN256VLL12

I need to use a 2.5V external, high precision voltage reference for the ADC, so that is connected at the VREFH pin.

However, I also want to ensure that the performance/gain/offset of the ADC is as good as possible, so I am implementing the calibration procedure as described in AN3949.
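For reference, this is roughly the sequence I'm implementing - a minimal sketch assuming CMSIS-style register names from the Kinetis device header, and assuming averaging and the ADC clock are already configured (the header name below is also just my assumption, adjust for your SDK/part):

#include "MK22F25612.h"   /* device header - name depends on your SDK/part */

/* Run the ADC0 self-calibration and program the plus/minus-side gain registers.
 * Returns 0 on success, -1 if calibration failed (SC3[CALF] set). */
static int adc0_calibrate(void)
{
    uint32_t sum;

    ADC0->SC3 |= ADC_SC3_CAL_MASK;                       /* start calibration   */

    while ((ADC0->SC1[0] & ADC_SC1_COCO_MASK) == 0) {    /* wait for completion */
    }
    if (ADC0->SC3 & ADC_SC3_CALF_MASK) {                 /* calibration failed? */
        return -1;
    }

    /* Plus-side gain: sum the CLP results, halve, set the MSB, write to PG */
    sum = ADC0->CLP0 + ADC0->CLP1 + ADC0->CLP2 + ADC0->CLP3 + ADC0->CLP4 + ADC0->CLPS;
    ADC0->PG = (uint16_t)((sum / 2U) | 0x8000U);

    /* Minus-side gain: same procedure with the CLM results, write to MG */
    sum = ADC0->CLM0 + ADC0->CLM1 + ADC0->CLM2 + ADC0->CLM3 + ADC0->CLM4 + ADC0->CLMS;
    ADC0->MG = (uint16_t)((sum / 2U) | 0x8000U);

    return 0;
}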

AN3949 states that VREFH and VDDA should be above 3V. The Reference manual for my device states that VREFH = VDDA. I meet neither of those requirements.

- What would your recommendation be in such a case?

- Can I just multiply the sum of CLP and CLM registers by 3.3/2.5 or something?

I'd rather not spend money on a DC switch that can be controlled by a GPIO, if this can be avoided.

maxkessler
Contributor I

Yeah, thanks

troelsoesteraa
Contributor III

Thanks for the reply, it is much appreciated.

I agree that "VREFH = VDDA" is not a strict requirement, but it does seem to be strongly suggested during ADC calibration.

For the MCU I'm using, it is described under "33.4.6 Calibration function", pg. 740 in the reference manual:

For best calibration results:
• Set hardware averaging to maximum, that is, SC3[AVGE]=1 and SC3[AVGS]=11
for an average of 32
• Set ADC clock frequency fADCK less than or equal to 4 MHz
• VREFH=VDDA
• Calibrate at nominal voltage and temperature
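For what it's worth, the first two bullets translate into something like this register setup before triggering calibration (just a sketch - CMSIS register names assumed, and I'm assuming a 48 MHz bus clock feeding the ADC; the divider has to be chosen so fADCK ends up at or below 4 MHz):

/* Pre-calibration setup: maximum hardware averaging, fADCK <= 4 MHz.
 * Assumes a 48 MHz bus clock; adjust ADIV/ADICLK for your clock tree. */
ADC0->CFG1 = ADC_CFG1_ADICLK(1)    /* input clock = bus clock / 2           */
           | ADC_CFG1_ADIV(3)      /* further divide by 8 -> 3 MHz ADCK     */
           | ADC_CFG1_MODE(3);     /* 16-bit conversions                    */

ADC0->SC3  = ADC_SC3_AVGE_MASK     /* enable hardware averaging             */
           | ADC_SC3_AVGS(3);      /* AVGS = 11 -> average of 32 samples    */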

And thanks for the circuit - instead of your 3.0V VREFH, I'm running 2.5V, so we're somewhat in the same boat regarding unequal VREFH and VDDA. :)

I'd really appreciate some more details on why, for calibration, VREFH should be at VDDA potential - e.g. what happens internally during calibration, which voltage references are used, and how this affects the calibration result - but I've been unable to find anything on this.

The end result of this might be a test setup where I measure different voltages between 0 and 2.5V with and without calibration, but then again, that might not tell the whole story with only a few samples to try it on.
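If I do set that up, the measurement side would be roughly this (hypothetical helper, blocking single conversion, 16-bit single-ended mode and the 2.5V reference assumed):

/* Blocking single conversion on the given channel, result in millivolts.
 * Assumes 16-bit single-ended mode and VREFH = 2.5 V (2500 mV full scale). */
static uint32_t adc0_read_mv(uint8_t channel)
{
    ADC0->SC1[0] = ADC_SC1_ADCH(channel);               /* start conversion */
    while ((ADC0->SC1[0] & ADC_SC1_COCO_MASK) == 0) {   /* wait for COCO    */
    }
    uint32_t raw = ADC0->R[0];                          /* 0..65535         */
    return (raw * 2500U) / 65535U;                      /* scale by VREFH   */
}

I'd log that for a handful of known input voltages, once with the default gain registers and once after running the calibration.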

egoodii
Senior Contributor III

I don't think 'for best results' means anything more than that, given a fixed internal noise level, the maximum ENOB is achieved with the maximum Vref. The specs are all given at 3V; I chose 3V as 'near the maximum I could regulate' while ensuring it stays <= my 3.3V supply.

egoodii
Senior Contributor III

I'm not sure where you get the 'VrefH = Vdda' requirement.  The datasheets I look at say this:

[datasheet excerpt: VREFH operating range]

So, anything from 1.13V up to Vdda is allowed. Naturally enough, the accuracy specs degrade at lower VrefH.

Now I DO agree that at ALL times Vdda must be within 100mV of Vdd:

[datasheet excerpt: VDDA must stay within 100 mV of VDD]

so I can't see the logic in separate 'dual' analog and digital supplies. I run an accurate 3V reference myself:

[schematic: 3V precision reference feeding VREFH]

The calibration procedure should relate directly to whatever VrefH you are providing. I wouldn't scale any of the results.
