When I set up the micro for 16-bit differential (DIFF=1 & MODE=11), I get an effective range of -16384 to 16383 instead of -32768 to 32767. That's only a 15-bit two's-complement output instead of the 16 bits claimed. When I change it to single-ended, I get the full 0 to 65535 range as expected. When I change it to 13-bit differential (MODE=01), I also get the expected range of -2048 to 2047. So only 16-bit differential isn't working. Anyone got an idea why? Or is that a screw-up on NXP's part?
I'm pretty sure you're expecting it to work as a SEVENTEEN bit converter.
One problem here is that neither the Reference Manual nor the Data Sheet gives clear definitions of the conversion ranges. You're sort of "just meant to know how this works". Or you're meant to reverse-engineer it. I have to do that a lot.
The basic question is "what is full scale?". When in single-ended mode, the single ADC input going from 0V to 3.3V should give a converted value of 0 to 65535, so there are 2^16 levels between 0V and 3.3V, with each step being 3.3/2^16.
The full scale in differential mode goes from the differential inputs connected to 0V and 3.3V to them being swapped, connected to 3.3V and 0V. So the full-scale range is from -3.3V to +3.3V, a span of 6.6V, and so I'm pretty sure there are 2^16 levels with each step being 6.6/2^16.
You've got ADC_DM0 tied to the middle, so your signal is only going from -1.65V to +1.65V. So the proper conversion for a voltage swing of half the range is -16384 to 16383. To get ±32k with that input voltage swing you'd need a 17-bit converter.
Or to put it another way, if you left everything else alone and connected ADC_DM0 to ground your sine wave would be converted to 0 to 32767. If you connected ADC_DM0 to 3.3V you'd get -32768 to 0. If it was magically giving you +-32k with your setup as you seem to expect it would have to be able to measure +-64k.
Tom
I checked the differential 13-bit mode again and it is behaving like the differential 16-bit mode. So, as Tom said, my issue was my interpretation of the full-scale range. With consistent results from both the 13-bit and 16-bit modes, I agree that this is not a bug. Moral of the story: don't trust what you think you've found after 16 straight hours of debugging.
This was tested using the ADC0_DP0/ADC0_DM0 pair. The ADCH bits in the ADC0_SC1A register were set to 00000 and DIFF to 1. The MODE bits in the ADC0_CFG1 register were set to 11. The output range was still 15 bits (1 sign bit + 14 data bits). As far as I can tell from the reference manual, that is the only configuration required for 16-bit differential operation.
If it is a conversion problem, I'd try changing the clocking and the "extra clocks" in case you have managed a configuration where the chip doesn't have enough clocks to complete the job.
Are you running both inputs (+ve and -ve) alternately from VREFH to VREFL? If you have the negative ADC input tied to a mid-level voltage then that would explain your only seeing half the expected range.
Tom
VREFH is set to 3.3V and VREFL to ground. The ADC_DP0 input signal is a 0-3.3V sine wave, and ADC_DM0 is tied to 1.65V. The sampled values showed a sine wave the same as the input; it is not chopped in any way. So I don't think this is a conversion issue. The same code with only the MODE bits changed works in 13-bit differential but not 16-bit differential.
In 16-bit differential mode, I expect values with 1 sign bit and 15 data bits, therefore a range of -32768 to 32767. I don't understand your comment about the full range being -3.3V to 3.3V. VREFL is ground, and the ADC inputs aren't allowed below -0.3V on this chip.
The same setup was also tested in differential 13-bit mode. In that mode I got values from -4096 to 4095, indicating the expected 1 sign bit + 12 data bits. So why do the values in differential 16-bit mode have only 1 sign bit and 14 data bits?
Hi,
Please note that the 16-bit accuracy specifications listed in the manual are achievable only on the differential pins ADCx_DP0, ADCx_DM0. All other ADC channels meet the 13-bit differential / 12-bit single-ended accuracy specifications.
So please double-check that you are using the correct channels, and if not, please change to the correct ones.