DEVKIT-S12ZVC ADC interference and wrong temperature


nelson18
Contributor II

I am using the DEVKIT-S12ZVC with ADC0 set up from Processor Expert.

Problem:
I am getting ADC channel interference: a channel's reading is skewed when the channel before it changes.
I am also getting a very skewed temperature reading of about 1.2 V, when it should be in the 1.7-2.1 V range based on the formula.

Setup:

10-bit resolution with a conversion time of 57 µs, each sample taking 24 ADC clock cycles (144.5 µs).
I have 5 channels, each doing its own sample. All are set to the VRH_1/VRL_1 reference (a readout sketch follows the channel list below).

Channels

PAD11
PLO_HVI0_KWL0
BATS_VSUP_SENSE_VOLTAGE
CHIP_Temp_Sensor__or_Band_Gap
PAD10
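
For context, here is a minimal sketch of how the sequence could be read back, using hypothetical Processor Expert-style helper names (AD0_Measure, AD0_GetChanValue16); the actual calls are whatever the generated ADC0 component provides.

#include <stdint.h>

#define NUM_CHANNELS 5u

/* Hypothetical Processor Expert-style helpers; substitute the functions the
   generated ADC0 component actually provides. */
extern void AD0_Measure(uint8_t waitForResult);
extern void AD0_GetChanValue16(uint8_t channel, uint16_t *value);

/* Channel order as configured in the conversion sequence. */
enum {
    CH_PAD11 = 0,
    CH_PLO_HVI0_KWL0,
    CH_BATS_VSUP_SENSE_VOLTAGE,
    CH_CHIP_TEMP_SENSOR_OR_BAND_GAP,
    CH_PAD10
};

static uint16_t results[NUM_CHANNELS];

/* Run one pass over the 5-channel sequence and copy out the 10-bit results. */
void ReadAllChannels(void)
{
    uint8_t ch;

    AD0_Measure(1u);  /* start the sequence and wait for completion */
    for (ch = 0u; ch < NUM_CHANNELS; ch++) {
        AD0_GetChanValue16(ch, &results[ch]);
    }
}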

Update:

The ADC seems to be working now. I had to set
CPMUHTCTL_HTE = 1; //enable internal temp sensor
CPMUHTCTL_VSEL = 0; //select temp sensor and not bandgap

for the temperature sensor.
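
As a rough end-to-end illustration, here is a minimal sketch of that sequence, assuming VRH_1 - VRL_1 is 5.0 V and a hypothetical ReadTempChannelRaw() that returns the 10-bit result of the CHIP_Temp_Sensor channel; CPMUHTCTL comes from the device header.

#include <stdint.h>
#include "derivative.h"              /* device header that declares CPMUHTCTL */

#define VREF_VOLTS      5.0f         /* assumed VRH_1 - VRL_1 span */
#define ADC_FULL_SCALE  1023.0f      /* 10-bit resolution */

extern uint16_t ReadTempChannelRaw(void); /* hypothetical: raw 10-bit result of the temp channel */

/* Enable the internal sensor, route it to the ADC, and return its voltage. */
float ReadTempSensorVolts(void)
{
    uint16_t raw;

    CPMUHTCTL_HTE  = 1;  /* enable internal temp sensor */
    CPMUHTCTL_VSEL = 0;  /* select temp sensor and not bandgap */

    raw = ReadTempChannelRaw();
    return ((float)raw / ADC_FULL_SCALE) * VREF_VOLTS;
}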

I think this also fixed the bleed between channels, except for the last conversion still affecting the first conversion of the next read.

I am still trying to figure out why I am getting bleed from the last conversion into the first conversion. All of the conversions complete well before my next start-measurement call.


1 Solution
nelson18
Contributor II

CPMUHTCTL_HTE = 1, CPMUHTCTL_VSEL = 1 selects the bandgap voltage; CPMUHTCTL_HTE = 1, CPMUHTCTL_VSEL = 0 selects the sensor voltage.

I was missing this step, which Processor Expert does not do.
Now I am getting a somewhat sensible reading of 37 °C at 17 °C ambient when I read the temperature alone.
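
As a minimal sketch of switching between the two sources, assuming the same kind of hypothetical raw-read helper for the internal channel:

#include <stdint.h>
#include "derivative.h"              /* device header that declares CPMUHTCTL */

extern uint16_t ReadTempChannelRaw(void); /* hypothetical: raw result of the internal channel */

/* Select bandgap (VSEL = 1) or temperature sensor (VSEL = 0), then convert. */
uint16_t ReadInternalSource(uint8_t wantBandgap)
{
    CPMUHTCTL_HTE  = 1;                    /* the sensor/bandgap must be enabled first */
    CPMUHTCTL_VSEL = wantBandgap ? 1 : 0;  /* 1 = bandgap voltage, 0 = temp sensor */
    return ReadTempChannelRaw();
}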
