ADC problem with HCS08DZ/DN types: a delay must be inserted before the ADC is started, and the delay is too long


2,359 Views
wander_zhang
Contributor I

In my application I have to periodically measure a voltage on an analog pin. The voltage is switched dynamically: it is normally off (0 V), and when switched on it rises to a certain value, say 3.26 V, with a rise time of only about 5 µs. The ADC resolution is set to 10 bit, so in theory the digital value should be 1024 * 3.26 / 3.3 = 1011. In practice it is not. If I start the conversion right after the voltage is switched on, the digital value I get is much smaller than the theoretical 1011. If I add a small delay after the voltage is switched on and only then start the ADC, the value gets bigger, but it is still smaller than 1011. Only when the delay reaches about 80 µs is the value close to 1011. The following is an experiment I did; the voltage is 3.26 V.

 

Delay time (µs)    Digital value
20                 760
30                 886
40                 950
50                 980
60                 1003
70                 1007
80                 1010

 

 

That delay is a bit too long for my application.

 

Has anyone had the same problem? Could it be caused by an incorrect setting of the ADC registers, or is it a property of this microcontroller?

4 Replies

1,158 Views
wander_zhang
Contributor I

Hello,

 

The problem is solved. :smileyhappy:

 

The ADC clock is the same as the bus clock, which is 16 MHz, and the short sample time is selected. The input circuit is a simple CR circuit. The measurement point was before the series resistor to the microcontroller (not on the microcontroller side), and the input resistance was too big. I have chosen a smaller resistor and it now works very well. The rise time is 5 µs, and since the settling time should be at least 3 times the rise time, I think the delay should be at least 15 µs.

 

thanks a lot for the information.

 

Regards

 

C.Zhang

 

 


1,158 Views
bigmac
Specialist III

Hello,

 

You do not say what ADC clock frequency you are using, or whether you have selected the long sampling interval. Let's assume that you have, and that the ADC clock is 1 MHz; the acquisition plus conversion period would then be about 43 microseconds.

 

Where did you measure the waveform rise time?  Between this point and the input pin, have you added any series resistance, or shunt capacitance?  If so, the rise time may be considerably degraded.

 

Keep in mind that the required settling time will be many times the rise time to achieve a reading within 1 LSB at 10-bit resolution.  For example, if the equivalent circuit is represented by a simple CR time constant, the settling time would need to be 7 times the time constant value.  Measuring the rise time to 90 percent of the steady value, the settling period would need to be at least three times the measured rise time.  With the equivalent of more than one CR time constant present, the settling time may need to be considerably longer, compared with the rise time.

 

Regards,

Mac

 


1,158 Views
wander_zhang
Contributor I

Hello Mr Peg,

 

 

Thank you very much for your quick reply. Your answer is very helpful and I hope the problem is resolved.

 

Regards 

 

C.Zhang

 

 


1,158 Views
peg
Senior Contributor IV

Hello and welcome to the fora, wander_zhang.

 

It sounds like you need to buffer your voltage source to present a lower-impedance source to the converter. With a high-impedance source you will see this sort of thing occur.

 
