The datasheet for the LPC11E6x says on pages 70 and 71 that the temperature sensor's analog response is approximately Vadc (mV) = -2.36 * T (Celsius) + 606. The only further qualification given for this formula is that the ADC must be set up in single-channel burst mode and the first 8 samples must be discarded.
However, I have found that the measured value varies significantly with the ADC sampling frequency. Averaging over 350 samples in single-channel burst mode, with the first 50 samples discarded, at an ambient temperature of 23 C:
- At 500 kHz / 25 = 20 kSps, I measure 32.16 +- 0.28 C
- At 16 MHz / 25 = 640 kSps, I measure 35.44 +- 0.26 C
- At 24 MHz / 25 = 960 kSps, I measure 39.13 +- 0.28 C
- At 48 MHz / 25 = 1.92 MSps, I measure 27.01 +- 0.248 C
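For reference, the averaging procedure behind the numbers above can be sketched as follows. The buffer-based interface is my own framing (the samples would come from the burst-mode ADC via DMA or polling; register access is omitted here):

```c
#include <stddef.h>

/* Average `total` raw ADC samples, discarding the first `discard` of them,
 * mirroring the measurement procedure described above (350 samples taken,
 * first 50 thrown away). Returns the mean of the remaining samples. */
static float average_samples(const unsigned *samples, size_t total,
                             size_t discard)
{
    unsigned long sum = 0;
    for (size_t i = discard; i < total; i++)
        sum += samples[i];
    return (float)sum / (float)(total - discard);
}
```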
This is presumably due to the variation of the ADC's input impedance with sampling frequency, although I can't explain why the trend goes up in temperature (down in voltage) until 24 MHz and then reverses, going down in temperature at 48 MHz. All readings are higher than expected, particularly at the lower sampling rates.
I have two questions:
- Which peripheral clock and sampling frequency does the datasheet's curve (Figure 37, Table 24) apply to?
- Is the sensor still linear and repeatable if a different sampling frequency is used?