Accurate time-stamping



Jump to solution
2,277 Views
gmscribe
Contributor II

Hi guys,

I'm currently working with a Kinetis K70 and want to time-stamp certain readings I'm receiving from the ADC.

My issue is that I need a microsecond timebase but, at the same time, a counting period of roughly one second, which means a 16-bit counter is too small.

I have tried using the PIT, polling the counter value and converting it to microseconds; however, I'm seeing significant inconsistencies in the returned values, with my relative timings sometimes out by a factor of 10. Perhaps the PIT has some unpredictable wait states and can't be used in this manner?

Can anyone suggest a viable alternative?

Many thanks

0 Kudos
1 Solution
1,140 Views
gmscribe
Contributor II

Hi guys,

Thanks for the replies. After turning off interrupts, fetching from the PIT registers in a specific order and, more importantly, fixing some ADC sample jitter that turned out to be the main cause of the problem, I'm now seeing timings accurate to around ±100 ns from the PIT. Apologies for some time-wasting.

If I ever need sub-microsecond accuracy, I think Hui_Ma's suggestion will likely produce less variation. Note also that the Ethernet IEEE 1588 timestamp module (K60/K70) can be used to great effect, as it is a high-speed 32-bit timer with a timestamp buffer.

Many thanks


0 Kudos
3 Replies
1,140 Views
ndavies
Contributor V

The problem isn't the PIT timer. It delivers an interrupt or DMA request on a very timely basis and is far more accurate than your requirements demand.

Set up your PIT timer to trigger an ADC conversion at a consistent rate and let it free-run. Have the PIT timer routine count the number of ADC readings it has triggered; you then just need to keep the count and the ADC reading together. If you need a 1 millisecond sample time, have your ADC triggered every millisecond; each count of an ADC trigger then represents 1 millisecond. If you use a 32-bit int to count the ADC conversions, it rolls over roughly every 49 days at 1 millisecond resolution.

Yes, there will be some jitter between the time the ADC is triggered and when the results are available, but this should be well below your timing resolution.

This works best if you use the PIT to drive the PDB, which then triggers the ADC, and then DMA the ADC results into an array. Using the DMA will reduce the load on the CPU. Take a look at application note AN4590 for how to use the DMA with ADC conversions.

We use the PIT to trigger the PDB, which triggers the first ADC reading in a list of ADC conversions, and we use two DMA channels to cycle through the set of conversions. So one trigger yields several ADC channel conversions at the PIT-triggered time.

This method should let you get well below 1 millisecond sampling periods.

0 Kudos
1,140 Views
Hui_Ma
NXP TechSupport
NXP TechSupport

Hi,

You could also consider using the system tick timer (SysTick) provided by the ARM Cortex-M4 core.

For more details, please check the link below:

http://infocenter.arm.com/help/index.jsp?topic=/com.arm.doc.dui0553a/Babieigh.html

Hope it helps.

0 Kudos