I need to understand how the hardware averaging works so that I can use it effectively without altering my read timing. Suppose my model samples the ADC every 0.1 ms using the PDB trigger and I set hardware averaging to 4 samples: will I then only get a result every 0.4 ms? Or does the hardware averaging take my step size into account and sample every 0.025 ms, so that I still get one averaged sample every 0.1 ms?
For reference, I'm using a tapped delay to compute an RMS value for the sine-wave input signal, and I need to know exactly how many delays I need. I want to detect when that RMS value goes above a threshold and stays there for a full sample period. For instance, if the RMS value rises above 100 mV and stays above it for 40 ms, I want to trigger a flag; if it crosses the threshold but dips back below in less than 40 ms, I will disregard it.
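To make the detection logic concrete, here is a minimal sketch of the tapped-delay RMS plus persistence check described above. It assumes a 0.1 ms sample period (so 40 ms of persistence is 400 consecutive samples) and an arbitrary 100-tap RMS window; the actual window length and persistence count depend on how the hardware averaging affects the effective sample rate, which is exactly the open question. All names and constants here are illustrative, not from any vendor API.

```python
import math
from collections import deque

SAMPLE_PERIOD_S = 0.1e-3   # assumed ADC sample period (0.1 ms per PDB trigger)
RMS_WINDOW_TAPS = 100      # assumed tapped-delay length (10 ms of history)
THRESHOLD_V = 0.100        # 100 mV RMS threshold
HOLD_TIME_S = 40e-3        # RMS must stay above threshold this long
HOLD_SAMPLES = int(round(HOLD_TIME_S / SAMPLE_PERIOD_S))  # 400 samples

def make_detector():
    """Return a per-sample step function implementing the flag logic."""
    window = deque([0.0] * RMS_WINDOW_TAPS, maxlen=RMS_WINDOW_TAPS)
    above_count = 0

    def step(sample_v):
        nonlocal above_count
        window.append(sample_v)  # tapped delay: oldest sample falls off
        rms = math.sqrt(sum(x * x for x in window) / RMS_WINDOW_TAPS)
        if rms > THRESHOLD_V:
            above_count += 1     # accumulate time spent above threshold
        else:
            above_count = 0      # a dip below resets the 40 ms timer
        return above_count >= HOLD_SAMPLES  # flag only after sustained excursion

    return step
```

Note that if the averaging does stretch the result period to 0.4 ms, both the window length and the persistence count have to shrink by 4x (40 ms would then be 100 results, not 400), which is why the timing question matters for sizing the delay line.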