I am using the K60 tower and experimenting with using the PDB to trigger DAC updates, and have a question about the effect of the "Multiplier" property on the Init_PDB bean.
I have DAC0 set up to be hardware triggered via the PDB; it's in buffered mode with half a sine wave loaded into the 16 data registers, and it's in swing mode so it will generate a sine wave for me.
I'm using a peripheral bus speed of 4 MHz, so with the multiplier set to "multiply by 1" and the prescaler set to "divide by 1", the PE displays the Counter Frequency as "4000.0000 kHz", which looks right. I have the DAC0 trigger interval value set to 23, and it displays the trigger interval time as 5.750 us, which also looks right. (BTW, it seems odd that I have to enter raw counter values rather than requesting a time in ms or us.) This generates a nice sine wave (well, nice enough given the step size) at about 5555 Hz. When I use a scope to measure the duration of the individual steps, it's 5.96 us, which is close to what the PE is telling me for the trigger interval.
Now I switch the PDB multiplier to "multiply by 10" and leave everything else the same. The PE displays the counter frequency as 40000 kHz and the DAC0 Trigger Interval Time as 0.575 us. But when I run this, my sine wave frequency is about 555 Hz and the scope-measured step size is 59.6 us, which does _not_ agree with what the PE told me; it's off by a factor of about 100.
The K60 data sheet says this about the multiplier: "This bit selects the multiplication factor of the prescaler divider for the counter clock." That makes it sound like it should be slowing things down rather than speeding things up.
So, it looks to me like the PE is calculating the DAC0 Trigger Interval Time incorrectly. Or, did I miss something?
I'm using CodeWarrior V10.1.