RS08: Accuracy of programmed trim


4,561 Views
jbotte
Contributor I
I need to program the RS08's DCO to a particular frequency using the "Enable Trimming/DCO Output Frequency (Hz)" field in the Communications Setting box. My question is: what is the accuracy of that setting?

I understand the DCO itself is +/-2% over the temperature and voltage range of the device (and the resolution of the trim is +/-0.2%), but what is the accuracy of the algorithm that determines the untrimmed ICSOUT frequency of the chip and sets the trim value in Flash? So, for instance, if I were using the setting of "10000000" (10MHz) for the DCO at a particular voltage and temperature, what accuracy can I expect from the calculated setting (presuming no DCO drift from when the setting was calculated)? I need this value for my worst-case timing calculations to ensure that I don't violate the restrictions that I have for the application I am working on.

FYI, I am currently using CodeWarrior v5.7.0, which includes v6.1 of the debugger/programmer, and a USB-connected SofTec DEMO9RS08KA2 board.


11 Replies

bigmac
Specialist III
Hello,
 
I do not know what timing method the onboard USB-BDM interface uses during the calibration process, but I might assume its accuracy would be better than the trim resolution.  However, other questions do arise -
 
I would assume that the calibration process will program a byte value to flash address $3FFA, and the user code must transfer this value to ICSTRM register to achieve calibration.  This would seem to ignore the FTRIM bit within ICSSC.  The data sheet for the KA2 doesn't specifically say, but if we assume that the FTRIM bit results in a frequency shift of 0.4% (to achieve +/-0.2% resolution), this would imply that the ICSTRM value alone would provide a minimum step of about 0.8%, for a resolution of +/-0.4%, or thereabouts, as a result of the calibration process at room temperature.  Unfortunately, the data sheet appears not to include an indicative plot of frequency versus temperature for the internal oscillator, to gauge the likely frequency variation over a specific temperature range.
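To put rough numbers on that (just a sketch, assuming the 0.4% FTRIM shift guessed at above):

#include <stdio.h>

/* Back-of-envelope trim step sizes around a 20MHz DCO, assuming an
 * FTRIM shift of 0.4% and therefore an ICSTRM LSB step of about 0.8%. */
int main(void)
{
    double f_dco       = 20e6;            /* nominal DCO frequency, Hz */
    double icstrm_step = 0.008 * f_dco;   /* ~160 kHz per ICSTRM LSB   */
    double ftrim_step  = 0.004 * f_dco;   /* ~80 kHz per FTRIM toggle  */

    printf("ICSTRM LSB step ~ %.0f kHz (resolution +/-0.4%%)\n", icstrm_step / 1e3);
    printf("FTRIM step      ~ %.0f kHz (resolution +/-0.2%%)\n", ftrim_step / 1e3);
    return 0;
}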
 
Since the trim range is quite wide, the assumed trim frequency will need to be ascertained.  I suspect it may be 31.25kHz, the "nominal" figure in the data sheet.  However, in some instances, 32.768kHz may be more useful.  I am not sure whether the calibration process permits this choice.
 
Finally, there is likely to be considerable unit-to-unit variation for the required trim value, so each unit will require specific calibration.
 
Regards,
Mac
 

jbotte
Contributor I
The calibration process is configured from the "True-Time Simulator & Real-Time Debugger" by selecting the "SofTec-RS08->Communication..." menu item, which brings up the "MCU Configuration" dialogue box (I have the DEMORS08KA2 selected in the Hardware Model pull-down for my case). From there, the Communications Settings button brings up the Communications Setting dialogue (go figure...), which has a check box to "Enable Trimming" and a text field to enter the desired DCO frequency in Hz (e.g. 20000000). The box indicates that it will use the calculated trim value to set the $FFFA and $FFFB Flash locations when the part is programmed. So, it does appear that the FTRIM bit will be set (the PDF "Debugger_HC08.pdf" that comes with CodeWarrior has more info... my version had it on p. 628/629).

As for 32.768kHz... that's well within the range of the ICS specifications for the chip (which are given as 31.25kHz to 39.0625kHz, trimmed). So that's perfectly feasible. In that case you'd just trim for a DCO of 16777216 (the reference-oscillator-to-DCO scale is 1:512) and accept the calibration error, trim resolution limitation, and deviation over temperature and voltage. In my case, what I'm trying to do is set the clock to as high a frequency as possible such that, given the calibration errors (in determining the trim value by the software/hardware configuration I have, as well as in bulk programming in production), trim resolution errors, and errors introduced by drift over voltage and temperature, the chip won't be running out of specification (i.e. without going over 20MHz in the worst case of all the errors added together, to guarantee I will never exceed the published f_BUSmax. specification of 10MHz).

Application note AN3041 has much more detailed information on the ICS module (including some much better operational descriptions than the RS08's data sheet has). AN2496 describes the calibration methods for the earlier ICG module, but AN3041 says that the ICS is a stripped-down version of the ICG and that the same iterative approach should be used to trim the ICS (and that the algorithms should be portable). But... that still doesn't answer what the trimming accuracy of the specific setup I am using (software, USB, and hardware) actually is.

And finally, there is apparently tremendous variability between individual chips re: the internal reference oscillator frequency. This is driven home in the entirely annoying mask errata sheet "MSE9RS08KA2_2M44C" for the RS08 (and its ICS module): "The trim value for any particular clock frequency is unique to each device." (which basically limits the reliable bus clock speed to 5MHz due to the bug).

bigmac
Specialist III
Hello,
 
If the bus frequency is not critical for the application, I cannot quite see the problem.  Simply calibrate each device to achieve a bus frequency of 10MHz.  But make sure you apply the calibration value to the ICSTRM register before reducing the divider value from the power-up default.
 
If the bus frequency has an overall accuracy of 1-2 percent, I wouldn't consider this to be a problem for the MCU itself, even when nominally at the maximum operating frequency.  I think the main issue would be if calibration is not done, and the frequency error could easily be +/-25 percent.
 
I agree that the calibration value will be unique to each device, and calibration should be performed at the same time the device is programmed.  It is probably a good idea to check whether there is a valid calibration value present at address $3FFA (i.e. the value is not $FF for the unprogrammed state), within your initialisation code, to protect against omission of calibration during production.
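A minimal initialisation sketch along those lines (the register addresses below are assumptions on my part; in a real project the derivative header provides the declarations):

#define NV_ICSTRM  (*(volatile unsigned char *)0x3FFA)  /* trim stored in flash */
#define ICSTRM     (*(volatile unsigned char *)0x0016)  /* ICS trim register    */
#define ICSC2      (*(volatile unsigned char *)0x0015)  /* BDIV in bits 7:6     */

void ics_init(void)
{
    if (NV_ICSTRM == 0xFF) {
        /* Erased flash, i.e. calibration was skipped during production:
         * stay on the safe power-up divider. */
        return;
    }
    ICSTRM = NV_ICSTRM;   /* apply the device-specific trim first...           */
    ICSC2 &= ~0xC0;       /* ...then set BDIV = divide-by-1 for full bus speed */
}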
 
Regards,
Mac
 

jbotte
Contributor I
The "problem" I'm trying to address is how to get maximum performance (highest bus speed) of the chip without exceeding the published fBUSmax. of 10MHz under worst case trim setting accuracy, and over the full temperature and voltage range. Targetting a DCO frequency of 20MHz would allow the published specification to be violated (I agree that the chip would probably work fine, but I want to ensure that I've met the published maximums). If I were aiming for a particular frequency (i.e. if I were trying to do an RS232 interface or something), I'd target that frequency and then just have to put up with the variability over the range of operation of the product.
 

noritan
Contributor I
My assumption about how the tool trims the oscillator is as follows.

The Background Debug Mode (BDM) protocol has a command named SYNC, which makes the target device drive the BKGD pin LOW for 128 BDC clock cycles.  The host can therefore measure the BDC clock period with an input-capture function connected to the BKGD pin.  The trimming strategy itself depends on the firmware of the tool: it could do a binary search, or a direct calculation based on the 0.2% resolution of the TRIM value.
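If my assumption holds, the host-side arithmetic would look something like this (a sketch only; the function and parameter names are mine, for illustration):

/* SYNC holds BKGD low for exactly 128 BDC clock cycles, so from an
 * input-capture measurement of that pulse:
 *     f_BDC = 128 * f_timer / ticks
 * where f_timer is the capture timebase and ticks is the measured
 * pulse width in timer counts. */
double bdc_freq_hz(double f_timer_hz, unsigned long ticks)
{
    return 128.0 * f_timer_hz / (double)ticks;
}

For example, a 24MHz capture timebase that measures a 384-tick pulse implies an 8MHz BDC clock.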

If my assumption is right, I recommend reducing stray capacitance at the BKGD pin to get an accurate TRIM value.

My favorite DCO frequency is 19.6608MHz, with the internal reference trimmed to 38.400kHz.  It is suitable for higher UART communication rates.
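The arithmetic behind that choice, using the 1:512 reference-to-DCO ratio and the divide-by-two ICSOUT-to-bus relationship mentioned earlier in this thread (a standalone sketch):

#include <stdio.h>

/* 38.400kHz reference * 512 = 19.6608MHz DCO; the resulting bus clock
 * divides exactly into the common baud rates. */
int main(void)
{
    unsigned long f_ref = 38400UL;        /* trimmed internal reference, Hz */
    unsigned long f_dco = f_ref * 512UL;  /* 19,660,800 Hz                  */
    unsigned long f_bus = f_dco / 2UL;    /*  9,830,400 Hz bus clock        */

    printf("DCO = %lu Hz, bus = %lu Hz\n", f_dco, f_bus);
    printf("9600 baud bit time  = %lu bus cycles (exact)\n", f_bus / 9600UL);
    printf("38400 baud bit time = %lu bus cycles (exact)\n", f_bus / 38400UL);
    return 0;
}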

jbotte
Contributor I
Thank you for the additional support! I am quite confident now that I can meet the timing requirements of my application with the RS08.


peg
Senior Contributor IV
Hi,
 
Not sure how relevant this is to the RS08, but I just tripped over a statement by a P&E employee about the trim of the QG. He said that the trim of the QG is done to 31.25kHz and is accurate to 0.2%!
 
 
 

jbotte
Contributor I
Between AN2496 and your statement, it looks like it really may be accurate to the trim resolution of 0.2%. I'll go with that and sample parts as I go (taking your practical approach as a sanity check) to make sure that the answer is accurate. To that end, I will be setting the target frequency to 19,569,471Hz. This allows for the worst-case +2% variability over temperature and voltage and the worst-case +0.2% trim resolution error, to ensure that the DCO cannot exceed 20MHz (and therefore that the bus speed cannot exceed 10MHz, the published maximum bus speed). That should provide the maximum "performance" possible from the chip without exceeding specifications. Thanks for your help!
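For reference, that target number falls out of a simple linear worst-case budget; a standalone sketch of the calculation:

#include <stdio.h>

/* Highest DCO target such that +2% temperature/voltage drift plus
 * +0.2% trim resolution error (added linearly, worst case) can never
 * push the DCO past the 20MHz that corresponds to the 10MHz f_BUSmax. */
int main(void)
{
    double f_max    = 20e6;
    double drift    = 0.020;   /* +/-2% over temperature and voltage */
    double trim_err = 0.002;   /* +/-0.2% trim resolution            */

    printf("target DCO = %.0f Hz\n", f_max / (1.0 + drift + trim_err));
    return 0;   /* prints 19569472; truncated to 19,569,471 above */
}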
 

peg
Senior Contributor IV
Hi,
 
I do not have any direct experience with the RS08, but a lot with the GB/GT and QG. It is my experience that the ability of the USBBDM and the P&E software to accurately produce the correct trim value lies somewhere between the lowest trim bit and the FTRIM. A quite large proportion of devices will give one of two values for the LSB of the trim "every second time". Also, having trimmed a device and then let it run for an hour will, about 90% of the time, introduce enough heat to increase the LSB trim by one. This makes the FTRIM all but useless except for carefully hand-trimmed devices running under very strict operating conditions.
 
Having said that, I have also noticed that it seems to be reasonably stable over time, in that connecting to an "already running for a year in the field" device will often show a calculated trim value within one of (or, most of the time, spot on) the one that was programmed in at manufacture (allowing for room versus running temperature).
 
Also, the trims I calculate are always significantly different (by say 10) from the factory-programmed value. One of the reasons for this is that I run them at 3.3V and they are calibrated (I believe) at 3V. But there seems to be something else as well.
 
Hope this helps; it is not very scientific, just opinions formulated over a long time from "real world" observations.
 


Message Edited by peg on 2007-06-21 12:26 PM

Curt
Contributor IV
This is a good question, and I'd like to know the answer also.  Perhaps if you ask again in the hardware forum, someone there will offer some insight.
 
Ultimately, I expect that there are so many variables involved that we just need to cut and try.

jbotte
Contributor I
I was oscillating back and forth as to whether to post it in the hardware forum as well, but I figured it was more a function of the CodeWarrior toolset, the development board, and/or the communications link. If I don't hear anything here in a day or two, I will post it to the hardware side of things (and post back here if I get a response).
