Good evening everyone,
I have recently run into a problem while testing my code on my HC9S12DP256B. I code in C using ImageCraft ICC12 v7. Currently I do all of my testing by downloading code into RAM over a serial connection using DBug-12, with the target jumpered for EVB mode.
First, some background: my C code is designed to abstract the user away from the details of setting registers. I provide a set of functions that people can use to configure any of the modules on-board the target of their choice (currently my code only works on my 'DP256s, but with some additional overhead I'm sure I can get it working on other HC12 variants...). Part of my code lets the user configure the Clocks and Reset Generator (CRG) and select the target operating frequency (using the PLL as the clock source). A follow-on requirement is to supply the new frequency value to other modules through functions, so that modules such as the ECT, SCI, and EEPROM "know" their module clock frequency (also known as the bus clock frequency) and can operate properly.
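For context, the interface idea is something like this simplified sketch (hypothetical names, not my actual code): the CRG setup function records the resulting bus clock, and the other module setup functions read that value instead of assuming a fixed frequency.

```c
/* Simplified sketch of the interface idea (hypothetical names).
   The CRG code stores the bus clock once the PLL is configured;
   the ECT, SCI, and EEPROM setup functions all read it back. */
static unsigned long g_busclk_hz = 8000000UL;  /* reset default: OSCCLK/2 */

/* Called by the CRG code after the new PLL frequency takes effect. */
void crg_set_busclk(unsigned long busclk_hz)
{
    g_busclk_hz = busclk_hz;
}

/* Called by the module setup functions so they "know" the bus clock. */
unsigned long crg_get_busclk(void)
{
    return g_busclk_hz;
}
```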
In my first terminal dump (attached to this post), the CRG is configured to operate at a PLL frequency of 24 MHz (the standard setting once DBug-12 has finished its setup) with an SCI0 baud rate of 9600. In each iteration of tests for different SYNR and REFDIV values, I have printfs that show what these values are, along with the resulting PLLCLK frequency (i.e., the results under "- - - CRG TEST A - Success - - -"). One requirement of my code is that any user-proposed SYNR and REFDIV values must yield a PLLCLK that is no greater than the maximum operating frequency (I _assume_ 50 MHz; is this value correct for this device?), no less than the minimum operating frequency (I have no idea what this is, since it is not explicitly called out in the Advance Information manual, so I assume 16 MHz; is this correct?), and must result in an integer-valued frequency. I then determine what BR value is required in SCI0BR based on the desired baud rate and the new bus clock frequency (i.e., the results immediately under "- - - SCI Observation A - - -"). Each iteration of this test also shows what the actual baud rate will be (truncated to an integer).
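For reference, the arithmetic behind those checks, assuming a 16 MHz OSCCLK and the (assumed) frequency limits mentioned above, is:

```c
#define OSCCLK     16000000UL  /* 16 MHz crystal, as on my board        */
#define PLLCLK_MAX 50000000UL  /* assumed maximum, see question above   */
#define PLLCLK_MIN 16000000UL  /* assumed minimum, see question above   */

/* PLLCLK = 2 * OSCCLK * (SYNR + 1) / (REFDV + 1).
   Returns the frequency in Hz, or 0 if the proposed values do not
   divide evenly or fall outside the accepted range. */
unsigned long pllclk_check(unsigned char synr, unsigned char refdv)
{
    unsigned long num = 2UL * OSCCLK * (synr + 1UL);
    unsigned long pll = num / (refdv + 1UL);

    if (num % (refdv + 1UL) != 0UL)   /* non-integer frequency */
        return 0UL;
    if (pll > PLLCLK_MAX || pll < PLLCLK_MIN)
        return 0UL;
    return pll;
}

/* SCI divider: BR = BusClk / (16 * baud); the actual rate is then
   BusClk / (16 * BR), which truncates to an integer. */
unsigned int sci_br(unsigned long busclk_hz, unsigned long baud)
{
    return (unsigned int)(busclk_hz / (16UL * baud));
}
```

For example, SYNR = 2, REFDIV = 3 gives a 24 MHz PLLCLK (12 MHz bus clock), and the 9600-baud divider for that bus clock comes out to 78, for an actual rate of 9615.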
In the second attachment, I show the results I get when I attempt to change the PLL clock frequency on the fly. Things work properly for the first two iterations, at 24 and 23 MHz (PLLCLK/2 becomes the bus clock, which is also the SCI module clock), but beyond that things go haywire. After doing a memory test, I determined that although my code size is large, there is enough space between the data, text, and stack areas that the stack shouldn't be doing anything strange. I tried setting breakpoints just after modifying the SYNR and REFDIV values; the processor somehow gets "lost" and spits out random values, as shown in the second capture, right after attempting to set the PLL to operate at 44 MHz (a 22 MHz bus clock).
Am I violating a basic tenet of how to properly adjust the system operating frequency? Should I set the CRG to run from the crystal oscillator (16 MHz), then set up the PLL and make sure it has entered lock before selecting it as the clock source? I am currently checking whether the PLL has entered lock, but I don't know if I'm doing this validation step properly as I go from one PLL frequency to another on the fly.
Switching once from the crystal oscillator frequency to a single PLLCLK setting is fine if you only ever change the clock frequency once, but what if I need to go from 23 to 21 MHz, or from 24 to 18 MHz, then from 18 to 15 MHz, and so on?
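For reference, here is roughly the sequence I'm trying to follow on each change, sketched against a hypothetical register struct so it can be exercised off-target (on the 'DP256 the CRG registers sit at 0x0034-0x003A; bit masks are from the CRG block guide):

```c
#define PLLSEL 0x80   /* CLKSEL: select PLLCLK as SYSCLK */
#define PLLON  0x40   /* PLLCTL: enable the PLL          */
#define LOCK   0x08   /* CRGFLG: PLL has acquired lock   */

/* Hypothetical register block; on the real part this maps to the
   CRG at 0x0034 (e.g. #define CRG ((struct crg_regs *)0x0034)). */
struct crg_regs {
    volatile unsigned char synr;    /* 0x0034 */
    volatile unsigned char refdv;   /* 0x0035 */
    volatile unsigned char ctflg;   /* 0x0036 (unused here) */
    volatile unsigned char crgflg;  /* 0x0037 */
    volatile unsigned char crgint;  /* 0x0038 (unused here) */
    volatile unsigned char clksel;  /* 0x0039 */
    volatile unsigned char pllctl;  /* 0x003A */
};

/* Retune the PLL: fall back to OSCCLK first, program the new
   divider values, wait for lock, and only then reselect the PLL. */
void crg_set_pll(struct crg_regs *crg, unsigned char synr, unsigned char refdv)
{
    crg->clksel &= (unsigned char)~PLLSEL; /* 1. run from OSCCLK while retuning */
    crg->synr   = synr;                    /* 2. program multiplier/divider     */
    crg->refdv  = refdv;
    crg->pllctl |= PLLON;                  /* 3. make sure the PLL is running   */
    while (!(crg->crgflg & LOCK))          /* 4. spin until the PLL locks       */
        ;
    crg->clksel |= PLLSEL;                 /* 5. now make PLLCLK the SYSCLK     */
}
```

The idea is that this same function could be called for every change (24 to 18 MHz, 18 to 15 MHz, and so on), with the SCI divider recomputed from the new bus clock after each call; I'm just not sure this is the right sequence.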
Please let me know if anyone has any recommendations/ideas, thanks!!
Attachments: 1. TERM_DUMP_9NOV06 1 of 2.TXT
2. TERM_DUMP_9NOV06 2 of 2.TXT