I'm working on code for an MC9S08QG8 which will take an int from the capture module, scale it, and send it over I2C to a DAC. As a general note, I'm currently working with optimization disabled.
I've gotten the I2C function to communicate with the DAC; however, when I add a function to initialize my "scale" struct, which computes and holds some constants for the scaling operation, the code runs off into the weeds. Even though the function (below) seems simple enough, merely commenting it out is the difference between the code working and not working.
By "not working" I mean that the processor seems to start immediately at its I2C instructions rather than go through its initialization functions, including the PE init function.
I'm rather new to this, and I would *greatly* appreciate some help on where to start debugging this and what to look for. When the debugger doesn't start at the beginning of the program, I'm not really sure where to begin.
//The function that throws everything off when not commented out is below
typedef struct{
    int Num, Den, Sin_low, Sout_low;
    char status_Error; //!<Allows higher level code to tell if something happened, or if an error occurred
    byte Error;        //!<stores the actual error
} scaledOut;
byte scale_init(scaledOut * scale_struct, const int out_low, const int out_hi, const int in_low, const int in_hi){
    int tempNum = 0;
    int tempDen = 0;

    scale_struct->Num = 0;
    scale_struct->Den = 0;
    scale_struct->Sin_low = in_low;
    scale_struct->Sout_low = out_low;
    scale_struct->status_Error = 0;
    scale_struct->Error = 0;
    //Go from ranges to a form of y = mx + b. The basic formula is
    //
    //        (out_hi - out_low)
    //    y = ------------------ * (x - in_low) + out_low
    //        (in_hi  - in_low)
    //
    //We will hold in_low and out_low, as well as pre-compute the numerator and denominator, so
    //that the processor only has to run
    //
    //    (x - in_low)*(Num)/(Den) + out_low
    //
    //(see the usage sketch after this function)
    // Compute Num and Den
    tempNum = out_hi - out_low;
    tempDen = in_hi - in_low;

    scale_struct->Num = tempNum;
    scale_struct->Den = tempDen;
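    //(Not in the original code: a sketch of the divide-by-zero check the comment below
    // alludes to. ERR_RANGE is assumed to be one of the Processor Expert error codes
    // available in this project.)
    if(tempDen == 0){
        scale_struct->status_Error = 1;
        scale_struct->Error = ERR_RANGE;
        return ERR_RANGE;
    }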
    return ERR_OK; //Good place to eliminate divide by zero, etc.
}
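For reference, here is a rough sketch (not part of the original code) of how the precomputed Num and Den would be applied to a capture value. Note that on the S08 an int is 16 bits, so for the ranges used below the (x - in_low)*Num product can exceed 16 bits; promoting the intermediate to a 32-bit long is one way to avoid the overflow.
//Rough usage sketch, not from the original post
int scale_apply(const scaledOut * scale_struct, const int x){
    long temp; //32-bit intermediate so (x - Sin_low)*Num cannot overflow a 16-bit int
    temp = (long)(x - scale_struct->Sin_low) * scale_struct->Num;
    temp = temp / scale_struct->Den + scale_struct->Sout_low;
    return (int)temp;
}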
The scale_init() function is called from the following main() function:
void main(void)
{
    /*Local variable definition*/
    //!< Period Capture vars
    Capture PeriodCap;
    scaledOut scale_struct;
    byte scale_err = 0;

    //!<I2C Vars
    //Omitted

    //!<Vars for main, mostly for scaling voltage
    int time = 0;              //!<place-holder for our time capture
    int oldTime = 0;           //!<holds the last time capture so we don't bother sending it again
    int out = 0;               //!<scaled for output
    const int out_low = 512;   //!<Lowest value of our output (12 bit DAC)
    const int out_hi = 3686;   //!<Highest output value
    const int in_low = 6683;   //!<smallest expected input
    const int in_hi = 20017;   //!<Largest expected input
    char loop = 0;
    byte ErrorHold = 0;        //!<Holds an error for debugging
    word size = 0;
    int i;                     //!<var for delay

    flags = 1;
    /*** Processor Expert internal initialization. DON'T REMOVE THIS CODE!!! ***/
    PE_low_level_init();
    /*** End of Processor Expert internal initialization. ***/

    /* Write your code here */
    //!<Initialize peripherals and their corresponding structs
    I2C_init(&DAC, DAC_addr);
    Capture_init(&PeriodCap);

    //!< Get ready to scale some stuff
    scale_err = scale_init(&scale_struct, out_low, out_hi, in_low, in_hi);

    //!<Loop forever
    for(;;) {
        //Loop output for testing;
        out = out + 1;
        size = sizeof(out); //should always be sending two bytes

        //////////////////////////////////////////////////////////////IIC Send (Direct)
        //Bring input into an array so we have some precise bounds
        //Omitted

        /* Set slave address */
        (void)I2C1_SelectSlave(DAC_addr);

        /*! Send bytes in master mode
         * "Send Block" takes a "const void *", which is then cast into OutPtrM which is a "const byte *"
         * The data is finally read into the (8 bit) I2C I/O register, IICD, via
         *
         *     IICD = *(OutPtrM)++;
         *
         * Which will begin transmitting MSB first
         */
        error_I2CSend = I2C1_SendBlock(&Buff_Out, size, &ret);

        //Delay
        for(i = 0; i < 200; i++)
        {}
    }
    /*** Don't write any code pass this line, or it will be deleted during code generation. ***/
    /*** Processor Expert end of main routine. DON'T MODIFY THIS CODE!!! ***/
    for(;;){}
    /*** Processor Expert end of main routine. DON'T WRITE CODE BELOW!!! ***/
} /*** End of main routine. DO NOT MODIFY THIS TEXT!!! ***/
Solved! Go to Solution.
Found out my issue(s) on this -
For resolved breakpoints being ignored, I had to remove all breakpoints, right-click on the project, and hit "Clean Project", which essentially rebuilds from scratch.
For unresolved breakpoints, I simply had to reduce the number of breakpoints.
As far as the code running off into the weeds goes, this turned out to be a stack issue. I noticed that just before the function call it was pushing data onto the stack, which was running over my I2C variables and tricking the subroutine into thinking the bus was not ready. Increasing the stack size let the code run as designed.
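In case it helps anyone else: in a CodeWarrior HCS08 project the stack allotment ultimately ends up in the project's .prm linker file as a STACKSIZE directive; with Processor Expert the value is normally changed through the CPU component's stack-size property rather than by hand-editing the generated file. The fragment below is only a sketch - the directive placement and values are illustrative, not taken from the original project.
/* Excerpt of a generated project .prm - the stack reservation looks roughly like this */
STACKSIZE 0x0100   /* increased from a smaller default such as 0x0050 */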
Looking further into this, it looks more like my issue isn't necessarily the processor starting at a random place, but rather (resolved!) breakpoints being ignored.
The small section of code shown above is causing the uC to just loop and not respond. However, the processor won't even hit those breakpoints (even though they show as resolved), so I can't step in and see what's going on. Is this a known issue?
Does anyone have any info on this? Or should I re-post in the CodeWarrior section?