I'm using Processor Expert on an MP16 to develop software that communicates over I2C and does some A-to-D conversion with the ADC. My I2C code was working just fine, but when I mixed in the code to control the ADC, my I2C driver stopped working: its interrupt never fires.
With everything else identical, removing the call to a2d_Measure(0); so the ADC bean never starts sampling lets the IIC work fine. Disabling the ADC interrupt (which causes the PE-generated sampling code to block) also lets the IIC run, though that is an unpalatable solution as it consumes an appreciable amount of time.
It doesn't seem to be hanging in the ADC interrupt, since the ADC code still runs and exits; the IIC only breaks whenever ADCSC1 is written to start a conversion with its interrupt enabled.
I already had one problem with the I2C not working at all because PE set the GPIO pins to "high drive strength", which seemed to override the open-collector configuration the I2C module requires. Is there some other "gotcha" I'm missing now? Or what else should I be exploring as the root cause of this?
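(For reference, that earlier fix boiled down to clearing the drive-strength bits on the IIC pins. A minimal sketch is below; the port register PTBDS, the pin positions, and the helper name are my assumptions, so use whatever port/pins your SDA and SCL are actually routed to.)

#include "derivative.h"              /* device header; adjust for your part */

#define I2C_SDA_MASK  (1u << 6)      /* assumed SDA pin */
#define I2C_SCL_MASK  (1u << 7)      /* assumed SCL pin */

static void I2C_FixPinDrive(void)
{
    /* Clear "high drive strength" on the IIC pins so the module's
       open-drain behaviour isn't overridden. PTBDS is the Port B
       drive-strength register; check your derivative header. */
    PTBDS &= (unsigned char)~(I2C_SDA_MASK | I2C_SCL_MASK);
}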
I was unable to get the PE ADC bean to play nice, despite being able to follow its logic with little difficulty. When I implemented the same thing in my own code, it works just fine alongside the IIC. So much for saving time with PE beans.
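Stripped down, my replacement amounts to something like the sketch below; the names, the shared variables, and the ADCSC1 bit positions are mine (taken from the reference manual as I read it), not the bean's code.

#include "derivative.h"              /* device header; adjust for your part */

#define ADC_AIEN     0x40u           /* conversion-complete interrupt enable in ADCSC1 */
#define ADC_CH_MASK  0x1Fu           /* channel select field in ADCSC1 */

volatile unsigned int  g_adcResult;  /* last result, written by the ISR */
volatile unsigned char g_adcDone;    /* set by the ISR when a result is ready */

/* Start one conversion on the given channel with the interrupt enabled.
   Any write to ADCSC1 aborts a conversion in progress and starts a new one. */
static void A2D_Start(unsigned char channel)
{
    g_adcDone = 0;
    ADCSC1 = (unsigned char)(ADC_AIEN | (channel & ADC_CH_MASK));
}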
The only unfamiliar bit I could find in the generated bean code was in the ADC ISR:
IPCSC_PULIPM = 1; /* Restore Interrupt Priority Mask */
Commenting it out seemed to have no effect, and I couldn't find any related priority-mask code in the rest of the ISR (or anywhere else). The ISR I wrote is totally naive about interrupt priorities.
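For comparison, the body of the ISR I wrote is essentially just this (it's hooked to the ADC vector with the usual CodeWarrior interrupt declaration, which I've left out here, and it never touches the priority controller):

/* Conversion-complete ISR: grab the result and flag it; the interrupt
   priority mask is left completely alone. */
void A2D_OnComplete(void)            /* declared interrupt in the real project */
{
    g_adcResult = (unsigned int)(((unsigned int)ADCRH << 8) | ADCRL);
    g_adcDone = 1;                   /* reading ADCRL clears COCO */
}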
Hello,
I have not been able to reproduce this behaviour, so could you please post the Processor Expert project that leads to the error?
Best regards,
Vojtech Filip
Processor Expert Support Team