I'm using a MC9S12A32 as a master to talk to a MAX186 Analog-to-Digital Converter over the SPI bus.
The MAX186 is a 12-bit A/D that clocks out 16 bits per conversion, with the last four bits padded with zeros. In binary, a reading looks like bbbbbbbb bbbb0000, where "b" is a valid bit and the MSB is on the left. (http://datasheets.maxim-ic.com/en/ds/MAX186-MAX188.pdf)
However, as I read conversions from the A/D, I randomly see readings where the most-significant bit of the least-significant nybble is set, which shouldn't happen. For example, I'll see "good" readings of 0x0110, but then a few "bad" readings of 0x0118. That looks as if the bits arrived one clock cycle late, yet the voltage readings are otherwise accurate.
This design has worked in the past with this A/D. I'm porting it from an HC11 MCU to the MC9S12A32. In the original code, CPHA and CPOL were both set to 0.
I currently have SPICR1 = 0x50 (SPI enabled, master, CPHA = 0, CPOL = 0) and SPIBR = 0x03 (OSC/16 = 4 MHz/16 = 250 kHz). I've tried slowing the clock down (I was originally dividing OSC by 2), without luck.
I've also tried CPHA = 1 and CPOL = 1 (SPICR1 = 0x5C), but I saw the same results.
Can anyone offer any advice on what else to try here?
Thanks in advance.