I am using Chip_SPIM_XferHandler(), similar to the periph_spi_sm_int project, for a read function. My example clocks out 6 bytes, then clocks in 2 bytes (8 bytes total). This works fine at 1 MHz. When I increase the clock rate to 4 MHz, after the correct number of bytes (8) has been clocked, I see SSEL de-assert, re-assert, one more byte clocked, then a final de-assert.
Could this be an interaction between SPIMASTERIRQHANDLER(void) and the handling of the tx and rx counts in Chip_SPIM_XferHandler()?
Has anyone seen an issue like this?
I enable the same interrupts and options as the demo code:
Chip_SPI_EnableInts(LPC_SPIMASTERPORT,
                    (SPI_INTENSET_RXDYEN |   /* Receive data available */
                     SPI_INTENSET_RXOVEN |   /* Receiver overrun */
                     SPI_INTENSET_TXUREN |   /* Transmitter underrun */
                     SPI_INTENSET_SSAEN  |   /* Slave select asserted */
                     SPI_INTENSET_SSDEN));   /* Slave select de-asserted */
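For reference, my interrupt handler is wired up the same way as the demo: it simply forwards to the driver's transfer handler, which advances the tx/rx counts. This is a sketch; SPIMASTERIRQHANDLER and LPC_SPIMASTERPORT are the project's macro aliases for the actual IRQ handler name and SPI instance, and spiMasterXfer is the transfer descriptor shown below.

```c
/* Sketch of the master IRQ handler, as in the periph_spi_sm_int demo.
   SPIMASTERIRQHANDLER / LPC_SPIMASTERPORT are assumed macros mapping to
   the board's SPI IRQ handler name and SPI peripheral base. */
void SPIMASTERIRQHANDLER(void)
{
    /* Let the LPCOpen handler service the FIFO and advance the
       tx/rx counts for the in-progress transfer */
    Chip_SPIM_XferHandler(LPC_SPIMASTERPORT, &spiMasterXfer);
}
```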
/* Setup master transfer options: 8 data bits per transfer, EOT, EOF */
spiMasterXfer.options =
    SPIM_XFER_OPTION_SIZE(8) |  /* This must be enabled as a minimum, use 8 data bits */
    SPIM_XFER_OPTION_EOT     |  /* Enable this to assert and de-assert SSEL for each individual byte/word */
    SPIM_XFER_OPTION_EOF;       /* Insert a delay between bytes/words as defined by the frame delay time */
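For completeness, here is a hedged sketch of how the rest of the transfer descriptor is filled in for the 6-byte-out / 2-byte-in case. The field names (pTXData8, txCount, pRXData8, rxCount, sselNum) and the assumption that the driver clocks max(txCount, rxCount) bytes in total are from my reading of the LPCOpen SPIM_XFER_T structure; check them against your spim_*.h header.

```c
/* Hedged sketch: descriptor setup for a 6-byte command + 2-byte read.
   Field names are assumptions — verify against the LPCOpen SPIM_XFER_T
   definition in your driver headers. */
static uint8_t txBuf[6];   /* 6 command bytes to clock out */
static uint8_t rxBuf[8];   /* capture all 8 clocked bytes */

spiMasterXfer.pTXData8 = txBuf;  /* assumed field: transmit buffer */
spiMasterXfer.txCount  = 6;      /* clock out 6 command bytes */
spiMasterXfer.pRXData8 = rxBuf;  /* assumed field: receive buffer */
spiMasterXfer.rxCount  = 8;      /* 8 bytes total; last 2 are the reply */
spiMasterXfer.sselNum  = 0;      /* assumed field: SSEL line to assert */
```

If the driver does clock max(txCount, rxCount) bytes as assumed, 6 command bytes go out, dummy data fills the remaining 2 clocks, and the last 2 entries of rxBuf hold the slave's reply.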