I'm trying to determine the correct configuration for using ADC-triggered DMA. I have it working, but there is some unexpected behavior I would like to understand.
My use case is fairly simple (a sketch of the descriptor chain follows this list):
- using Sequence B
- single ADC channel 4
- single DMA channel 0
- a list of 10 Transfer_Descriptors of 16-bit x 1024 samples
- set ADC BURST bit to start, clear BURST bit on the 10th DMA IRQ.
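Roughly, the descriptor chain looks like this. It's a sketch, not the attached code verbatim: the LPCOpen DMA_CHDESC_T / DMA_XFERCFG_* / DMA_ADDR names are assumed from the chip library, adcBuf / dmaDesc / setupDescriptors are placeholder names for this post, and reading the low halfword of DR[4] is an assumption about where the result comes from.

    #define NUM_DESC     10
    #define SAMPLES_PER  1024

    /* 10 blocks of 1024 16-bit samples, one block per descriptor */
    static uint16_t adcBuf[NUM_DESC][SAMPLES_PER];

    /* Linked DMA descriptors must be 16-byte aligned (GCC syntax shown) */
    static DMA_CHDESC_T dmaDesc[NUM_DESC] __attribute__((aligned(16)));

    static void setupDescriptors(void)
    {
        /* 1024 x 16-bit transfers per block, no source increment,
           destination increments by one halfword, INTB at end of block */
        uint32_t xfercfg = DMA_XFERCFG_CFGVALID | DMA_XFERCFG_RELOAD |
                           DMA_XFERCFG_SETINTB  | DMA_XFERCFG_WIDTH_16 |
                           DMA_XFERCFG_SRCINC_0 | DMA_XFERCFG_DSTINC_1 |
                           DMA_XFERCFG_XFERCOUNT(SAMPLES_PER);
        int i;

        for (i = 0; i < NUM_DESC; i++) {
            dmaDesc[i].xfercfg = xfercfg;
            /* source/dest are END addresses on this DMA; the source is
               channel 4's data register (result in bits 15:4) */
            dmaDesc[i].source  = DMA_ADDR(&LPC_ADC->DR[4]);
            dmaDesc[i].dest    = DMA_ADDR(&adcBuf[i][SAMPLES_PER - 1]);
            dmaDesc[i].next    = (i < NUM_DESC - 1) ? DMA_ADDR(&dmaDesc[i + 1]) : 0;
        }
        /* Last block: no reload, so the chain stops after the 10th buffer */
        dmaDesc[NUM_DESC - 1].xfercfg &= ~DMA_XFERCFG_RELOAD;
    }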
To get it working, this is the interrupt-related configuration (a consolidated sketch follows this list):
- Sequencer config:
Chip_ADC_SetupSequencer(LPC_ADC, ADC_SEQB_IDX, (ADC_SEQ_CTRL_CHANSEL(4) | ADC_SEQ_CTRL_MODE_EOS));
- ADC interrupt config:
Chip_ADC_EnableInt(LPC_ADC, ( ADC_INTEN_SEQB_ENABLE | ADC_INTEN_OVRRUN_ENABLE));
NVIC_EnableIRQ(ADC_SEQB_IRQn);
- Transfer_Descriptor 'xfercfg':
DMA_XFERCFG_SETINTB
- DMA interrupt config:
Chip_DMA_EnableIntChannel(LPC_DMA, DMA_CH0);
LPC_INMUX->DMA_ITRIG_INMUX[0] = DMATRIG_ADC0_SEQB_IRQ;
NVIC_EnableIRQ(DMA_IRQn);
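For completeness, here is roughly how those pieces fit together, including the handler that clears BURST after the 10th block. Again a sketch rather than the attached code verbatim: the Chip_DMA_Setup* / Chip_DMA_*ActiveIntB* / Chip_ADC_*BurstSequencer helpers and the DMA_CFG_* trigger macros are LPCOpen-style names I'm assuming, the Chip_DMA_Init() / Chip_DMA_SetSRAMBase() controller setup is left out, and blocksDone is a placeholder.

    static volatile uint32_t blocksDone;

    static void adc_dma_start(void)
    {
        /* Chip_DMA_Init(), Chip_DMA_Enable() and Chip_DMA_SetSRAMBase()
           are assumed to have been called already */

        /* Route the ADC Sequence B trigger to DMA channel 0's hardware
           trigger input */
        LPC_INMUX->DMA_ITRIG_INMUX[0] = DMATRIG_ADC0_SEQB_IRQ;

        /* Channel 0: hardware-triggered, single transfer per trigger */
        Chip_DMA_EnableChannel(LPC_DMA, DMA_CH0);
        Chip_DMA_EnableIntChannel(LPC_DMA, DMA_CH0);
        Chip_DMA_SetupChannelConfig(LPC_DMA, DMA_CH0,
                (DMA_CFG_HWTRIGEN | DMA_CFG_TRIGTYPE_EDGE |
                 DMA_CFG_TRIGPOL_HIGH | DMA_CFG_TRIGBURST_SNGL |
                 DMA_CFG_CHPRIORITY(0)));

        /* First block: copy descriptor 0 into the SRAM table and load its
           transfer configuration */
        setupDescriptors();
        Chip_DMA_SetupTranChannel(LPC_DMA, DMA_CH0, &dmaDesc[0]);
        Chip_DMA_SetupChannelTransfer(LPC_DMA, DMA_CH0, dmaDesc[0].xfercfg);
        Chip_DMA_SetValidChannel(LPC_DMA, DMA_CH0);
        NVIC_EnableIRQ(DMA_IRQn);

        /* Start free-running conversions on Sequence B (sets BURST + ENA) */
        blocksDone = 0;
        Chip_ADC_StartBurstSequencer(LPC_ADC, ADC_SEQB_IDX);
    }

    void DMA_IRQHandler(void)
    {
        /* xfercfg uses SETINTB, so each completed descriptor shows up as
           an INTB flag for channel 0 */
        if (Chip_DMA_GetActiveIntBChannels(LPC_DMA) & (1 << DMA_CH0)) {
            Chip_DMA_ClearActiveIntBChannel(LPC_DMA, DMA_CH0);
            if (++blocksDone >= NUM_DESC) {
                /* 10th block done: clear BURST to stop sampling */
                Chip_ADC_StopBurstSequencer(LPC_ADC, ADC_SEQB_IDX);
            }
        }
    }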
So the questions are:
(1) The User Guide (sec. 25.7.8) says: "If DMA is used for a sequence, the corresponding sequence interrupt must be disabled in the INTEN register". If I don't enable the sequence interrupt (ADC_INTEN_SEQB_ENABLE), the DMA interrupt never fires (variant sketched after these questions). Why?
(2) Is the Sequence B interrupt handler (ADC_SEQB_IRQHandler()) not supposed to be called at all? For the 10k samples collected (10 x 1024), the Sequence B handler is called a few thousand times: not once per sample, but a lot. I'm also only getting about half the expected sample rate (approx. 2.5 MSPS). Is all this interrupt activity slowing down the sampling?
(3) If I use the end-of-conversion interrupt (MODE = 0) instead of the end-of-sequence interrupt (MODE = 1, i.e. ADC_SEQ_CTRL_MODE_EOS), the DMA interrupt never happens (also sketched below). Why? Is that not supported with DMA?
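To make the variants behind questions (1) and (3) concrete, these are the only changes relative to the working configuration above (sketch only; everything else stays as listed):

    /* Question (1): sequence interrupt disabled per UM sec. 25.7.8;
       with this, DMA channel 0 never interrupts */
    Chip_ADC_EnableInt(LPC_ADC, ADC_INTEN_OVRRUN_ENABLE);   /* no ADC_INTEN_SEQB_ENABLE */

    /* Question (3): interrupt/trigger on every conversion (MODE = 0)
       instead of end of sequence; DMA channel 0 never interrupts either */
    Chip_ADC_SetupSequencer(LPC_ADC, ADC_SEQB_IDX, ADC_SEQ_CTRL_CHANSEL(4));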
Original Attachment has been moved to: adc_dma.c.zip