Hi,
I'm using an LPC1125 with two ADC sequences: sequence A with some channels in single-shot mode, and sequence B with some channels in burst mode. Both SEQx_INT interrupts are enabled and EOS (end of sequence) is set. The sequences are used exclusively (only one is ever active at a time) - so far so good.
Unfortunately, under some circumstances none of sequence B's interrupt flags get set after switching from sequence A to B.
What I do during init:
Chip_ADC_Init(...)
Chip_ADC_SetClockRate(...)
Setting pins for ADC usage...
Chip_ADC_DisableSequencer(LPC_ADC, ADC_SEQA_IDX);
Chip_ADC_DisableSequencer(LPC_ADC, ADC_SEQB_IDX);
Chip_ADC_SetupSequencer(LPC_ADC, ADC_SEQA_IDX, .... | MODE_EOS);
Chip_ADC_SetupSequencer(LPC_ADC, ADC_SEQB_IDX, .... | BURST | MODE_EOS);
Chip_ADC_EnableInt(LPC_ADC, ADC_INTEN_SEQA_ENABLE | ADC_INTEN_SEQB_ENABLE);
Chip_ADC_EnableSequencer(LPC_ADC, ADC_SEQA_IDX);
Chip_ADC_EnableSequencer(LPC_ADC, ADC_SEQB_IDX);
// Note: I only use the interrupt flags for polling whether a sequence has finished - the ISR is never expected to run, and the ADC interrupt is not enabled in the NVIC.
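For completeness, the polling is conceptually like this (just a simplified sketch, not my exact code; it assumes the LPCOpen Chip_ADC_GetFlags() accessor and the ADC_FLAGS_SEQA/SEQB_INT_MASK flag names, bool from <stdbool.h>):

/* Poll the ADC flags register until the requested sequence-complete flag
   (e.g. ADC_FLAGS_SEQB_INT_MASK) is set, or give up after 'timeout' reads. */
static bool adc_wait_seq_done(uint32_t seq_int_mask, uint32_t timeout)
{
    while (timeout--) {
        if (Chip_ADC_GetFlags(LPC_ADC) & seq_int_mask) {
            return true;   /* EOS interrupt flag of the sequence is set */
        }
    }
    return false;          /* flag never showed up - this is the failure case I see */
}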
What I do when switching from sequence A to B:
Chip_ADC_ClearFlags(LPC_ADC, 0xffffffff);
Chip_ADC_StartBurstSequencer(LPC_ADC, ADC_SEQB_IDX);
Done like this, it does NOT work (no sequence interrupt flags are ever set in burst mode) - unless I step through the switch in the debugger, in which case the interrupt flags do get set in burst mode.
If I do this:
Chip_ADC_ClearFlags(LPC_ADC, 0xffffffff);
volatile uint32_t j = 0;  /* volatile so the delay loop is not optimized away */
for (int i = 0; i < 1000; i++) { j++; j = LPC_ADC->Flags; }  /* crude delay with dummy reads of the flags register */
Chip_ADC_StartBurstSequencer(LPC_ADC, ADC_SEQB_IDX);
it works. The same problem occurs when I disable/re-enable the sequencer in between. Are there any undocumented reads/writes, or a required order for setting bits in the control registers?
I couldn't find anything about this in the errata...
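In case it helps to narrow this down: my suspicion (just an assumption on my part, not a verified fix) is that the write clearing the flags has not actually reached the ADC when the burst is started, so something like the following read-back plus barrier might be the cleaner equivalent of my delay loop (Chip_ADC_GetFlags() is the LPCOpen read accessor, __DSB() is the CMSIS barrier):

Chip_ADC_ClearFlags(LPC_ADC, 0xffffffff);
(void) Chip_ADC_GetFlags(LPC_ADC);   /* dummy read-back so the flag clear really reaches the ADC */
__DSB();                             /* CMSIS data synchronization barrier, in case the write is still buffered */
Chip_ADC_StartBurstSequencer(LPC_ADC, ADC_SEQB_IDX);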
Thx, Thomas