LPC54102 ADC/DMA interrupt configuration

chrisoneill
Contributor I

I'm trying to determine the correct configuration for using ADC-triggered DMA.  I have it working, but there is some unexpected behavior I would like to understand.

 

My use case is fairly simple:

- using Sequence B

- single ADC channel 4

- single DMA channel 0

- a list of 10 Transfer_Descriptors of 16-bit x 1024 samples

- set ADC BURST bit to start, clear BURST bit on the 10th DMA IRQ.
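
Roughly, the start/stop handling from the last bullet looks like this.  This is just a sketch: NUM_TDS and td_done are illustrative names, and I'm assuming Chip_ADC_StopBurstSequencer and Chip_DMA_ClearActiveIntBChannel are the LPCOpen helpers for clearing the BURST bit and the channel's INTB flag (check adc_5410x.h / dma_5410x.h, or touch the registers directly):

#define NUM_TDS  10                        /* 10 linked transfer descriptors          */
static volatile uint32_t td_done;          /* descriptors completed so far            */

void start_capture(void)                   /* illustrative wrapper                    */
{
    td_done = 0;
    Chip_ADC_StartBurstSequencer(LPC_ADC, ADC_SEQB_IDX);   /* sets the BURST bit      */
}

void DMA_IRQHandler(void)
{
    /* each descriptor sets INTB (DMA_XFERCFG_SETINTB), so clear that flag (assumed helper) */
    Chip_DMA_ClearActiveIntBChannel(LPC_DMA, DMA_CH0);

    if (++td_done >= NUM_TDS) {
        /* all 10 x 1024 samples captured: clear the BURST bit (assumed helper) */
        Chip_ADC_StopBurstSequencer(LPC_ADC, ADC_SEQB_IDX);
    }
}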

 

For it to work, interrupt-related config:

 

- Sequencer config:

Chip_ADC_SetupSequencer(LPC_ADC, ADC_SEQB_IDX, (ADC_SEQ_CTRL_CHANSEL(4) | ADC_SEQ_CTRL_MODE_EOS));

 

- ADC interrupt config:

Chip_ADC_EnableInt(LPC_ADC, ( ADC_INTEN_SEQB_ENABLE | ADC_INTEN_OVRRUN_ENABLE));

NVIC_EnableIRQ(ADC_SEQB_IRQn);

 

- Transfer_Descriptor 'xfercfg':

DMA_XFERCFG_SETINTB
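
For each 1024-sample, 16-bit descriptor, the full xfercfg word would be built up along these lines.  Only DMA_XFERCFG_SETINTB above is taken from my working config; the other DMA_XFERCFG_* names are the usual LPCOpen macros and should be checked against dma_5410x.h:

uint32_t xfercfg =
    DMA_XFERCFG_CFGVALID   |               /* descriptor is valid                            */
    DMA_XFERCFG_RELOAD     |               /* chain to the next linked descriptor            */
    DMA_XFERCFG_SETINTB    |               /* raise INTB when this descriptor completes      */
    DMA_XFERCFG_WIDTH_16   |               /* 16-bit ADC results                             */
    DMA_XFERCFG_SRCINC_0   |               /* source is the ADC data register, no increment  */
    DMA_XFERCFG_DSTINC_1   |               /* destination buffer advances by one width       */
    DMA_XFERCFG_XFERCOUNT(1024);           /* 1024 transfers per descriptor                  */

The last descriptor in the chain presumably drops DMA_XFERCFG_RELOAD so the DMA stops after the 10th buffer.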

 

- DMA interrupt config:

Chip_DMA_EnableIntChannel(LPC_DMA, DMA_CH0);

LPC_INMUX->DMA_ITRIG_INMUX[0] = DMATRIG_ADC0_SEQB_IRQ;

NVIC_EnableIRQ(DMA_IRQn);

 

So the questions are:

 

(1) The User Guide (sec. 25.7.8) says: "If DMA is used for a sequence, the corresponding sequence interrupt must be disabled in the INTEN register".  If I don't enable the sequence interrupt, the DMA interrupt never happens.  Why?

 

(2) Is the Sequence B interrupt handler (ADC_SEQB_IRQHandler()) not supposed to be called?  For the 10k samples collected, the Sequence B handler is called a few thousand times; not once per sample, but a lot.  I'm only getting about half the expected sample rate, approx. 2.5 MSPS.  Is this slowing down the sampling?

 

(3) If I use the conversion interrupt instead of the end-of-sequence interrupt (MODE = 1 = ADC_SEQ_CTRL_MODE_EOS), the DMA interrupt never happens.  Why?  Is it not supported with DMA?

Original Attachment has been moved to: adc_dma.c.zip

chrisoneill
Contributor I

The BURST bit is used as the ADC trigger.

From the previously attached source file:

  Chip_ADC_StartBurstSequencer(LPC_ADC, ADC_SEQB_IDX);

where the channel converted is:

Chip_ADC_SetupSequencer(LPC_ADC, ADC_SEQB_IDX, (ADC_SEQ_CTRL_CHANSEL(AC_OUT_CHAN) | ADC_SEQ_CTRL_MODE_EOS));

where AC_OUT_CHAN = 4

jeremyzhou
NXP Employee

Hi Chris ONeill,

I've replicated the issue, and I'll contact the AE team about it later.
Have a great day,
TIC

chrisoneill
Contributor I

Trigger source, as in the attached file:

// ADC Sequence B interrupt is selected for a DMA trigger
LPC_INMUX->DMA_ITRIG_INMUX[0] = DMATRIG_ADC0_SEQB_IRQ;

jeremyzhou
NXP Employee

Hi Chris ONeill,

You might have missed my reply; I'd like to know the trigger source of the ADC, not the DMA.

Have a great day,
TIC

chrisoneill
Contributor I

Thanks Jeremy.  At this point I've worked past the operational issues.  The remaining questions are:

(1) The User Guide (sec. 25.7.8) says: "If DMA is used for a sequence, the corresponding sequence interrupt must be disabled in the INTEN register".  If I don't enable the sequence interrupt, the DMA interrupt never happens.  Is this just a User Guide errata?

 

(2) If I use the conversion interrupt instead of the end-of-sequence interrupt, the DMA interrupt never happens.  Should it, or is it not supported with DMA?  For a single-channel burst conversion of 10k samples, is the end-of-sequence interrupt exactly the same as the conversion interrupt in terms of performance, accuracy, etc.?

(3) I'm collecting 10k samples using DMA, so it seems synchronous clocking is best to use.  I don't want to give up 20% of performance by slowing the system clock to get the max 80 MHz ADC clock, so I should settle for a 50 MHz ADC clock = 2.5 MHz sample rate as a compromise.  The ADC clock is an integer divide of the system clock, yes?  So there's no way to derive an 80 MHz ADC clock from a 100 MHz system clock?

jeremyzhou
NXP Employee

Hi Chris ONeill,

Thanks for your reply.

(1) The User Guide (sec. 25.7.8) says: "If DMA is used for a sequence, the corresponding sequence interrupt must be disabled in the INTEN register".  If I don't enable the sequence interrupt, the DMA interrupt never happens.  Is this just a User Guide errata?

-- After confirming, the word “disabled” in that remark is a typo and should be changed to “enabled”.

(2) If I use the conversion interrupt instead of the end-of-sequence interrupt the DMA interrupt never happens.  Should it, or is it not supported with DMA?  For a single channel burst conversion of 10k samples is the end-of-sequence interrupt exactly the same as the conversion interrupt in terms of performance, accuracy, etc ? 

-- I'd like to confirm the trigger source of the ADC you chose in your testing; would you mind describing the test in more detail?

(3) I'm collecting 10k samples using DMA, so it seems synchronous clocking is the best to use.  I don't want to give up 20% of performance by slowing the system clock to get the max 80 MHz ADC clock.  So I should settle for a 50 MHz ADC clock = 2.5 MHz sample rate as a compromise.  

The ADC clock is an integer divide of the system clock, yes?  

-- Yes, it is.

So there's no way to derive an 80 MHz ADC clock from a 100 MHz system clock?

-- No, you can't derive an 80 MHz ADC clock from a 100 MHz system clock.


Have a great day,
TIC

chrisoneill
Contributor I

Changing the sample rate in Chip_ADC_SetClockRate(LPC_ADC, ADC_MAX_SAMPLE_RATE) did change what I'm seeing.

I do have to enable the Sequence B interrupt for the DMA to trigger, but if I don't call NVIC_EnableIRQ(ADC_SEQB_IRQn) then I don't get the Sequence B interrupt handler called, which is good.

I'm using a system clock of 100 MHz.  I don't see the SetClockRate() call above setting the sample rate directly; it sets the divisor.  100 MHz / 48 MHz = divisor = 2.  Wouldn't that really be an ADC clock rate of 50 MHz?

The User Guide says the max ADC clock rate is 80 MHz.  Can I use the 100 MHz system clock, at 20 clocks per sample, to give me the 5 MHz sample rate?
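
Spelling out the arithmetic as I understand it, so it's clear where the 2.5 MSPS I'm seeing would come from:

/* 100 MHz system clock, requesting 48 MHz:                 */
/*   divisor     = 100 MHz / 48 MHz = 2 (integer division)  */
/*   ADC clock   = 100 MHz / 2      = 50 MHz                */
/*   sample rate = 50 MHz / 20 clocks per conversion        */
/*               = 2.5 Msps                                 */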

jeremyzhou
NXP Employee

Hi Chris ONeill,

I've escalated the question, so could you provide a buildable project so the AE team can reproduce this case?
Have a great day,
TIC

jeremyzhou
NXP Employee

Hi Chris ONeill,

Thanks for your reply.

Can I use the 100 MHz system clock, at 20 clocks per sample, to give me the 5 MHz sample rate?

1) I'd highly recommend thinking about another approach instead, for instance getting an 80 MHz system clock for the ADC module.
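
Following the 20-clocks-per-sample figure you quoted, an 80 MHz ADC clock would work out to roughly:

/* 80 MHz / 20 clocks per conversion = 4 Msps */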
Have a great day,
TIC

chrisoneill
Contributor I

I found that I can stop the sequence B interrupt handler from being called by removing the call:

NVIC_EnableIRQ(ADC_SEQB_IRQn);

I'm guessing that it might be useful for single conversions, but not in burst mode where the IRQ can't keep up with the converter.

But it hasn't increased the conversion speed.  I'm still getting about 2.5 MSPS.
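
For anyone following along, the interrupt-related configuration I've ended up with is simply this (all calls as posted earlier in the thread):

/* keep the sequence B interrupt enabled inside the ADC: it drives the DMA trigger */
Chip_ADC_EnableInt(LPC_ADC, (ADC_INTEN_SEQB_ENABLE | ADC_INTEN_OVRRUN_ENABLE));
/* NVIC_EnableIRQ(ADC_SEQB_IRQn);   <- intentionally not called, so the handler never runs */

/* DMA side: channel interrupt enabled, triggered by the ADC sequence B interrupt */
Chip_DMA_EnableIntChannel(LPC_DMA, DMA_CH0);
LPC_INMUX->DMA_ITRIG_INMUX[0] = DMATRIG_ADC0_SEQB_IRQ;
NVIC_EnableIRQ(DMA_IRQn);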

jeremyzhou
NXP Employee

Hi Chris ONeill,

After going over the code, I find that the ADC sample clock is 48 MHz, the resolution is 12 bits, and a conversion takes 20 ADC clocks at 12-bit resolution, so it's easy to get the sampling frequency: 48 MHz / 20 = 2.4 Msps.

To get the maximum sampling frequency, you need to reconfigure the parameters above.

Have a great day,
TIC

 

chrisoneill
Contributor I

I attached the file that configures and executes the ADC/DMA conversion.

-  sequence A is used for a 2-channel one-shot conversion, no issues there

- sequence B is used for a 1-channel burst mode conversion, total 10k samples, 1k x 10 Transfer_Descriptor buffers

 

I want to know the correct use of the ADC interrupt to drive this.  My impression is that the sequence B interrupt handler shouldn't be getting called: it gets called something like 4k times for a 10k-sample capture, it presumably slows things down (does it?), and I'm not using it to do any kind of processing, so it doesn't seem to be doing anything useful.

 

The actual 10k-sample capture looks fine, except it's sampling at about 2.5 MSPS rather than 5 MSPS, so I'm assuming something in my setup is incorrect.

jeremyzhou
NXP Employee

Hi Chris ONeill,

Thank you for your interest in NXP Semiconductor products and for the opportunity to serve you.

(1) The User Guide (sec. 25.7.8) says: "If DMA is used for a sequence, the corresponding sequence interrupt must be disabled in the INTEN register".  If I don't enable the sequence interrupt, the DMA interrupt never happens.  Why?

--- I'm also confused by that statement; I'll check with the AE team.

(2) Is the Sequence B interrupt handler (ADC_SEQB_IRQHandler()) not supposed to be called?  For the 10k sample that's collected, the Sequence B handler is called a few thousand times.  Not once for every sample, but a lot.  I'm only getting about half the sample rate expected, approx. 2.5 MSPS, is this slowing down the sampling?

-- I was wondering if you could share more information, as I'm not clear on this issue.

(3) If I use the conversion interrupt instead of the end-of-sequence interrupt (MODE=1=ADC_SEQ_CTRL_MODE_EOS), the DMA interrupt never happens.  Why?  Not supported with DMA?

-- Could you upload the sample code?  I'd like to replicate the issue on my side.

I'm looking forward to your reply.
Have a great day,
TIC
