ADCs NOT Triggering from PDB, but from PIT triggers OK


2,287 Views
markwyman
Contributor III

Hi all, 

Background:

1. Processor Expert in KDS, 3.2.0 as the basis of my project

2. Processor is MKV31F512VLL12

3. Have working builds and debugging using a Tower board JTAG interface (P&E debugger)

4. Have struggled mightily with Processor Expert and these trigger sources for running the ADC samples, though I have found most of the problems so far...

 

Problems somewhat resolved:

1. In the ADC_LDD bean, modifying the Trigger options does not generate any code.

2. To solve #1, the SIM bean must be added to the project, and the ADC trigger settings (SIM_SOPT7) must be set manually (see the register-level sketch after this list). At that point the Trigger options become "intelligent", and the SIM_SOPT7 register then gets set properly.

3. The Conversion Time dialog of ADC_LDD presents a list of available timings, some of which (unknown to me when I started) are not available in the clock mode I am setting the bean up for, such as ALT_CLK2. It is very easy to select a timing clock that is not even running, leaving the ADC core unclocked. Make sure you select a clock that is actually available here, and do not leave it on Automatic; otherwise you will spend days blaming the trigger signals, like I did.
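For reference, here is roughly what that manual SIM_SOPT7 setup amounts to at the register level. This is only a sketch: the field macros come from the standard Kinetis device header, and the PIT trigger-select values (0b0100/0b0101 for PIT triggers 0/1) are my reading of the KV31 reference manual, so check them against your own header and manual.

/* Sketch only: route both ADCs to the PIT alternate triggers
 * (ADC0 <- PIT trigger 0, ADC1 <- PIT trigger 1).
 * Trigger-select values should be verified in the KV31 reference manual. */
SIM_SOPT7 = SIM_SOPT7_ADC0ALTTRGEN_MASK | SIM_SOPT7_ADC0TRGSEL(0x4)   /* ADC0: alternate trigger = PIT ch0 */
          | SIM_SOPT7_ADC1ALTTRGEN_MASK | SIM_SOPT7_ADC1TRGSEL(0x5);  /* ADC1: alternate trigger = PIT ch1 */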

 

I have a project put together where I have been able to do the following:

1. Run simultaneous conversions on both ADCs, triggered by the PIT module channels. To do this I had to resolve the problems above.

2. I have DMA working just fine (the beans here rock!); it took me all of about 10 minutes to add DMA support versus single-sample interrupts and manually copying to buffers.

3. Many other peripherals and timers working just fine within the project.

4. I added a new Init_PDB bean to my project to start using the PDB (Programmable Delay Block) as my trigger source rather than the PIT, to get access to programmable trigger delays (a register-level sketch of the intended PDB configuration follows this list).

5. I followed this example: http://cache.nxp.com/files/32bit/doc/app_note/AN4688.pdf?fpsp=1&WT_TYPE=Application%20Notes&WT_VENDO... to try to replicate the settings on the PDB as best I could.

6. I can get the PDB to provide regular interrupts to a custom interrupt vector and increment a counter at my expected interval, so I know the PDB is enabled and its interrupts are set up and working.
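For context, below is roughly what I expect the Init_PDB bean (following AN4688) to end up programming. It is only a sketch using the legacy register macros from the Kinetis header; the MOD, delay, and trigger-select values are illustrative, not lifted from my actual project.

/* Sketch only: continuous PDB that pre-triggers ADC0 channel A. */
SIM_SCGC6   |= SIM_SCGC6_PDB_MASK;                  /* gate the PDB clock on            */
PDB0_MOD     = 3000;                                /* counter modulo (trigger period)  */
PDB0_CH0DLY0 = 0;                                   /* channel 0, pre-trigger A delay   */
PDB0_CH0C1   = PDB_C1_EN(0x01) | PDB_C1_TOS(0x01);  /* enable pre-trigger A, delayed    */
PDB0_SC      = PDB_SC_TRGSEL(0x0F)                  /* software trigger source          */
             | PDB_SC_CONT_MASK                     /* continuous mode                  */
             | PDB_SC_PDBEN_MASK                    /* enable the PDB                   */
             | PDB_SC_LDOK_MASK;                    /* load MOD/DLY buffers             */
PDB0_SC     |= PDB_SC_SWTRIG_MASK;                  /* kick off the counter             */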

 

What I cannot do:

1. I cannot get the PDB to trigger ADC conversions. I simply change the SIM_SOPT7 register from the existing (and working) PIT trigger to the PDB trigger (on only one of the two channels), along with the ADC channel 0 trigger source settings. After mapping to the new trigger source, that ADC channel no longer triggers, even though I continue to get regular PDB interrupts. SIM_SOPT7 is 0x8500 (the upper byte, 0x85, selects the alternate trigger, PIT, for ADC1; the lower byte, 0x00, leaves ADC0 on the default PDB trigger); a sketch of this routing follows this list.

2. I have also tried the FTM modules in the past, with the same problem: I cannot get them to trigger the ADC. It seems I cannot trigger the ADCs with anything but the PIT timer.
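Decoded with the same (assumed) field macros as the sketch above, the 0x8500 value I am writing works out to:

/* ADC1 keeps the alternate trigger (PIT trigger 1); ADC0 falls back to the default
 * PDB pre-trigger because its ALTTRGEN bit is cleared. Macro names are assumptions
 * from the standard Kinetis header. */
SIM_SOPT7 = SIM_SOPT7_ADC1ALTTRGEN_MASK | SIM_SOPT7_ADC1TRGSEL(0x5);  /* == 0x8500 */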

 

I have run out of ideas, other than to blame a problem with the core of the CPU or some errata I don't know about that is blocking me. I would expect the PDB to work for triggering, as that is supposed to be the default trigger routing after reset.

 

I have attached the Processor Expert file in hopes that someone can poke around the PDB settings or SIM settings to see if I am simply missing something critical to get the ADCs to trigger from the PDB.

 

Thank you for any help!

 

-Mark Wyman

Original Attachment has been moved to: ProcessorExpert.zip

8 Replies
1,629 Views
markwyman
Contributor III

The first attachment was the working example, with both ADCs triggered by the PIT; this attachment has the first ADC tied to PDB0_CH0_TriggerA, so there is not much difference between the projects apart from the SIM_SOPT7 values.

0 Kudos
1,629 Views
DavidS
NXP Employee

Hi Mark,

I haven't had a chance to look at your code yet.

But you might want to look at the KSDK_v2 PDB-triggered ADC example. It might help indicate something amiss.

C:\NXP\KSDK_v2\SDK_2.0_FRDM-KV31F_KDS\boards\frdmkv31f\driver_examples\pdb\adc16_trigger

The readme.txt for this example is attached.

It starts with:

Overview

========

The pdb_adc16_trigger example shows how to use the PDB to generate an ADC trigger.

Regards,

David

1,628 Views
markwyman
Contributor III

Thanks! I will take a look and maybe dig in.

However, there should be a button in Design Studio called "Post to Forum" that cures all inexplicable problems, as long as you spend at least 15 minutes typing up the post.

Just after posting this, I decided to enable DMA on the second channel (the working one that was tied to the PIT timer), and suddenly the first channel started working from the PDB, while the second was still tied to the PIT and working as well. The only difference is that I turned DMA on for the second ADC channel, which added code to set up that ADC with DMA in the init function and to ping-pong buffers for the second ADC in the Conversion Complete event.

Since I intended to turn on DMA for both channels anyway, I don't know how much time I will spend trying to figure out the problem, but mystery "cures" keep me up late at night.

It seems I am on the right path, but now I am pretty gun-shy about messing around with trigger sources.

0 Kudos
1,628 Views
Alice_Yang
NXP TechSupport

Hello Mark,

Frankly speaking, I'm sorry, but I don't understand the current problem from your last email.

Is it that the ADC cannot trigger the DMA?

There is a PIT_ADC_DMA demo: https://community.nxp.com/docs/DOC-102951  ; maybe you can refer to it.

If it is not this problem, please tell us what we can do for you at this time.

Hope it helps.

Alice

0 Kudos
1,628 Views
markwyman
Contributor III

Hi Alice,

Thanks for the reply. The problem I was (am still?) having is triggering both ADCs in this processor, first from the PIT timer (I did get the PIT trigger source working) and then moving them over to the PDB delay/timer. I would like to use the PDB, as its programmable delays are exactly what I believe I need to efficiently perform phase cancellation of two lock-step external signals and observe short-term differences between the two (I don't care about the fundamental, just the phase differences). The PIT timer does not let me easily offset one sample from the other in time, whereas the PDB appears to do so.

Just getting the processor to switch from the PIT timer to the PDB timer was proving to be impossible. With the PDB running and producing timing events, the only register I was supposed to need to change was SIM_SOPT7, to alter the trigger routing from one source to the other. Nothing I did made a difference... until I turned on DMA for both channels (not just one, mind you; it had to be both!). This of course leaves me a tad confused, but I am pointing a finger at the code generation for the ADC_LDD beans having a bug in there someplace when DMA is not on.

So I thought I had this all working when I left in a rush last evening: the DMAs were firing regularly and I was able to transfer data... and then it quit. The DMAs get clogged up somehow and stop repeating after up to several events. I don't have a great deal of experience with DMA, so I am probably doing something not that bright and have some reading to do, but that doesn't explain why they were working and then quit. I did make some changes to the sample rates, so that is probably it, though it seems unlikely as I have a ton of CPU horsepower available (about 3% utilized).

The more I thought about it last night, the less likely it is that I will want DMA at all, as I need to adjust the phase correlation and gain of the signals in real time, not on a per-block basis. I am hoping that the PDB timer will still work for this when I turn DMA off. I can cut back on CPU time by significantly under-sampling my signal, since I have a very clean sinusoidal signal to deal with.

Thanks for any insight into my problems transitioning to the PDB timer.

What would be interesting is taking the PIT example you have, performing the same operations using the PDB timer with all else the same, and then letting me know if it works.

0 Kudos
1,629 Views
Alice_Yang
NXP TechSupport

Hello Mark,

 - So now your question is how to configure the PDB to trigger the ADC. The demo David E Seymour mentioned is exactly that, the PDB triggering the ADC; please refer to it.

BR

Alice

0 Kudos
1,629 Views
markwyman
Contributor III

So the saga continues:

I turned off the DMA option in the ADC beans, and the PDB continues to work now! Yeah, I know, I am confused also. I have them working, so I am not looking back (for now). It may be related to the channel delay registers, as I had them set to something other than zero, with the intention of being able to swing them up and down in value for my algorithm.

The problem with GUI tools like Processor Expert is that it is very difficult to use the scientific method to narrow down the cause of a problem when one option change results in many changes in the generated code base. I did refer to the example code and dumped some registers to see what might be going on, but frankly there were no significant differences apart from interrupt priorities and the timing-related registers, and my core is running at a different speed (max!).

So I have been beyond the PDB trigger problem for a while now (yea!), but now I have a somewhat related issue:

The PDB block has two channels to the timer, and each of these has a Trigger A and a Trigger B. Right now I am using Channel 0 Trigger A for AD1 and Channel 1 Trigger A for AD2. I am trying to get fine phase-adjustment resolution (much finer than sample shifting) by adjusting only the Trigger A delay value on Channel 1 (PDB0_CH1DLY0).

For testing purposes I am stepping the delay on Trigger A from 0 (0uS) to 3000 (50uS) in steps of 10 (0.167uS), every 1024 sample triggers, with the following:

/*******************************************************
 * Set the second channel delay. Timing is related to the
 * PDB0 timing. This is currently set to 60MHz, so the
 * increment in sample delay is 1/60MHz.
 */
bool AD2_SetPhaseDelay(uint16 phaseDelay)
{
    //We have to be careful that this is less than the main modulo timer value
    //TODO: Is this trashing the current conversion?
    if (phaseDelay < PDB0_MOD)
    {
        EnterCritical();
        //Set the buffer value
        PDB0_CH1DLY0 = phaseDelay;
        //And load the buffer into the register...
        PDB0_SC |= PDB_SC_LDOK_MASK;
        ExitCritical();
        return TRUE;
    }
    return FALSE;
}

This results in AD2, triggered by PDB0 Channel 1, suddenly ceasing to trigger after I pass the 5uS threshold.

My conversion time is set to a single conversion time of 5.55uS (differential, single conversion), so the timing is suspiciously similar, though I cannot understand why this would be happening.

What I am hoping to understand is how to properly modify this PDB0_CH1DLY0 register so that AD2 does not stop triggering, or how to properly stop/modify/restart the ADC I am adjusting so that triggers continue.

Ultimately I need the ability to shift a 13kHz signal 180 degrees on the second channel with better than 1 degree of resolution, which is a 154uS maximum time shift from a sample on AD1 vs. AD2.

Thanks

0 Kudos
1,630 Views
markwyman
Contributor III

I believe I have it. I was setting the delay registers at arbitrary times from my main polling routine, in response to serial commands. This would occasionally write a new delay value to the registers while a conversion was still in progress. If altering the delays created a second trigger before the conversion was complete, that was all she wrote: there was no easy recovery of the stalled converter.

So I moved the setting of the delay to the end of the conversionComplete routine, and I have no more problems. I can be assured the converter isn't busy, and it is the right place to change the value for the next conversion anyhow.

The earlier problem, I believe, resulted from setting those delay values during my init call that sets up the AD converters and the delay registers. The large jump in value during init resulted in two trigger events in close succession, locking up the AD converters before the first conversion even finished. This gave me the illusion that I had no clocks to the AD converters.
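For anyone else who hits this, the change amounts to something like the sketch below. The event name and the pendingPhaseDelay variable are placeholders, not my actual generated Processor Expert code; the real event name depends on how the ADC_LDD component is configured.

/* Sketch only: defer the delay update to the ADC conversion-complete event so the
 * PDB delay buffer is never reloaded while a conversion is in flight.
 * AD2_OnConversionComplete and pendingPhaseDelay are placeholders. */
volatile uint16 pendingPhaseDelay = 0;

void AD2_OnConversionComplete(LDD_TUserData *UserDataPtr)
{
    /* ...read/copy the completed sample first... */

    /* Now it is safe to stage the next trigger delay. */
    AD2_SetPhaseDelay(pendingPhaseDelay);
}

The serial-command handler then only updates pendingPhaseDelay instead of touching PDB0_CH1DLY0 directly.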

0 Kudos