I have searched the forum for answers, and several postings are relevant, but I am still confused.
I am using a Teensy 3.5 (MK64FX512VMD12 based).
I want to create a bit stream as if it were generated with a 1MHz clock. Essentially I want to fake floppy disk write-data pulses.
I have figured out that one less-than-ideal way is to stream an array of bytes to one of the ports and simply use one bit as my wanted data stream. I can pre-process the 512 bytes of the sector and expand them, using the MFM coding scheme, into a 512 (bytes per sector) x 8 (bits per byte) x 4 (1us cells per bit) = 16KB array in RAM.
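To make the expansion concrete, here is a sketch of the kind of pre-processing I mean. The function name and the cell layout (clock slot, gap, data slot, gap, one byte per 1us sample) are my own assumptions, and the clocking rule is my reading of standard MFM:

```cpp
#include <cstdint>
#include <cstddef>

// Expand raw sector bytes into a 1MHz sample stream using MFM rules:
// each data bit becomes four 1us cells: [clock, 0, data, 0].
// A clock pulse is inserted only when both the previous and the
// current data bit are 0 (standard MFM clocking).
void mfmExpand(const uint8_t *sector, size_t len, uint8_t *out) {
    int prevBit = 1;                      // pretend a 1 preceded, so no leading clock pulse
    size_t o = 0;
    for (size_t i = 0; i < len; i++) {
        for (int b = 7; b >= 0; b--) {    // MSB first
            int bit = (sector[i] >> b) & 1;
            out[o++] = (prevBit == 0 && bit == 0) ? 1 : 0;  // clock cell
            out[o++] = 0;
            out[o++] = bit;                                  // data cell
            out[o++] = 0;
            prevBit = bit;
        }
    }
}
```

For a 512-byte sector this produces exactly the 512 x 8 x 4 = 16384-byte array described above.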
I am pretty sure that there are much more efficient/better ways to do this, but I want to experiment with this method.
I have already experimented with a sketch which uses the Teensy DMAChannel library.
I have programmed a PIT for 1MHz. I saw that the DMAMUX has a specific bit (TRIG) which selects the PIT as the "trigger" for the DMA (to get the desired output data rate of 1Mb/s). What wasn't clear was the "source" field of the DMAMUX_CHCFGx register in this mode where the TRIG bit is '1' to select the PIT as trigger. I suspect (from experiment) that it needs to be "DMAMUX_SOURCE_ALWAYS0" (a Teensy #define, which I think is 58 for this SoC) so that each PIT tick transfers a byte.
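The way I read the reference manual, each DMAMUX channel config is a single byte: ENBL in bit 7, TRIG in bit 6, and the request source in bits 5:0. Here is a stand-in computation of the value I believe gets written (constants modeled on the kinetis.h names; the source number 58 is my guess from above):

```cpp
#include <cstdint>

// Stand-in constants mirroring the K64 DMAMUX_CHCFGn bit layout
// (one 8-bit register per channel; names modeled on kinetis.h).
constexpr uint8_t DMAMUX_ENABLE         = 0x80;  // ENBL, bit 7
constexpr uint8_t DMAMUX_TRIG           = 0x40;  // TRIG, bit 6: PIT periodic trigger
constexpr uint8_t DMAMUX_SOURCE_ALWAYS0 = 58;    // "always enabled" request source

// Periodic-trigger mode: an always-on request, gated by PIT ticks.
constexpr uint8_t chcfg = DMAMUX_ENABLE | DMAMUX_TRIG | DMAMUX_SOURCE_ALWAYS0;
// On real hardware this value would be written to DMAMUX0_CHCFGn; note that
// (as I understand it) only channels 0-3 support the periodic trigger,
// because PIT0-3 are hard-wired to those channels.
```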
I found a simple example which transfers one byte (of 0xFF) from a memory location to the port toggle register (I used PORT C). The DMA transfer was size=1, length=1. When I enabled the DMA I got a nice 500kHz square wave on PORT C bit 0. Clearly the DMA, being only a single-byte transfer, was being continually re-triggered. What I need is for the DMA, once started, to run to completion and then stop, as I would expect for a memory-to-memory block transfer.
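A toy model of why that experiment gives 500kHz: each PIT trigger "DMAs" the single 0xFF byte into the toggle register, and two toggles make one full period. The variable here is just a stand-in for the real memory-mapped GPIOC_PTOR:

```cpp
#include <cstdint>

uint8_t gpioc = 0;          // stand-in for the PORT C output latch

// What one DMA minor loop does in the single-byte example:
// write the source byte to the toggle register (write-1-to-toggle).
void onTrigger() {
    uint8_t data = 0xFF;    // the single source byte
    gpioc ^= data;          // PTOR semantics: set bits flip the outputs
}
```

Two 1MHz triggers per output period is exactly the 500kHz square wave I saw.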
I then used the Teensy DMA library code with the source pointed at a 256-byte array in RAM. On the scope I saw the expected burst of data over 256us, then a little gap, then another 256us burst. Again the DMA transfer was re-running. Since the data stream in each burst was correct, this appears to indicate that the source address is reset when the DMA re-runs?
When I changed DMAMUX_CHCFGx by setting TRIG to zero and selecting PORT A as the source, toggling the appropriate PORT A input pin caused the PORT C I/O to change. So even though I was using an array, each byte still needed a trigger before it was sent.
I am aware of (but do not fully understand) the minor and major loop registers.
So I am now thinking that I simply need to detect when the burst of 256 has ended, and then stop the DMA so that it doesn't re-run?
I guess my confusion is that I thought I could program the DMA to transfer 256 bytes, clocked by the PIT, start the DMA, and then it would self-terminate. As I write this I think that assumption is incorrect, and I need to step in to stop the DMA.
The Teensy DMAChannel library has a way to attach an interrupt at the completion of the DMA (dma0.interruptAtCompletion();). What condition causes this interrupt to be triggered? Completing a major loop? Completing all major loops?
In the library the array length (256) gets written to BITER and CITER (all I know at the moment is that these are DMA TCD registers). NBYTES=1 (a byte transfer, I assume?). SOFF=1 (increment through the array byte by byte?). This makes sense.
So I get the idea that each trigger (i.e. a PIT tick, or even a PORT A pin edge) causes one DMA transfer, which in this case moves one byte from memory to the same PORT register address, causing the I/O pins to change. This also does something to the minor and major loop counts? At some point there is an interrupt (if enabled) which allows software to stop the transfer at that point?
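To check my understanding, here is a toy simulation of how I *think* the TCD behaves (the behaviour is paraphrased from my reading of the K64 reference manual, and the DREQ part is my guess at the piece I'm missing, since the library also seems to set SLAST = -length, which would explain the resetting source address):

```cpp
#include <cstdint>

// Toy model of the eDMA TCD minor/major loops as I understand them.
// Each trigger runs one minor loop: NBYTES bytes, SADDR += SOFF per byte.
// When CITER reaches 0 the major loop is complete: CITER reloads from
// BITER, SADDR is adjusted by SLAST (so with SLAST = -length the source
// rewinds and the burst repeats), the interrupt fires if INTMAJOR is set,
// and -- if DREQ is set -- the channel disables itself instead of re-running.
struct Tcd {
    const uint8_t *saddr;  int16_t soff;
    uint16_t nbytes, citer, biter;
    int32_t slast;
    bool intmajor, dreq, enabled;
};

// Process one hardware trigger; returns true if the
// major-loop-complete interrupt would fire.
bool trigger(Tcd &t, uint8_t &dest) {
    if (!t.enabled) return false;
    for (uint16_t n = 0; n < t.nbytes; n++) {   // one minor loop
        dest = *t.saddr;                        // transfer to fixed destination
        t.saddr += t.soff;
    }
    if (--t.citer == 0) {                       // major loop complete
        t.citer = t.biter;                      // CITER reloads from BITER
        t.saddr += t.slast;                     // source rewinds
        if (t.dreq) t.enabled = false;          // self-disable instead of re-running
        return t.intmajor;
    }
    return false;
}
```

If this model is right, setting the DREQ-equivalent (disableOnCompletion() in the library, I believe) would give me exactly the run-once-then-stop behaviour I want, with no software intervention mid-burst.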
Any clarification would be helpful.
I'm going to spend a few hours playing with the Teensy.
A pointer to a nice PDF/presentation of the TCD structure and the operation of the DMA would be appreciated. I searched but didn't find anything.