I'm hoping for some guidance regarding the CoreSight DAP over SWD on the KL02Z32 micros.
I currently have two FRDM-KL02Z development boards, and I'm trying to use one as a host and communicate with the debug port of the target using the SWDIO and SWDCLK pins.
The host is set up to bit-bash the comms to the SWD pins on the target, which is currently sitting pretty blinking an LED. My aim is to have the host hold the target in reset, mass erase the target's flash and re-program it.
Where I'm currently at: the host communicates with the target to the point of correctly receiving 0x0BC11477 back when it asks for the IDCODE. From this point on I'm hitting a brick wall. In general I'm either receiving faults back when trying to read CTRLSTAT, or I'm receiving an ACK of 0x07.
I've read the CoreSight DAP-Lite manual, the ARM Debug Interface v5 manual, many different websites and various open-source code, and am struggling to figure out where I'm going wrong. I have looked at CMSIS-DAP, swddude, and MarkDing's SWD SRAM-programming code.
I've observed a few oddities. The initialisation sequence of 0x9E and 0xE7 sent to the target doesn't seem to matter: if I exclude it, or send different bytes, I still receive the IDCODE correctly. Is this normal?
Also, I scoped the clock and data lines from the OpenSDA chip to the KL02Z while it was being programmed. When data is presented on SWDIO, the clock on SWDCLK is not consistent; it is only when SWDIO is held low that SWDCLK shows a consistent clock signal. Is this also normal? The diagrams in the datasheet appear to show a consistent clock signal throughout a packet.
Here's the scenario I have at the moment:
Startup (for clarity: SWDCLK and SWDIO on my host application are GPIOs connected to the SWDCLK and SWDIO pins of the target, which has not changed the MUX from the default on those pins; the default for the KL02Z is SWD operation):
- SWDCLK is set to output.
- Host turns SWDIO into an output and drives it high; SWDCLK is strobed from high to low 64 times (the datasheet says at least 50; some source code I found does 64, which seems a reasonable superset)
- 0x9E is sent LSb First over SWDIO by the host (each bit of data sent is strobed from high to low)
- 0xE7 is sent LSb first over SWDIO by the host (I have tried all the different orderings of sending these two bytes and none of them made any difference)
- Host turns SWDIO into an output and drives it high, SWDCLK is again strobed from high to low 64 times
- SWDIO is set low and the clock is strobed once (Without this, reading IDCODE fails, ACK is 0x07)
- Header to read IDCODE is formed and sent; ACK OK received, 32 bits of data received, 1 parity bit received, data is 0x0BC11477
- An empty (all-zero) byte is sent (other source code does this, and the datasheets I read do mention that 8 idle clock cycles after a packet are required to finish the transaction)
From here nothing seems to be consistent and none of the source codes or datasheets agree on what happens next.
If the first thing I do after the startup sequence is read the CTRLSTAT register back, WDATAERR is set - I haven't attempted a write yet!!
I've tried writing 0x50000000 to CTRLSTAT, which some sources suggest is required to turn the debugger on; that sets the CSYSPWRUPREQ and CDBGPWRUPREQ bits of the CTRLSTAT register. I get an ACK of 0x07 whenever I try to write to the CTRLSTAT register.
I've tried writing to the ABORT register on startup to clear any error flags, I get an ACK OK back from trying to write to the ABORT register.
I can read the IDCODE register back at any time and always get a correct response.
I believe my headers match the datasheet: to read CTRLSTAT I am sending 0b10110001 and to write CTRLSTAT I am sending 0b10010101, MSb first (this is how the datasheet diagram presents them). I'm using the same function to generate the headers for reading IDCODE and writing ABORT as I do for CTRLSTAT.
What have I missed? I thought perhaps I'm having a timing issue, which would explain why I receive ACKs of 0x07 so frequently, but given the irregular clock I observed from the OpenSDA chip, I'm at a loss: if that irregular clock works correctly, why doesn't my consistent one?
What is the actual startup sequence? I cannot seem to find agreement on what to do after IDCODE is read for the first time.