Hi All,
We always use the DDR calibration tool to get these values, but why is the Final value taken as "End-0.5*tCK" rather than "Mean"?
Example:
BYTE 0:
Start: HC=0x01 ABS=0x1C
End: HC=0x04 ABS=0x5C
Mean: HC=0x02 ABS=0x7B
End-0.5*tCK: HC=0x03 ABS=0x5C
Final: HC=0x03 ABS=0x5C
BYTE 1:
Start: HC=0x02 ABS=0x18
End: HC=0x04 ABS=0x4C
Mean: HC=0x03 ABS=0x32
End-0.5*tCK: HC=0x03 ABS=0x4C
Final: HC=0x03 ABS=0x4C
DQS calibration MMDC0 MPDGCTRL0 = 0x034C035C, MPDGCTRL1 = 0x03480340
DQS calibration MMDC1 MPDGCTRL0 = 0x03680370, MPDGCTRL1 = 0x035C0324
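(Side note: the register values line up with the per-byte Final values above; each 16-bit half of MPDGCTRLn appears to be packed as (HC << 8) | ABS. A quick sketch to check this, with the packing inferred from the posted numbers rather than taken from the reference manual:)

```python
def decode_mpdgctrl(reg):
    """Split a 32-bit MPDGCTRLn value into (HC, ABS) per byte lane.

    Assumes each 16-bit half is (HC << 8) | ABS, an inference from the
    numbers in this post, not a statement from the i.MX 6 reference manual.
    """
    lower = reg & 0xFFFF            # first byte lane of this register
    upper = (reg >> 16) & 0xFFFF    # second byte lane of this register

    def split(halfword):
        return (halfword >> 8) & 0xFF, halfword & 0xFF  # (HC, ABS)

    return split(lower), split(upper)

# MMDC0 MPDGCTRL0 = 0x034C035C from above:
# lower half 0x035C -> HC=0x03, ABS=0x5C (matches BYTE 0 Final)
# upper half 0x034C -> HC=0x03, ABS=0x4C (matches BYTE 1 Final)
print(decode_mpdgctrl(0x034C035C))
```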
Hi Appolo,
The reason may be the algorithm described in Figure 9, "DQS Read Gating Calibration: Timing Diagram", of AN4467, "i.MX 6 Series DDR Calibration". When the difference between the too-early and too-late gate boundaries exceeds one full cycle, the program considers it more reliable to take the too-late gate boundary and subtract 0.5*tCK, rather than use the mean.
https://www.nxp.com/docs/en/application-note/AN4467.pdf
Best regards,
Igor
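(The selection rule Igor describes can be sketched as follows. This is only an illustration, assuming delays are expressed in units of tCK, with one HC step equal to 0.5*tCK; it is not the exact AN4467 implementation.)

```python
def final_gate(start, end, tck=1.0):
    """Pick the final DQS gate delay from the passing window [start, end].

    Illustrative rule, assumed from the discussion above: when the window
    between the too-early and too-late boundaries exceeds one full clock
    cycle, use End - 0.5*tCK; otherwise use the mean of Start and End.
    """
    window = end - start
    if window > tck:
        # Wide window: the too-late boundary minus half a cycle is
        # considered more reliable than the midpoint.
        return end - 0.5 * tck
    return (start + end) / 2.0

# BYTE 0 above: Start HC=0x01, End HC=0x04 -> window = 3 HC = 1.5*tCK,
# so the rule picks End - 0.5*tCK = HC 0x03, matching the Final value.
```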