Content originally posted in LPCWare by Dave on Thu Apr 12 12:57:34 MST 2012
I thought it might be interesting to dynamically "fine-tune" the delay constants I am using for CMDDLY and FBCLKDLY in the Delay Control Register, so I followed the instructions in section 10.12.29 of the user manual for reading the calibration register.
Interestingly enough, the value reads 128-129, unless I blow cold air on the SDRAM chip, in which case it rises to about 133.
Not a lot of movement here...
I also swept the delay constants from their maximum to their minimum values (with max and min determined by where memory errors start to appear during memory testing), and found that the cal value never really changes much - it's always somewhere between 128 and 133...
So, here are my questions: Is this really necessary? Has anyone else successfully used this cal value to dynamically adjust the delay values for the EMC-to-SDRAM interface? Has anyone ever had their system drift enough to cause an SDRAM error while running?
I was just curious... conceptually it seems like a good idea, but I don't get much useful information out of it...