
SDRAM Init -- Fundamental Questions

Discussion created by lpcware Employee on Jun 15, 2016
Latest reply on Jun 15, 2016 by lpcware
Content originally posted in LPCWare by MikeSimmonds on Tue Nov 06 12:07:19 MST 2012

  I have 3 very basic questions about SDRAM init on the 1778/1788.
 
  My board has two Micron 256 Mbit devices
  [16M x 16 bits: 4 banks, 13 rows, 9 columns]
 
  They are wired with the address and control lines in common, one device providing
  the high 16 data bits and the other the low 16 data bits, i.e. the CPU data bus
  is 32 bits wide (each device is 16 bits wide).
 
  Question 1:
  What Address Mapping should I be using?
 
  I would assume (for Row-Bank-Column):
  "1 0 011 01 = 256 Mbit (16Mx16), 4 banks, row length = 13, column length = 9"
  from Table 132 in UM10470 Rev. 2
 
  and that the EMC controller will know I have two devices because
  the device width is half the CPU bus width.
 
  If this is not the case (and the CPU 'sees' a single 16M x 32-bit device),
  then there is no available mapping that gives the correct bank/row/column
  setup! (What do I do then?)
 
  Question 2:
  What mode word shift factor should I be using?
 
  I would assume 13, i.e. (9 column bits + 2 bank bits + 2 for the 32-bit CPU data bus).
  NOTE: the 2 here is for the CPU data-bus size, which is NOT the same as the device
  data-bus size! Or should it be (columns + banks + 1) because of the 16-bit device width?
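 
  For reference, the mode-register dummy read I have in mind looks like this -- a
  sketch only, with both candidate shifts shown. SDRAM_BASE is dynamic chip select 0
  and MODE_WORD is a placeholder (the CAS latency / burst length encoding comes from
  the Micron data sheet, not from UM10470):
 
    #include <stdint.h>
    #include "LPC177x_8x.h"
 
    #define SDRAM_BASE      0xA0000000u     /* dynamic memory chip select 0        */
    #define MODE_WORD       0x0032u         /* placeholder: e.g. CAS 3, burst 4    */
 
    #define MODE_SHIFT_CPU  (9 + 2 + 2)     /* cols + banks + 2 (32-bit CPU bus)   */
    #define MODE_SHIFT_DEV  (9 + 2 + 1)     /* cols + banks + 1 (16-bit device)    */
 
    static void sdram_write_mode_register(unsigned shift)
    {
        volatile uint32_t dummy;
 
        LPC_EMC->DynamicControl = 0x00000083;   /* SDRAM_INIT = MODE, clock enabled */
        dummy = *(volatile uint32_t *)(SDRAM_BASE | ((uint32_t)MODE_WORD << shift));
        (void)dummy;
        LPC_EMC->DynamicControl = 0x00000000;   /* back to NORMAL operation         */
    }
 
  (The question is simply whether MODE_SHIFT_CPU or MODE_SHIFT_DEV is the right
  value to pass in.)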
 
  Question 3:
 
  The EMC chapter in the user manual (UM10470 Rev. 2) defines all timings
  and delays in terms of "CCLK", i.e. the processor clock [120 MHz for me],
  and NOT in terms of "EMCCLK" [60 MHz for me].
 
  Is this just incredibly slap-dash authoring, or do they REALLY mean the
  CPU clock -- even though this is (now) twice the speed of the EMCCLK?
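 
  The reason it matters: the same data-sheet delay programs to a different cycle
  count depending on which clock the register actually counts. A quick sketch (the
  ns_to_clks() helper is mine, not an NXP API, and tRP = 20 ns is purely an example
  figure):
 
    #include <stdint.h>
 
    /* Round up so the programmed delay is never shorter than the data-sheet value. */
    static uint32_t ns_to_clks(uint32_t ns, uint32_t clk_hz)
    {
        return (uint32_t)(((uint64_t)ns * clk_hz + 999999999u) / 1000000000u);
    }
 
    /* tRP = 20 ns:
     *   ns_to_clks(20, 120000000) = 3 cycles at CCLK   = 120 MHz
     *   ns_to_clks(20,  60000000) = 2 cycles at EMCCLK =  60 MHz
     * so DynamicRP would be written with 2 or 1 respectively
     * (the register encodes n+1 clocks, if I read it correctly).
     */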
 
  Anyone care to comment?
  Especially NXP Europe/USA (re the shockingly ambiguous manual!)
