Hi, we've been working on a custom board based on the LS1028A. The memory we're using is the IS43QR85120B-083RBLI in a 512M x8 configuration, 2 GB total.
We've completed validation, and the DDR seems to be working fine.
We've also performed some other tests:
- enabled the built-in self test in TF-A
- mtest in U-Boot
- overnight runs of memtester in Linux on a couple of boards
Every test passed without any problems. However, our electronics team is a bit worried about the hardware-level measurements: from what they told me, the signal integrity does not meet the JEDEC requirements. The gap between what JEDEC expects and what we measure is not big, but it exists.
So my question is: which field in the DDR controller determines the value that gets programmed into the DRAM's MR1 RTT_NOM?
We've been using the STATIC_DDR configuration you can see below:
const struct ddr_cfg_regs static_1600 = {
	.cs[0].bnds = 0x7F,
	.cs[0].config = 0x80010322,
	.timing_cfg[0] = 0x80550018,
	.timing_cfg[1] = 0xBDB48F42,
	.timing_cfg[2] = 0x0048D114,
	.timing_cfg[3] = 0x010C1000,
	.timing_cfg[4] = 0x01,
	.timing_cfg[5] = 0x03401400,
	.timing_cfg[7] = 0x13300000,
	.timing_cfg[8] = 0x02115600,
	.sdram_cfg[0] = 0x650C0004,
	.sdram_cfg[1] = 0x00401010,
	.dq_map[0] = 0x56C5AC2C,
	.dq_map[1] = 0xAD6B0000,
	.dq_map[2] = 0x00,
	.dq_map[3] = 0x01600000,
	.sdram_mode[0] = 0x01010610,
	.sdram_mode[1] = 0x00,
	.sdram_mode[2] = 0x00,
	.sdram_mode[3] = 0x00,
	.sdram_mode[4] = 0x00,
	.sdram_mode[5] = 0x00,
	.sdram_mode[6] = 0x00,
	.sdram_mode[7] = 0x00,
	.sdram_mode[8] = 0x0400,
	.sdram_mode[9] = 0x04A40000,
	.sdram_mode[10] = 0x00,
	.sdram_mode[11] = 0x00,
	.sdram_mode[12] = 0x00,
	.sdram_mode[13] = 0x00,
	.sdram_mode[14] = 0x00,
	.sdram_mode[15] = 0x00,
	.md_cntl = 0x00,
	.interval = 0x18600618,
	.data_init = 0xDEADBEEF,
	.clk_cntl = 0x01C00000,
	.init_addr = 0x00,
	.ddr_sr_cntr = 0x0,
	.init_ext_addr = 0x00,
	.zq_cntl = 0x8A090705,
	.wrlvl_cntl[0] = 0x86750604,
	.wrlvl_cntl[1] = 0x05060700,
	.wrlvl_cntl[2] = 0x08,
	.cdr[0] = 0x80080000,
	.cdr[1] = 0xA180,
	.debug[28] = 0x01080F70
};
The validation step selects the following:
1) Read ODT and driver:
- ODT 60 ohm / DRAM driver strength 34 ohm (full)
2) Write ODT and driver:
- controller drive strength full / DRAM ODT 60 ohm
As far as I understand, in the context of the DRAM's MR1 RTT_NOM we're talking about 2) "Write ODT and driver", which validation set to 60 ohm, but I'm having difficulty understanding how the SDRAM_MODE register content reflects the RTT_NOM setting.
In other words: how can I be sure that the DRAM ODT of 60 ohm selected by validation was in fact programmed into the DRAM?
Hello @pb3
Hope this email finds you well,
Please note that the DDR Validation tool determines the ODT settings from the validation results.
The QCVS tool is programmed to select a central value from among the passing cells.
This means that if all combinations pass validation, QCVS will choose a value that lies at the center of the range of passing cells.
Have a great day.
BR,
Hector V