RT1020 clocking SDRAM at 166 MHz: Zero timing margin?


rshipman
Contributor V

Hi,

I hope you are all ok.

If you look at the RT1020 consumer datasheet:

i.MX RT1020 Crossover Processors for Consumer Products, Rev. 1, 04/2019

Page 43, Section 4.5.1.2.2, "SEMC input timing in SYNC mode"

With SEMC_MCR.DQSMD = 0x1, the TIS (data input setup) parameter is 0.6 ns.

I.e. the data has to be present on the bus 0.6 ns before the RT1020 can clock it in.

Now look at a typical datasheet for a 166 MHz SDRAM device, for example the ISSI IS42S16160J-6 (datasheet attached: 42-45S83200J-16160J.pdf).

The Access Time from Clock is 5.4 ns. (Probably a JEDEC spec.)

So the SDRAM can take up to 5.4 ns to present the data, which leaves just the 0.6 ns of setup time the RT1020 needs before it reads the data in.

5.4 ns + 0.6 ns = 6.0 ns, and a 166 MHz clock has a 6.02 ns cycle, so there is practically zero margin.
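To make the budget explicit, here is a minimal C sketch of the same arithmetic. The 5.4 ns and 0.6 ns figures are the datasheet values quoted above; this is just an illustration, not a measurement:

#include <stdio.h>

/* Minimal sketch of the timing budget described above. The 5.4 ns and
 * 0.6 ns values are the datasheet figures quoted in this post; nothing
 * here is measured from real hardware. */
int main(void)
{
    const double t_ac_ns   = 5.4;            /* SDRAM access time from CLK, max (IS42S16160J-6) */
    const double t_is_ns   = 0.6;            /* RT1020 SEMC data input setup (SEMC_MCR.DQSMD = 1) */
    const double period_ns = 1000.0 / 166.0; /* 166 MHz -> ~6.02 ns cycle */

    printf("worst-case margin = %.2f ns\n", period_ns - t_ac_ns - t_is_ns); /* ~0.02 ns */
    return 0;
}

Compiled and run, it prints a worst-case margin of roughly 0.02 ns, i.e. the "practically zero" figure above.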

What am I missing here? No one would accept a margin of zero, and the eval board seems to work, so I know I am missing something.

Or is there an adjustment/register that could help here? I couldn't see anything obvious in the reference manual or the semc/sdram demo.

Many thanks.


jeremyzhou
NXP Employee

Hi,

Thank you for your interest in NXP Semiconductor products and for the opportunity to serve you.

5.4 ns is the maximum limit for the Access Time from CLK; in reality it will be less than 5.4 ns.

[Attached image: pastedImage_1.png]
In our experience, slowing down the clock frequency is a good way to ensure the timing configuration meets the characterized requirements.
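To put rough numbers on that suggestion, here is a small C sketch showing how the worst-case margin grows as the clock slows down. It uses the worst-case datasheet limits quoted in the original post, not measured values:

#include <stdio.h>

/* Worst-case read-data margin versus SEMC clock frequency, using the
 * datasheet limits quoted in this thread (t_AC max = 5.4 ns,
 * t_IS = 0.6 ns). Real parts switch faster than t_AC max, so actual
 * margins are larger. */
int main(void)
{
    const double t_ac_ns = 5.4;
    const double t_is_ns = 0.6;
    const double freqs_mhz[] = { 166.0, 133.0, 100.0 };

    for (unsigned i = 0; i < sizeof freqs_mhz / sizeof freqs_mhz[0]; i++) {
        double period_ns = 1000.0 / freqs_mhz[i];
        printf("%5.0f MHz: cycle %5.2f ns, worst-case margin %+5.2f ns\n",
               freqs_mhz[i], period_ns, period_ns - t_ac_ns - t_is_ns);
    }
    return 0;
}

At 133 MHz the worst-case margin is about 1.5 ns, and at 100 MHz about 4 ns.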

Have a great day,
TIC

 
