RT1170 SEMC - Async SRAM Question


RT1170 SEMC - Async SRAM Question

hsims
Contributor I

We are working on an RT1172-based design that will interface with both SDRAM and SRAM via the SEMC peripheral. The SRAM is asynchronous, 16-bit access, and we are using Non-ADMUX mode.

I'd like to get confirmation that the SEMC can interface with SRAM & SDRAM in the same design using this SRAM configuration/mode. Specifically, that the following SEMC pins can co-exist:

SEMC_ADDR11 being SDRAM A11 and SRAM WE#
SEMC_ADDR12 being SDRAM A12 and SRAM OE#
SEMC_DM0 being SDRAM DQM0 and SRAM LB#
SEMC_DM1 being SDRAM DQM1 and SRAM UB#

Table 29-6 & Table 29-7 of the RT1170 reference manual seem to indicate this will work (so long as we mux SRAM CE# to an SEMC_CSXn pin via the IOCR), but there is also a note mentioning the recommended SRAM configuration is 16-bit ADMUX mode.

1 Solution
jingpan
NXP TechSupport

Hi @hsims ,

There are no issues: the RT1170 SEMC supports Non-ADMUX mode. The note you reference only compares the two ADMUX options: between 8-bit ADMUX and 16-bit ADMUX, 16-bit ADMUX is recommended because it supports larger capacities and is simpler to access than 8-bit ADMUX. It does not rule out Non-ADMUX operation.

So, RT1172 can interface with both SDRAM and Non-ADMUX SRAM via the SEMC peripheral. 
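As a sketch of how such a setup looks with the MCUXpresso SDK's SEMC driver (the structure, enum, and function names below follow fsl_semc.h as I recall them; please verify the exact field names against your SDK version, and fill the timing fields from your SRAM datasheet):

```c
/* Sketch only: Non-ADMUX, 16-bit async SRAM on SEMC CSX0.
 * Names follow the MCUXpresso SDK's fsl_semc.h from memory;
 * verify against your SDK version before use. */
semc_sram_config_t sramConfig = {
    .cePinMux          = kSEMC_MUXCSX0,         /* SRAM CE# routed to SEMC_CSX0 via IOCR */
    .address           = 0x90000000U,           /* example SEMC SRAM base address */
    .memsize_kbytes    = 2U * 1024U,            /* example: 2 MB part */
    .portSize          = kSEMC_PortSize16Bit,   /* 16-bit data bus */
    .addrMode          = kSEMC_AddrDataNonMux,  /* Non-ADMUX mode */
    .advActivePolarity = kSEMC_AdvActiveLow,    /* ADV# unused in Non-ADMUX, but must be set */
    /* ... timing fields (e.g. tceSetup_Ns, tweLow_Ns, treLow_Ns)
       from the SRAM datasheet ... */
};
status_t status = SEMC_ConfigureSRAM(SEMC, &sramConfig, semcClockHz);
```

This is a configuration fragment, not a complete program; SEMC_Init() with the SDRAM configuration would run alongside it in the same design, with the SDRAM on its own CS pin.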

 

Regards,

Jing


4 Replies
hsims
Contributor I

Thanks Jing.

A follow-up question, related to the address line interfacing between the SEMC & the SRAM:

We've utilized our target SRAM part on a previous design with a Coldfire MCU. On that board, we don't connect A0 of the MCU to the SRAM, instead offsetting the connections such that the MCU's A1-20 connect to the SRAM’s A0-19. The low/high byte enable lines are also connected, of course.

Initially after looking at Section 29.3.1.7.1 of the RT1170 reference manual, I thought we should do the same here, as the lowest SEMC_ADDR bit isn’t used in 16-bit mode.

However, we have an existing RT1172 design where we interface with 16-bit SDRAM via the SEMC, and there we connect SEMC_A0-12 directly to the SDRAM's A0-12 (with no offset), even though Section 29.3.1.4.1 shows that the lowest SEMC_ADDR bit is likewise unused in the 16-bit SDRAM case, just as in the 29.3.1.7.1 SRAM section.

Can you please provide guidance on whether we should directly connect SEMC_A0-19 to the SRAM A0-19 or offset the lines such that SEMC_A0 is not connected to the SRAM and SEMC_A1-20 connect to SRAM A0-19?

jingpan
NXP TechSupport

Hi @hsims ,

In 16-bit mode, the SEMC shifts the address internally: the i.MX RT's A0 line carries internal address bit A1, the A1 line carries A2, and so on. So RT1170 SEMC_A0 should connect directly to the 16-bit SRAM's A0.

 

Regards,

Jing

hsims
Contributor I

Excellent, we will move ahead with that understanding. Thank you!
