Hello @mayliu1
Thank you for your suggestion.
The inquiry in that thread is the same as my question.
However, I don't understand the "solution" posted.
This is because NXP has provided no information about UB#/LB#.
Although that reply is marked as the solution, please take another look at its content.
I suspect the original poster simply gave up waiting for a response from NXP.
What I want to know is the timing of the UB#/LB# signals output from the i.MX RT1170 SEMC to control SRAM.
Since the signal names are UB#/LB#, I assume they are active low, but I need information on when they are asserted low and when they are negated high.
In a typical memory controller, UB#/LB# are driven high when not accessing the SRAM;
when accessing the low byte D[7:0], LB# is driven low;
when accessing the high byte D[15:8], UB# is driven low.
However, on the i.MX RT1170 SEMC I observe the opposite: the signals appear to be driven low when no access is in progress, and
when accessing the low byte D[7:0], UB# is driven high to disable the upper byte of the SRAM;
when accessing the high byte D[15:8], LB# is driven high to disable the lower byte of the SRAM.
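For reference, the kind of access pattern I have in mind is sketched below in C. The base address 0x80000000 and the function name are only placeholders for illustration, not my actual SEMC configuration; the comments describe the byte-lane behavior I would expect from a typical controller, as outlined above.

#include <stdint.h>

/* Placeholder only: substitute the address your SEMC configuration
 * actually maps the external SRAM to. */
#define SEMC_SRAM_BASE  0x80000000u

void byte_lane_test(void)
{
    volatile uint8_t  *sram8  = (volatile uint8_t  *)SEMC_SRAM_BASE;
    volatile uint16_t *sram16 = (volatile uint16_t *)SEMC_SRAM_BASE;

    /* 8-bit write to an even address -> low-byte D[7:0] access.
     * Typical controller: LB# asserted low, UB# deasserted high. */
    sram8[0] = 0x55u;

    /* 8-bit write to an odd address -> high-byte D[15:8] access.
     * Typical controller: UB# asserted low, LB# deasserted high. */
    sram8[1] = 0xAAu;

    /* 16-bit write -> both byte lanes accessed.
     * Typical controller: both LB# and UB# asserted low. */
    sram16[0] = 0x55AAu;
}

My question is about the UB#/LB# timing the SEMC actually produces for accesses like these.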
Is this the behavior NXP intends?
Best regards,
Ishii.