<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>LPC Microcontrollers topic: Using 2 flash chips over SPIFI</title>
    <link>https://community.nxp.com/t5/LPC-Microcontrollers/Using-2-flash-chips-over-SPIFI/m-p/530291#M9977</link>
    <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;STRONG&gt;Content originally posted in LPCWare by jurrien on Thu Jan 22 02:46:59 MST 2015&lt;/STRONG&gt;&lt;BR /&gt;&lt;SPAN&gt;Hello,&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;I am using 2 flash chips to store and load data (not to execute from). I use some logic gates and a GPIO to gate the chip selects of the flash chips, and the SPIFI for reading and writing the data. When I read from the chips I can select either one without any problem, although I do have to call the spifi init function after switching between the chips. This works as long as I am only reading, but if I switch after a write operation the init function returns 0x2000A. My first thought was that I had to wait until the write operation had completed before switching between the flash chips. However, as far as I understand, calling __ISB() followed by __DSB() should enforce this, yet doing so does not change the return value of the init call. Can anyone explain what changes after a write operation compared to a read operation, and why the chip suddenly isn't recognized anymore? And what should I do to make this work? Do I have to reset anything after a write before switching between the chips?&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Thanks in advance,&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Jurrien&lt;/SPAN&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
    <pubDate>Wed, 15 Jun 2016 18:18:48 GMT</pubDate>
    <dc:creator>lpcware</dc:creator>
    <dc:date>2016-06-15T18:18:48Z</dc:date>
    <item>
      <title>Using 2 flash chips over SPIFI</title>
      <link>https://community.nxp.com/t5/LPC-Microcontrollers/Using-2-flash-chips-over-SPIFI/m-p/530291#M9977</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;STRONG&gt;Content originally posted in LPCWare by jurrien on Thu Jan 22 02:46:59 MST 2015&lt;/STRONG&gt;&lt;BR /&gt;&lt;SPAN&gt;Hello,&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;I am using 2 flash chips to store and load data (not to execute from). I use some logic gates and a GPIO to gate the chip selects of the flash chips, and the SPIFI for reading and writing the data. When I read from the chips I can select either one without any problem, although I do have to call the spifi init function after switching between the chips. This works as long as I am only reading, but if I switch after a write operation the init function returns 0x2000A. My first thought was that I had to wait until the write operation had completed before switching between the flash chips. However, as far as I understand, calling __ISB() followed by __DSB() should enforce this, yet doing so does not change the return value of the init call. Can anyone explain what changes after a write operation compared to a read operation, and why the chip suddenly isn't recognized anymore? And what should I do to make this work? Do I have to reset anything after a write before switching between the chips?&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN&gt;Thanks in advance,&lt;/SPAN&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;SPAN&gt;Jurrien&lt;/SPAN&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Wed, 15 Jun 2016 18:18:48 GMT</pubDate>
      <guid>https://community.nxp.com/t5/LPC-Microcontrollers/Using-2-flash-chips-over-SPIFI/m-p/530291#M9977</guid>
      <dc:creator>lpcware</dc:creator>
      <dc:date>2016-06-15T18:18:48Z</dc:date>
    </item>
  </channel>
</rss>

