<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic How to make a converted TFLite model work? (guide on NXP site may be outdated) in eIQ Machine Learning Software</title>
    <link>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/How-make-converted-to-tflite-work-guide-on-NXP-site-may-be/m-p/1197390#M333</link>
    <description>&lt;P&gt;I followed the article&amp;nbsp;&lt;A href="https://community.nxp.com/t5/Software-Community-Articles/eIQ-Sample-Apps-TFLite-Quantization/ba-p/1131121" target="_blank" rel="noopener"&gt;https://community.nxp.com/t5/Software-Community-Articles/eIQ-Sample-Apps-TFLite-Quantization/ba-p/1131121&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I installed exactly the same TensorFlow version as in the article and downloaded the same model.&lt;/P&gt;&lt;P&gt;The conversion produced a .tflite model that looks fine in the Netron visualizer.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Then, unlike the article, I tried to run it with pyeIQ. pyeIQ itself is verified: it works with some other .tflite models.&lt;/P&gt;&lt;P&gt;But I get this error:&lt;/P&gt;&lt;LI-SPOILER&gt;self.interpreter = aNNInterpreter(model_file)&lt;BR /&gt;File "/usr/lib/python3.7/site-packages/eiq/engines/armnn/inference.py", line 19, in __init__&lt;BR /&gt;network = parser.CreateNetworkFromBinaryFile(model)&lt;BR /&gt;File "/usr/lib/python3.7/site-packages/pyarmnn/_generated/pyarmnn_tfliteparser.py", line 711, in CreateNetworkFromBinaryFile&lt;BR /&gt;return _pyarmnn_tfliteparser.ITfLiteParser_CreateNetworkFromBinaryFile(self, graphFile)&lt;BR /&gt;RuntimeError: Buffer #88 has 0 bytes. For tensor: [1,14,14,512] expecting: 401408 bytes and 100352 elements. at function CreateConstTensor [/usr/src/debug/armnn/19.08-r1/git/src/armnnTfLiteParser/TfLiteParser.cpp:2612]&lt;/LI-SPOILER&gt;&lt;P&gt;I suspect the problem is the TensorFlow version used to convert the model to TFLite, but the article gives no further information about which version to use.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have also tried more recent TensorFlow versions (2.3.0 and 2.3.1), but those crash with a segfault (exit code 139).&lt;/P&gt;</description>
    <pubDate>Thu, 10 Dec 2020 12:04:02 GMT</pubDate>
    <dc:creator>korabelnikov</dc:creator>
    <dc:date>2020-12-10T12:04:02Z</dc:date>
    <item>
      <title>How to make a converted TFLite model work? (guide on NXP site may be outdated)</title>
      <link>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/How-make-converted-to-tflite-work-guide-on-NXP-site-may-be/m-p/1197390#M333</link>
      <description>&lt;P&gt;I followed the article&amp;nbsp;&lt;A href="https://community.nxp.com/t5/Software-Community-Articles/eIQ-Sample-Apps-TFLite-Quantization/ba-p/1131121" target="_blank" rel="noopener"&gt;https://community.nxp.com/t5/Software-Community-Articles/eIQ-Sample-Apps-TFLite-Quantization/ba-p/1131121&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I installed exactly the same TensorFlow version as in the article and downloaded the same model.&lt;/P&gt;&lt;P&gt;The conversion produced a .tflite model that looks fine in the Netron visualizer.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Then, unlike the article, I tried to run it with pyeIQ. pyeIQ itself is verified: it works with some other .tflite models.&lt;/P&gt;&lt;P&gt;But I get this error:&lt;/P&gt;&lt;LI-SPOILER&gt;self.interpreter = aNNInterpreter(model_file)&lt;BR /&gt;File "/usr/lib/python3.7/site-packages/eiq/engines/armnn/inference.py", line 19, in __init__&lt;BR /&gt;network = parser.CreateNetworkFromBinaryFile(model)&lt;BR /&gt;File "/usr/lib/python3.7/site-packages/pyarmnn/_generated/pyarmnn_tfliteparser.py", line 711, in CreateNetworkFromBinaryFile&lt;BR /&gt;return _pyarmnn_tfliteparser.ITfLiteParser_CreateNetworkFromBinaryFile(self, graphFile)&lt;BR /&gt;RuntimeError: Buffer #88 has 0 bytes. For tensor: [1,14,14,512] expecting: 401408 bytes and 100352 elements. at function CreateConstTensor [/usr/src/debug/armnn/19.08-r1/git/src/armnnTfLiteParser/TfLiteParser.cpp:2612]&lt;/LI-SPOILER&gt;&lt;P&gt;I suspect the problem is the TensorFlow version used to convert the model to TFLite, but the article gives no further information about which version to use.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have also tried more recent TensorFlow versions (2.3.0 and 2.3.1), but those crash with a segfault (exit code 139).&lt;/P&gt;</description>
      <pubDate>Thu, 10 Dec 2020 12:04:02 GMT</pubDate>
      <guid>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/How-make-converted-to-tflite-work-guide-on-NXP-site-may-be/m-p/1197390#M333</guid>
      <dc:creator>korabelnikov</dc:creator>
      <dc:date>2020-12-10T12:04:02Z</dc:date>
    </item>
  </channel>
</rss>

