<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: ONNX model slower on NPU than CPU in eIQ Machine Learning Software</title>
    <link>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/ONNX-model-slower-on-NPU-than-CPU/m-p/1196970#M332</link>
    <description>&lt;P&gt;&lt;a href="https://community.nxp.com/t5/user/viewprofilepage/user-id/157540"&gt;@rajaz&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;Please provide more detail about the project and the customer name.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;-Manish&lt;/P&gt;
</description>
    <pubDate>Wed, 09 Dec 2020 23:16:27 GMT</pubDate>
    <dc:creator>manish_bajaj</dc:creator>
    <dc:date>2020-12-09T23:16:27Z</dc:date>
    <item>
      <title>ONNX model slower on NPU than CPU</title>
      <link>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/ONNX-model-slower-on-NPU-than-CPU/m-p/1196957#M331</link>
      <description>&lt;P&gt;I have downloaded the yolov3 and yolov3-tiny models from the ONNX model zoo and am running them on an i.MX8M Plus devkit from Variscite. According to the&amp;nbsp;"&lt;SPAN class="fontstyle0"&gt;i.MX Machine Learning User's Guide, Rev. L5.4.47_2.2.0&lt;/SPAN&gt;", these models are tested.&lt;BR /&gt;&lt;BR /&gt;The issue is that running YOLOv3 or YOLOv3-Tiny on the CPU is in fact faster than running on the NPU. Can someone explain what I am doing wrong and why that is so? Thanks.&lt;BR /&gt;&lt;BR /&gt;&lt;STRONG&gt;Running on NPU&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;onnx_test_runner -j 1 -c 1 -r 1 -e vsi_npu ./tiny-yolov3/&lt;/LI-CODE&gt;&lt;LI-CODE lang="markup"&gt;2020-12-09 22:30:50.157550091 [E:onnxruntime:Default, runner.cc:217 operator()] Test tiny-yolov3 finished in 0.402 seconds, took 0.402 for each input&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Running on CPU&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;onnx_test_runner -j 1 -c 1 -r 1 -e cpu ./tiny-yolov3/&lt;/LI-CODE&gt;&lt;LI-CODE lang="markup"&gt;2020-12-09 22:32:08.197318388 [E:onnxruntime:Default, runner.cc:217 operator()] Test tiny-yolov3 finished in 0.367 seconds, took 0.367 for each input&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have noticed that running on the NPU shows a bunch of "unsupported node" messages:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;2020-12-09 22:30:49.538789386 [W:onnxruntime:Default, vsi_npu_execution_provider.cc:174 GetUnsupportedNodeIndices] unsupported node:Conv
2020-12-09 22:30:49.538949517 [W:onnxruntime:Default, vsi_npu_execution_provider.cc:174 GetUnsupportedNodeIndices] unsupported node:LeakyRelu
2020-12-09 22:30:49.538989393 [W:onnxruntime:Default, vsi_npu_execution_provider.cc:174 GetUnsupportedNodeIndices] unsupported node:MaxPool
2020-12-09 22:30:49.539026520 [W:onnxruntime:Default, vsi_npu_execution_provider.cc:174 GetUnsupportedNodeIndices] unsupported node:Conv
2020-12-09 22:30:49.539057396 [W:onnxruntime:Default, vsi_npu_execution_provider.cc:174 GetUnsupportedNodeIndices] unsupported node:LeakyRelu
2020-12-09 22:30:49.539090272 [W:onnxruntime:Default, vsi_npu_execution_provider.cc:174 GetUnsupportedNodeIndices] unsupported node:MaxPool
2020-12-09 22:30:49.539124273 [W:onnxruntime:Default, vsi_npu_execution_provider.cc:174 GetUnsupportedNodeIndices] unsupported node:Conv
2020-12-09 22:30:49.539154775 [W:onnxruntime:Default, vsi_npu_execution_provider.cc:174 GetUnsupportedNodeIndices] unsupported node:LeakyRelu
2020-12-09 22:30:49.539187026 [W:onnxruntime:Default, vsi_npu_execution_provider.cc:174 GetUnsupportedNodeIndices] unsupported node:MaxPool
2020-12-09 22:30:49.539221027 [W:onnxruntime:Default, vsi_npu_execution_provider.cc:174 GetUnsupportedNodeIndices] unsupported node:Conv
2020-12-09 22:30:49.539250903 [W:onnxruntime:Default, vsi_npu_execution_provider.cc:174 GetUnsupportedNodeIndices] unsupported node:LeakyRelu
2020-12-09 22:30:49.539282404 [W:onnxruntime:Default, vsi_npu_execution_provider.cc:174 GetUnsupportedNodeIndices] unsupported node:MaxPool
2020-12-09 22:30:49.539315906 [W:onnxruntime:Default, vsi_npu_execution_provider.cc:174 GetUnsupportedNodeIndices] unsupported node:Conv
2020-12-09 22:30:49.539345907 [W:onnxruntime:Default, vsi_npu_execution_provider.cc:174 GetUnsupportedNodeIndices] unsupported node:LeakyRelu
2020-12-09 22:30:49.539377283 [W:onnxruntime:Default, vsi_npu_execution_provider.cc:174 GetUnsupportedNodeIndices] unsupported node:MaxPool
2020-12-09 22:30:49.539410534 [W:onnxruntime:Default, vsi_npu_execution_provider.cc:174 GetUnsupportedNodeIndices] unsupported node:Conv
2020-12-09 22:30:49.539440160 [W:onnxruntime:Default, vsi_npu_execution_provider.cc:174 GetUnsupportedNodeIndices] unsupported node:LeakyRelu
2020-12-09 22:30:49.539472037 [W:onnxruntime:Default, vsi_npu_execution_provider.cc:174 GetUnsupportedNodeIndices] unsupported node:MaxPool&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;If I try to run using armnn as the execution provider, I get an error.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;onnx_test_runner -j 1 -c 1 -r 1 -e armnn ./tiny-yolov3/&lt;/LI-CODE&gt;&lt;LI-CODE lang="markup"&gt;2020-12-09 22:42:08.375386673 [E:onnxruntime:Default, runner.cc:224 RunTests] Test tiny-yolov3 failed:Node:PermuteNCHW_51 Output:input_50 [ShapeInferenceError] Can't merge shape info. Both source and target dimension have values but they differ. Source=256 Target=128 Dimension=1
result:
        Models: 1
        Total test cases: 1
                Succeeded: 0
                Not implemented: 0
                Failed: 1
                        Got exception while running: 1
        Stats by Operator type:
                Not implemented(0):
                Failed:
Failed Test Cases:tiny-yolov3 of unknown version
test tiny-yolov3 failed, please fix it&lt;/LI-CODE&gt;</description>
      <pubDate>Wed, 09 Dec 2020 22:47:22 GMT</pubDate>
      <guid>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/ONNX-model-slower-on-NPU-than-CPU/m-p/1196957#M331</guid>
      <dc:creator>rajaz</dc:creator>
      <dc:date>2020-12-09T22:47:22Z</dc:date>
    </item>
    <item>
      <title>Re: ONNX model slower on NPU than CPU</title>
      <link>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/ONNX-model-slower-on-NPU-than-CPU/m-p/1196970#M332</link>
      <description>&lt;P&gt;&lt;a href="https://community.nxp.com/t5/user/viewprofilepage/user-id/157540"&gt;@rajaz&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;Please provide more detail about the project and the customer name.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;-Manish&lt;/P&gt;
</description>
      <pubDate>Wed, 09 Dec 2020 23:16:27 GMT</pubDate>
      <guid>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/ONNX-model-slower-on-NPU-than-CPU/m-p/1196970#M332</guid>
      <dc:creator>manish_bajaj</dc:creator>
      <dc:date>2020-12-09T23:16:27Z</dc:date>
    </item>
    <item>
      <title>Re: ONNX model slower on NPU than CPU</title>
      <link>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/ONNX-model-slower-on-NPU-than-CPU/m-p/1197517#M334</link>
      <description>&lt;P&gt;&lt;a href="https://community.nxp.com/t5/user/viewprofilepage/user-id/105447"&gt;@manish_bajaj&lt;/a&gt;&amp;nbsp;I created a project in NXP support. It says the Project Pattern is "&lt;SPAN&gt;39735651&lt;/SPAN&gt;" and Project Name "&lt;SPAN&gt;Hands Down&lt;/SPAN&gt;". The project creation wizard didn't ask me to enter any customer name.&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 10 Dec 2020 16:30:37 GMT</pubDate>
      <guid>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/ONNX-model-slower-on-NPU-than-CPU/m-p/1197517#M334</guid>
      <dc:creator>rajaz</dc:creator>
      <dc:date>2020-12-10T16:30:37Z</dc:date>
    </item>
    <item>
      <title>Re: ONNX model slower on NPU than CPU</title>
      <link>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/ONNX-model-slower-on-NPU-than-CPU/m-p/1197559#M335</link>
      <description>&lt;P&gt;&lt;a href="https://community.nxp.com/t5/user/viewprofilepage/user-id/157540"&gt;@rajaz&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;
&lt;P&gt;Can you share more detail, such as your company name?&lt;/P&gt;
&lt;P&gt;This community might not be the correct place for your question to be answered.&lt;/P&gt;
&lt;P&gt;-Manish&lt;/P&gt;</description>
      <pubDate>Thu, 10 Dec 2020 18:39:27 GMT</pubDate>
      <guid>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/ONNX-model-slower-on-NPU-than-CPU/m-p/1197559#M335</guid>
      <dc:creator>manish_bajaj</dc:creator>
      <dc:date>2020-12-10T18:39:27Z</dc:date>
    </item>
  </channel>
</rss>