<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: imx8: running pyeiq facial expression detection on 5.4.24 bsp with video source throwing error in eIQ Machine Learning Software</title>
    <link>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/imx8-running-pyeiq-facial-expression-detection-on-5-4-24-bsp/m-p/1066327#M186</link>
    <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Hi&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Thanks!&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;It worked after adding that flag to the command. Just for my information, what exactly does this flag do here?&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
    <pubDate>Wed, 05 Aug 2020 19:31:53 GMT</pubDate>
    <dc:creator>vsuneja63</dc:creator>
    <dc:date>2020-08-05T19:31:53Z</dc:date>
    <item>
      <title>imx8: running pyeiq facial expression detection on 5.4.24 bsp with video source throwing error</title>
      <link>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/imx8-running-pyeiq-facial-expression-detection-on-5-4-24-bsp/m-p/1066323#M182</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;I can run facial expression detection on the i.MX 8MQ EVK (5.4.24 BSP) with an input image. When I try the same demo with a video source, it throws an error. Here are the logs:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;INFO: Created TensorFlow Lite delegate for NNAPI.&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;Applied NNAPI delegate.&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;Using /dev/video2 as video device&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;Resolution not supported. Using 640x480 instead.&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;[ WARN:0] global /usr/src/debug/opencv/4.2.0.imx-r0/git/modules/videoio/src/cap_gstreamer.cpp (713) open OpenCV | GStreamer warning: Error opening bin: no element "imxvideoconvert_g2d"&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;[ WARN:0] global /usr/src/debug/opencv/4.2.0.imx-r0/git/modules/videoio/src/cap_gstreamer.cpp (480) isPipelinePlaying OpenCV | GStreamer warning: GStreamer: pipeline have not been created&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;Your video device could not be initialized. Exiting...&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;As far as I understand, the i.MX 8MQ EVK doesn't support g2d. So what can be done here to make it work?&lt;/P&gt;&lt;P&gt;Looking forward to hearing from you soon.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Wed, 05 Aug 2020 18:07:04 GMT</pubDate>
      <guid>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/imx8-running-pyeiq-facial-expression-detection-on-5-4-24-bsp/m-p/1066323#M182</guid>
      <dc:creator>vsuneja63</dc:creator>
      <dc:date>2020-08-05T18:07:04Z</dc:date>
    </item>
    <item>
      <title>Re: imx8: running pyeiq facial expression detection on 5.4.24 bsp with video source throwing error</title>
      <link>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/imx8-running-pyeiq-facial-expression-detection-on-5-4-24-bsp/m-p/1066324#M183</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Hello, &lt;A class="jx-jive-macro-user" href="https://community.nxp.com/people/vsuneja63@gmail.com"&gt;vsuneja63@gmail.com&lt;/A&gt;‌,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;PyeIQ's demos use v4l2 pipelines by default to stream the video captured from your camera, and those pipelines use imxvideoconvert_g2d for video conversion. Did you specify any --video_fwk? Could you please try --video_fwk=opencv and see if it works?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Alifer&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Wed, 05 Aug 2020 18:38:30 GMT</pubDate>
      <guid>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/imx8-running-pyeiq-facial-expression-detection-on-5-4-24-bsp/m-p/1066324#M183</guid>
      <dc:creator>Alifer_Moraes</dc:creator>
      <dc:date>2020-08-05T18:38:30Z</dc:date>
    </item>
    <item>
      <title>Re: imx8: running pyeiq facial expression detection on 5.4.24 bsp with video source throwing error</title>
      <link>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/imx8-running-pyeiq-facial-expression-detection-on-5-4-24-bsp/m-p/1066325#M184</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Hi&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Thanks!&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I tested with the command "&lt;CODE&gt;pyeiq --run face_and_eyes_detection --video_src=/path_to_the_video&lt;/CODE&gt;". I didn't specify "--video_fwk=opencv" there; I will test with it and update here. Is g2d used in the codebase? As far as I understand, g2d is not available on the i.MX 8MQ EVK.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Wed, 05 Aug 2020 18:52:52 GMT</pubDate>
      <guid>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/imx8-running-pyeiq-facial-expression-detection-on-5-4-24-bsp/m-p/1066325#M184</guid>
      <dc:creator>vsuneja63</dc:creator>
      <dc:date>2020-08-05T18:52:52Z</dc:date>
    </item>
    <item>
      <title>Re: imx8: running pyeiq facial expression detection on 5.4.24 bsp with video source throwing error</title>
      <link>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/imx8-running-pyeiq-facial-expression-detection-on-5-4-24-bsp/m-p/1066326#M185</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Yes, we use g2d in the code of our v4l2 pipelines, since we developed PyeIQ&amp;nbsp;for the i.MX 8MP and i.MX 8 QM. To run the demos on the i.MX 8MQ, try --video_fwk=opencv.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Alifer&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Wed, 05 Aug 2020 19:06:44 GMT</pubDate>
      <guid>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/imx8-running-pyeiq-facial-expression-detection-on-5-4-24-bsp/m-p/1066326#M185</guid>
      <dc:creator>Alifer_Moraes</dc:creator>
      <dc:date>2020-08-05T19:06:44Z</dc:date>
    </item>
    <item>
      <title>Re: imx8: running pyeiq facial expression detection on 5.4.24 bsp with video source throwing error</title>
      <link>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/imx8-running-pyeiq-facial-expression-detection-on-5-4-24-bsp/m-p/1066327#M186</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Hi&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Thanks!&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;It worked after adding that flag to the command. Just for my information, what exactly does this flag do here?&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Wed, 05 Aug 2020 19:31:53 GMT</pubDate>
      <guid>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/imx8-running-pyeiq-facial-expression-detection-on-5-4-24-bsp/m-p/1066327#M186</guid>
      <dc:creator>vsuneja63</dc:creator>
      <dc:date>2020-08-05T19:31:53Z</dc:date>
    </item>
    <item>
      <title>Re: imx8: running pyeiq facial expression detection on 5.4.24 bsp with video source throwing error</title>
      <link>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/imx8-running-pyeiq-facial-expression-detection-on-5-4-24-bsp/m-p/1066328#M187</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;The --video_fwk flag chooses which video framework is used to stream video; the available options are v4l2 (default), opencv, and gstreamer (experimental). Both v4l2 and gstreamer run with hardcoded pipelines that use g2d, while opencv is a simple default pipeline with minimal configuration that runs in most environments.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Alifer&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Wed, 05 Aug 2020 19:50:52 GMT</pubDate>
      <guid>https://community.nxp.com/t5/eIQ-Machine-Learning-Software/imx8-running-pyeiq-facial-expression-detection-on-5-4-24-bsp/m-p/1066328#M187</guid>
      <dc:creator>Alifer_Moraes</dc:creator>
      <dc:date>2020-08-05T19:50:52Z</dc:date>
    </item>
  </channel>
</rss>