<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic model inference in docker container in i.MX Processors</title>
    <link>https://community.nxp.com/t5/i-MX-Processors/model-inference-in-docker-container/m-p/2315983#M243964</link>
    <description>&lt;P&gt;Hi community,&lt;/P&gt;&lt;P&gt;# i.MX 8MP&lt;/P&gt;&lt;P&gt;@Harvey021&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;I have encountered an issue with model inference: it fails inside the Docker container but works on the host machine.&amp;nbsp;&lt;/SPAN&gt;The relevant shared libraries have already been copied into the container, and the program runs without reporting any missing dependencies.&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;When I ran the official verification program, it hit a segmentation fault.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;The command and its output are as follows:&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;root@imx8mpevk:~/examples# ./label_image --external_delegate_path=/usr/lib/libvx_delegate.so&lt;BR /&gt;INFO: Loaded model ./mobilenet_v1_1.0_224_quant.tflite&lt;BR /&gt;INFO: resolved reporter&lt;BR /&gt;INFO: Vx delegate: allowed_cache_mode set to 0.&lt;BR /&gt;INFO: Vx delegate: device num set to 0.&lt;BR /&gt;INFO: Vx delegate: allowed_builtin_code set to 0.&lt;BR /&gt;INFO: Vx delegate: error_during_init set to 0.&lt;BR /&gt;INFO: Vx delegate: error_during_prepare set to 0.&lt;BR /&gt;INFO: Vx delegate: error_during_invoke set to 0.&lt;BR /&gt;INFO: EXTERNAL delegate created.&lt;BR /&gt;INFO: Applied EXTERNAL delegate.&lt;BR /&gt;W [HandleLayoutInfer:332]Op 162: default layout inference pass.&lt;BR /&gt;Segmentation fault&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;gdb backtrace:&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="noway_0-1770780041856.png" style="width: 400px;"&gt;&lt;img src="https://community.nxp.com/t5/image/serverpage/image-id/376432i13E843EC2B7DCECA/image-size/medium?v=v2&amp;amp;px=400" role="button" title="noway_0-1770780041856.png" alt="noway_0-1770780041856.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;docker image: ubuntu24.04-arm64&lt;/P&gt;&lt;P&gt;docker command: docker run -it --rm --device=/dev/galcore --device=/dev/mxc_hantro --device=/dev/mxc_hantro_vc8000e --network=host --privileged=true ubuntu:dl_gdb bash&lt;/P&gt;&lt;P&gt;target image version: linux6.6.36 full&lt;/P&gt;&lt;P&gt;target machine: i.MX 8MP&lt;/P&gt;</description>
    <pubDate>Wed, 11 Feb 2026 03:21:13 GMT</pubDate>
    <dc:creator>noway</dc:creator>
    <dc:date>2026-02-11T03:21:13Z</dc:date>
    <item>
      <title>model inference in docker container</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/model-inference-in-docker-container/m-p/2315983#M243964</link>
      <description>&lt;P&gt;Hi community,&lt;/P&gt;&lt;P&gt;# i.MX 8MP&lt;/P&gt;&lt;P&gt;@Harvey021&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;I have encountered an issue with model inference: it fails inside the Docker container but works on the host machine.&amp;nbsp;&lt;/SPAN&gt;The relevant shared libraries have already been copied into the container, and the program runs without reporting any missing dependencies.&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;When I ran the official verification program, it hit a segmentation fault.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;The command and its output are as follows:&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;root@imx8mpevk:~/examples# ./label_image --external_delegate_path=/usr/lib/libvx_delegate.so&lt;BR /&gt;INFO: Loaded model ./mobilenet_v1_1.0_224_quant.tflite&lt;BR /&gt;INFO: resolved reporter&lt;BR /&gt;INFO: Vx delegate: allowed_cache_mode set to 0.&lt;BR /&gt;INFO: Vx delegate: device num set to 0.&lt;BR /&gt;INFO: Vx delegate: allowed_builtin_code set to 0.&lt;BR /&gt;INFO: Vx delegate: error_during_init set to 0.&lt;BR /&gt;INFO: Vx delegate: error_during_prepare set to 0.&lt;BR /&gt;INFO: Vx delegate: error_during_invoke set to 0.&lt;BR /&gt;INFO: EXTERNAL delegate created.&lt;BR /&gt;INFO: Applied EXTERNAL delegate.&lt;BR /&gt;W [HandleLayoutInfer:332]Op 162: default layout inference pass.&lt;BR /&gt;Segmentation fault&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;gdb backtrace:&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="noway_0-1770780041856.png" style="width: 400px;"&gt;&lt;img src="https://community.nxp.com/t5/image/serverpage/image-id/376432i13E843EC2B7DCECA/image-size/medium?v=v2&amp;amp;px=400" role="button" title="noway_0-1770780041856.png" alt="noway_0-1770780041856.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;docker image: ubuntu24.04-arm64&lt;/P&gt;&lt;P&gt;docker command: docker run -it --rm --device=/dev/galcore --device=/dev/mxc_hantro --device=/dev/mxc_hantro_vc8000e --network=host --privileged=true ubuntu:dl_gdb bash&lt;/P&gt;&lt;P&gt;target image version: linux6.6.36 full&lt;/P&gt;&lt;P&gt;target machine: i.MX 8MP&lt;/P&gt;</description>
      <pubDate>Wed, 11 Feb 2026 03:21:13 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/model-inference-in-docker-container/m-p/2315983#M243964</guid>
      <dc:creator>noway</dc:creator>
      <dc:date>2026-02-11T03:21:13Z</dc:date>
    </item>
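    <!-- Editor's note: a common cause of VX-delegate segfaults inside containers on i.MX is a mismatch between the galcore kernel driver on the host BSP and the Vivante/VX user-space libraries copied into the container. One way to rule this out is to bind-mount the host's libraries instead of copying Ubuntu-built ones. A minimal sketch; the image tag ubuntu:dl_gdb and the device nodes come from the post above, while the exact library list is an assumption and must be adjusted to what the host's libvx_delegate.so actually links against: -->

    ```shell
    # Sketch (assumptions noted above): run the container with the NPU device
    # node and the host BSP's VX user-space stack bind-mounted read-only, so
    # the libraries are guaranteed to match the galcore kernel driver.
    # First inspect the real dependency list on the host:
    #   ldd /usr/lib/libvx_delegate.so
    docker run -it --rm \
      --device=/dev/galcore \
      --network=host \
      -v /usr/lib/libvx_delegate.so:/usr/lib/libvx_delegate.so:ro \
      -v /usr/lib/libtim-vx.so:/usr/lib/libtim-vx.so:ro \
      -v /usr/lib/libGAL.so:/usr/lib/libGAL.so:ro \
      -v /usr/lib/libOpenVX.so.1:/usr/lib/libOpenVX.so.1:ro \
      ubuntu:dl_gdb bash
    ```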
    <item>
      <title>Re: model inference in docker container</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/model-inference-in-docker-container/m-p/2317272#M244006</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.nxp.com/t5/user/viewprofilepage/user-id/254955"&gt;@noway&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;I hope you are doing very well.&lt;/P&gt;
&lt;P&gt;Please take a look at chapters &lt;STRONG&gt;2.6.2 Building the TensorFlow Lite Library with the Flex Delegate for i.MX Linux platforms&lt;/STRONG&gt; and &lt;STRONG&gt;2.6.2.2 Setting up Docker VM&lt;/STRONG&gt; of the &lt;A href="https://www.nxp.com/docs/en/user-guide/IMX-MACHINE-LEARNING-UG.pdf" target="_self" rel="nofollow noopener noreferrer"&gt;i.MX Machine Learning User's Guide&lt;/A&gt;.&lt;/P&gt;
&lt;P&gt;I hope this helps.&lt;/P&gt;
&lt;P&gt;Best regards,&lt;/P&gt;
&lt;P&gt;Chavira&lt;/P&gt;</description>
      <pubDate>Thu, 12 Feb 2026 16:20:49 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/model-inference-in-docker-container/m-p/2317272#M244006</guid>
      <dc:creator>Chavira</dc:creator>
      <dc:date>2026-02-12T16:20:49Z</dc:date>
    </item>
    <item>
      <title>Re: model inference in docker container</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/model-inference-in-docker-container/m-p/2321592#M244153</link>
      <description>&lt;P&gt;&lt;SPAN&gt;Thank you for your answer, but I was already running Docker directly on the i.MX 8MP; my base image is arm64/v8 ubuntu:24.04.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 24 Feb 2026 02:18:27 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/model-inference-in-docker-container/m-p/2321592#M244153</guid>
      <dc:creator>noway</dc:creator>
      <dc:date>2026-02-24T02:18:27Z</dc:date>
    </item>
  </channel>
</rss>

