Question about imx8m plus EVB


4,683 Views
ryanzj_huang
Contributor I


1. How do we run inference with our own models? Does the inference engine only accept TFLite, ArmNN, and OpenCV? If so, should we convert our models to one of these formats? I would also like to know what specs or format an inference model must meet to be accepted by the inference engine on NXP hardware.

2. How do we know whether the latest PyeIQ version is running on the NPU or the CPU? When I check with the top command, it shows only CPU load, at above 150%.

Thanks

Labels (1)
0 Kudos
11 Replies

3,488 Views
kenizgandhi
Contributor III

Hello @manish_bajaj @Alifer_Moraes @ryanzj_huang,

I want to run some object and scene detection applications on the i.MX8MPEVK board. I have already booted Linux on the board from an SD card.

Linux version 5.10.9-1.0.0+g32513c25d8c7 (oe-user@oe-host) (aarc1
Machine model: NXP i.MX8MPlus EVK board

I have connected a MIPI CSI-2 camera to the board and am able to capture video from the MIPI OV5640 camera and display it on an HDMI monitor. I have already installed the eIQ package successfully using pip3 install eiq.

On the host side, I am using Linux in a virtual machine.

1. What are the further steps needed to perform AI/ML tasks?

2. Do I need to boot Linux again using a Yocto build?

3. Do I need to use the MCUXpresso IDE/SDK? If yes, how?

I am not able to open the link https://pyeiq.dev. Can you please check whether the link works for you?

I have no idea how to proceed further. 

Your inputs will be highly appreciated. 

Thank You. 

Kind Regards,

Keniz

0 Kudos

3,471 Views
kenizgandhi
Contributor III

SOLVED: it was simple: pip3 install packagename

Hello @manish_bajaj , 

I have successfully installed PyeIQ using pip3 install pyeiq. But when I run pyeiq, I get the following error:

Traceback (most recent call last):
  File "/usr/bin/pyeiq", line 12, in <module>
    from eiq.apps.pyeiq_launcher.config import APPS, DEMOS
  File "/usr/lib/python3.8/site-packages/eiq/apps/pyeiq_launcher/config.py", line 4, in <module>
    from eiq.apps.switch_image.switch_image import eIQSwitchLabelImage
  File "/usr/lib/python3.8/site-packages/eiq/apps/switch_image/switch_image.py", line 16, in <module>
    import cv2
ModuleNotFoundError: No module named 'cv2'

Can you please tell me the reason behind this error?

Thank You.

Kind Regards,

Keniz

0 Kudos

3,447 Views
manish_bajaj
NXP Employee

@kenizgandhi,

What image are you using, and what is the BSP version? Make sure you are using the correct version.

 

-Manish

0 Kudos

3,437 Views
kenizgandhi
Contributor III

Hello  @manish_bajaj , 

I solved the above error. Below is the next error I'm currently facing:

Linux version 5.10.9-1.0.0+g32513c25d8c7 (oe-user@oe-host) (aarc1
Machine model: NXP i.MX8MPlus EVK board

I have connected the MIPI CSI-2 camera to the board and am able to capture video from the MIPI OV5640 camera using gst-launch-1.0 v4l2src device=/dev/video1 ! autovideosink and display it on an HDMI monitor.

But when I try to run the object detection demo application using pyeiq --run object_detection_tflite --video_src=/dev/video1, I get the following error:

[16732.920316] mxc-mipi-csi2.0: unsupported csi-sam command -1068476902.
Using /dev/video1 as video device
Your video device could not be initialized. Exiting...
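(As a quick sanity check before pointing the demo at a device node, one can verify that the node actually exists and is a character device, as V4L2 nodes are. A minimal stdlib sketch; /dev/video1 is the path assumed above:)

```python
import os
import stat

def probe_video_device(path):
    """Return True if path exists and is a character device (as V4L2 nodes are)."""
    try:
        mode = os.stat(path).st_mode
    except OSError:
        return False
    return stat.S_ISCHR(mode)

print(probe_video_device("/dev/video1"))
```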

Can you please help me solve this error?

Thank You. 

Kind Regards,

Keniz

0 Kudos

3,403 Views
manish_bajaj
NXP Employee

@kenizgandhi,

We will need more details to understand the issue. Please capture all of the relevant output so we can investigate.

Below are the steps we followed.

1. Download the "Linux 5.10.9_1.0.0" for the i.MX 8M Plus EVK from https://www.nxp.com/design/software/embedded-software/i-mx-software/embedded-linux-for-i-mx-applicat...
2. Flash the "imx-image-full-imx8mpevk.wic" to an SD card
3. Insert the SD card, connect the Ethernet port to the internet, insert the OV5640 camera into the CSI1 MIPI slot, and power on the board (the CSI2 MIPI slot will not work)
4. Login and type "pip3 install pyeiq"
5. Type "pyeiq --run object_detection_tflite --video_src=/dev/video1"
6. Demo plays on screen
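The flashing and launch steps above can be sketched as shell commands; /dev/sdX is a placeholder for the SD card device, so double-check it before running dd:

```shell
# Step 2: flash the BSP image to the SD card (destructive; verify /dev/sdX first)
# sudo dd if=imx-image-full-imx8mpevk.wic of=/dev/sdX bs=1M conv=fsync status=progress

# Step 4 (on the booted board): install PyeIQ
# pip3 install pyeiq

# Step 5: build the demo invocation for a given camera node
demo_cmd() {
    printf 'pyeiq --run object_detection_tflite --video_src=%s\n' "$1"
}
demo_cmd /dev/video1
```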
 
I would suggest flashing a new image to the SD card as captured above and following the steps mentioned.
 
-Manish

4,575 Views
ryanzj_huang
Contributor I

Hi, I am using a TensorFlow model. My understanding is that we convert the TF model to TFLite and then quantize it, so that we can run inference with other models on the NPU. The problem is that I cannot find proper documentation for this. I did find some material online:

1. tf to tflite (https://www.youtube.com/watch?v=cWrb3qIFlCQ&t=5s)

2. quantizing the tf lite model (https://www.youtube.com/watch?v=4iq-d2AmfRU)
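The two steps above (TFLite conversion, then quantization) can be done in one pass with the TFLite converter's post-training quantization. A minimal sketch, assuming TensorFlow 2.x; the tiny Keras model and the random calibration generator are placeholders for your real model and data:

```python
import numpy as np
import tensorflow as tf

# Placeholder model; substitute your real TensorFlow/Keras model here
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(4),
])

def representative_data():
    # Calibration samples matching the model input; use real data in practice
    for _ in range(10):
        yield [np.random.rand(1, 8).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
# Force full-integer quantization, which the NPU expects
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

tflite_model = converter.convert()
with open("model_quant.tflite", "wb") as f:
    f.write(tflite_model)
```

The resulting model_quant.tflite is the file you would then point the runtime or demo at.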

After performing these steps, I cannot proceed further, as I don't know where to place the converted model.

It would be helpful if you could confirm the steps and point us to proper documentation on running inference with models other than the ones available in the demos.

Also, as I understand it, eIQ is the older version and PyeIQ is the newer one, but the documentation I followed is for eIQ; will that make a difference?

Thank you

0 Kudos

4,341 Views
manish_bajaj
NXP Employee
NXP Employee

@ryanzj_huang ,

 

Please open a ticket on the private community.

-Manish

0 Kudos

4,602 Views
Alifer_Moraes
NXP Employee

Hello Ryan,

Most of the models we are using are TFLite models, which are also supported by ArmNN. The main recommendation is that your model be quantized; that way you will get the best performance from the NPU.

Most of the PyeIQ demos run inference on the NPU, except object_detection_dnn and face_and_eyes_detection. You can verify they are running on the NPU by looking for the following message:

INFO: Created TensorFlow Lite delegate for NNAPI. Applied NNAPI delegate.

The CPU load you are seeing comes from other tasks in the demos, such as loading images, resizing, and so on.
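When capturing demo output programmatically, one can check for that line; a minimal sketch, with the log text taken from the runtime message quoted above:

```python
def uses_npu(log_text):
    # The i.MX TFLite runtime prints this line when the NNAPI delegate
    # (the NPU path) has been applied to the model graph
    return "Created TensorFlow Lite delegate for NNAPI" in log_text

print(uses_npu("INFO: Created TensorFlow Lite delegate for NNAPI. Applied NNAPI delegate."))
```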

Regards,

Alifer

0 Kudos

4,602 Views
manish_bajaj
NXP Employee

ryanzj.huang@adlinktech.com‌,

Can you clarify what you mean by your own models? What framework are you using? Please share more detail about your models. We currently support the TFLite and ArmNN frameworks, which can run on the NPU. The NPU expects a quantized model.

-Manish

0 Kudos

4,602 Views
manish_bajaj
NXP Employee

ryanzj.huang@adlinktech.com,

Can you share details about your company and project? For i.MX 8M Plus questions, I would suggest discussing with your assigned support contact.

-Manish

0 Kudos