Can I install tensorflow for python3 on the imx8MM EVK board?

8,052 Views
hiectai
Contributor III

Hello community.
I need to install tensorflow for python3 to run on the imx8MM EVK board.
After following the NXP eIQ™ Machine Learning user guide at https://www.nxp.com/docs/en/nxp/user-guides/UM11226.pdf, I installed tensorflow, ran the benchmark, and built the example from sources successfully.

But "python3 import tensorflow as tf" had the error:
root@imx8mmevk:~/bazel# python3
Python 3.5.5 (default, Nov 6 2019, 02:53:57)
[GCC 7.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import tensorflow
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named 'tensorflow'

I tried "pip3 install tensorflow" but failed.
root@imx8mmevk:~/bazel# pip3 install tensorflow
ERROR: Could not find a version that satisfies the requirement tensorflow (from versions: none)
ERROR: No matching distribution found for tensorflow

I also tried to build tensorflow with Bazel as shown in the guide below, but it failed while building Bazel itself.
https://github.com/samjabrahams/tensorflow-on-raspberry-pi/blob/master/GUIDE.md

ERROR: Could not build Bazel

Is there any way to get tensorflow working with python3 on this board?
Thanks so much.

10 Replies

6,144 Views
raluca_popa
NXP Employee

Hi @khoefle ,

The code comments are the only documentation available for eIQObjectDetection.

As you mentioned, TFLite does not support non-max suppression. Those demos are examples of how to interpret the detection output using just TFLite; refer to the usage of eiq.engines.tflite.inference.TFLiteInterpreter.
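
Since the TFLite Python API does not include non-max suppression, one common workaround is to implement it in plain NumPy on top of the raw detection output. The following is only a minimal sketch (not code from the eIQ demos); the [ymin, xmin, ymax, xmax] box layout and the greedy IoU threshold are assumptions.

import numpy as np

def non_max_suppression(boxes, scores, iou_threshold=0.5):
    # Greedy NMS: keep the highest-scoring box, drop boxes that overlap it
    # above the IoU threshold, then repeat. Returns indices of kept boxes.
    order = np.argsort(scores)[::-1]
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        # Intersection of box i with all remaining boxes.
        top_left = np.maximum(boxes[i, :2], boxes[order[1:], :2])
        bottom_right = np.minimum(boxes[i, 2:], boxes[order[1:], 2:])
        wh = np.maximum(bottom_right - top_left, 0.0)
        inter = wh[:, 0] * wh[:, 1]
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        area_rest = ((boxes[order[1:], 2] - boxes[order[1:], 0]) *
                     (boxes[order[1:], 3] - boxes[order[1:], 1]))
        iou = inter / (area_i + area_rest - inter)
        order = order[1:][iou <= iou_threshold]
    return keep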

 

Regards,

Raluca


7,409 Views
raluca_popa
NXP Employee

Hi,

eIQ currently supports only the C++ API for TFLite. Support for building a custom C++ app with TensorFlow will be available in the next Yocto BSP release. There is currently no eIQ support for the Python API for either TensorFlow or TFLite.

Is Python TensorFlow a must on your side? For running inference, TFLite is better suited for embedded targets. You can cross-compile a custom C++ app with eIQ TFLite support using the Yocto toolchain.

If you need the TensorFlow Python API only to create or adjust a model, this can be done on a host PC with the plain TensorFlow Python API; eIQ is not needed for this step. It is recommended to use the same version that eIQ uses (1.12) to avoid compatibility issues when running inference.
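
A minimal sketch of that host-side step, assuming TensorFlow 1.12 is installed on the host PC; the tiny network and the random data below are placeholders for illustration only, not an eIQ example:

# Host PC only: create and save a model with plain TensorFlow 1.12.
import numpy as np
import tensorflow as tf  # e.g. pip3 install tensorflow==1.12.0 on the host

model = tf.keras.models.Sequential([
    tf.keras.layers.Dense(16, activation='relu', input_shape=(4,)),
    tf.keras.layers.Dense(3, activation='softmax'),
])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')

# Dummy training data just to exercise the workflow.
x = np.random.rand(100, 4).astype(np.float32)
y = np.random.randint(0, 3, size=(100,))
model.fit(x, y, epochs=1)

# Save the model; it can later be converted to TFLite for the board.
model.save('model.h5')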

Regards,

Raluca

7,409 Views
hiectai
Contributor III

Thanks for your help.
I have no knowledge of deep learning training or inference. My task in the project is to prepare a Python TensorFlow development environment for the imx8mm board.
Excuse me, but could you explain more about:
"If you need the tensorflow python API just to create or adjust a model, this can be done on a HOST PC, with pure tensorflow python API, no need for eIQ for this step. It is recommended to use the same version used in eIQ (1.12) to avoid any compatibility issues when running the inference. "
Does that mean I can train a model on a host PC (like Ubuntu) with Python TensorFlow 1.12 and then run inference with the trained model on the imx8mm EVK board? In that case, the inference app must be written in C++ using the TensorFlow library built from eIQ, right?
Regards


7,409 Views
raluca_popa
NXP Employee

Hi,

Exactly, training can be done on the host PC (e.g. Ubuntu) with Python TensorFlow 1.12.

Depending on what you want to achieve with your application, there is also the option to use a pre-trained model (TensorFlow Models; TensorFlow Lite models). Or you can start with a pre-trained model and use transfer learning to specialize it for your use case; check this community post: https://community.nxp.com/docs/DOC-343848

eIQ handles the inference. For embedded systems, it is recommended to convert the TensorFlow model to TFLite. There are several ways to do this:
- Quantization-aware training: https://github.com/tensorflow/tensorflow/tree/r1.12/tensorflow/contrib/quantize

- Convert to TFLite and quantize the model post-training; check this post: https://community.nxp.com/community/eiq/blog/2019/07/15/eiq-sample-apps-tflite-quantization

- Convert to TFLite post-training without quantization (set converter.post_training_quantize = False in the previous post); a minimal conversion sketch is shown after this list.
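
A minimal sketch of the post-training conversion path above, assuming TensorFlow 1.12 on the host PC and a Keras model saved as model.h5 (both are assumptions for illustration); in 1.12 the converter lives under tf.contrib.lite, while newer releases use tf.lite.TFLiteConverter:

import tensorflow as tf

# Convert a saved Keras model to TFLite (TensorFlow 1.12 API).
converter = tf.contrib.lite.TFLiteConverter.from_keras_model_file('model.h5')
converter.post_training_quantize = True  # set False for a plain float model

tflite_model = converter.convert()
with open('model.tflite', 'wb') as f:
    f.write(tflite_model)
# Copy model.tflite to the board and run it with the TFLite or Arm NN runtime.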

To run TFLite inference with eIQ on the board, there are currently two options:
- TFLite runtime. See reference manual section 7.2.2, "Building example from sources".
- Arm NN runtime. See reference manual section 8.2, "Using Arm NN in a custom C/C++ application".

This isn't a hard rule, but in most cases you will get the best performance with quantization-aware training and running inference with Arm NN.

Regards,

Raluca

6,271 Views
khoefle
Contributor II

Any updates on this? I need TensorFlow in Python not for my model, but for post-processing of an SSD output.

Using the board with a host PC is not an option, since it must run standalone. TFLite will also not take care of this.


6,249 Views
raluca_popa
NXP Employee

Hi @khoefle ,

The Yocto BSP provides support for the TFLite Python API; refer to Chapter 3, TensorFlow Lite: https://www.nxp.com/docs/en/user-guide/IMX-MACHINE-LEARNING-UG.pdf

Here you can find an example of post-processing for a detection model:

https://source.codeaurora.org/external/imxsupport/nxp-demo-experience-demos-list/tree/scripts/machin...
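
For illustration, a minimal sketch of driving a detection model through the TFLite Python API on the board; the model file name detect.tflite and the boxes/classes/scores output order are assumptions (typical for SSD-style models), so check them against your own model and the script linked above:

import numpy as np
from tflite_runtime.interpreter import Interpreter  # provided by the BSP image

interpreter = Interpreter(model_path='detect.tflite')
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
outs = interpreter.get_output_details()

# Dummy input with the model's expected shape and dtype (replace with a real frame).
frame = np.zeros(inp['shape'], dtype=inp['dtype'])
interpreter.set_tensor(inp['index'], frame)
interpreter.invoke()

boxes = interpreter.get_tensor(outs[0]['index'])[0]    # [N, 4] ymin, xmin, ymax, xmax
classes = interpreter.get_tensor(outs[1]['index'])[0]  # [N] class ids
scores = interpreter.get_tensor(outs[2]['index'])[0]   # [N] confidences

for box, cls, score in zip(boxes, classes, scores):
    if score > 0.5:
        print(int(cls), float(score), box)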

Could you clarify exactly why you need TensorFlow for post-processing? In general it is not mandatory; other libraries are used for this purpose.

Regards,

Raluca


6,190 Views
khoefle
Contributor II

Thank you very much, I will look into the provided link.

From a short look at the snippet, I saw that GStreamer is used for post-processing. Is this because video is streamed to a host PC, or is it possible to use this snippet without any host machine attached, so that the board works alone?

I need TensorFlow mainly because the non-max suppression function is not available in TFLite.

I am using the same post-processing functions as provided in the eIQ Portal plugin folder.


6,172 Views
raluca_popa
NXP Employee

Hi @khoefle ,

The board can work alone. Video is not streamed to a host PC; it is output directly on a display connected to the board. GStreamer is available in the Yocto image (note that imx-image-full contains all the eIQ-related software).

This set of demos might also be helpful as a starting point (note that the mid-term plan is to deprecate PyeIQ):

https://source.codeaurora.org/external/imxsupport/pyeiq/tree/eiq/modules/detection?h=v3.0.0

https://community.nxp.com/t5/Blogs/PyeIQ-3-0-0-Release-User-Guide/ba-p/1305998

Regards,

Raluca 

 

6,168 Views
khoefle
Contributor II

Hi @raluca_popa 

Thank you very much, this looks interesting!

Regarding the first link:

1) Is there any documentation on how to use the eIQObjectDetection class?
2) I do not see any usage of the non-max suppression method; is this encapsulated in the demo base class?

Thank you very much for your help!

Regards,

Kevin


7,409 Views
roxana_capitanu
NXP Employee

nxa06357, can you please help here?