Hi @Sunil77,
In our BSP, TensorFlow Lite is included instead of full TensorFlow because the full library contains many packages that are unnecessary for deploying a machine learning model.
For embedded systems with constrained memory, you will need to develop your machine learning model on a host PC and then export it to a TensorFlow Lite model.
Finally, on the i.MX 8M Plus you can use the Python or C++ TFLite interpreter API to run inference.
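As a rough sketch of that workflow, the example below builds a trivial Keras model on the host, converts it with `TFLiteConverter`, and then runs it through the TFLite Python interpreter. The model and input data are placeholders, not part of any specific BSP; on the target you would typically load a saved `.tflite` file and may use the lighter `tflite_runtime` package instead of full TensorFlow:

```python
import numpy as np
import tensorflow as tf

# 1. Build (and normally train) a model on the host PC.
#    This untrained toy model is just a placeholder.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(2, activation="softmax"),
])

# 2. Export the model to a TensorFlow Lite flatbuffer.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# 3. Load the flatbuffer into the TFLite interpreter and run
#    inference, as you would on the i.MX 8M Plus.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

sample = np.random.rand(1, 4).astype(np.float32)
interpreter.set_tensor(inp["index"], sample)
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
print(result.shape)
```

On the device you would pass `model_path="model.tflite"` to the interpreter instead of `model_content`, keeping the rest of the inference code the same.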