I'm currently building ocr-image-dev, and from some of the reading it seemed that nnstreamer and tensorflow-lite are in image-full-dev, so I tried to build that, but it failed. I vaguely remember that you can have multiple build types and they will all end up in deploy/images/v5-imx8mpevk/, but that was back on dora, I believe.
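In case it helps frame the question: my understanding is that in the NXP BSP the ML packages get pulled into the bigger image targets, and that building several image targets one after another (e.g. `bitbake ocr-image-dev` then `bitbake imx-image-full`) just accumulates artifacts in the same deploy/images/&lt;machine&gt;/ directory. Alternatively, rather than switching images, something like the following local.conf fragment might pull the ML bits into our existing image — the recipe names here are my assumption from the docs, so please correct me if they're wrong:

```
# local.conf sketch -- recipe names (tensorflow-lite, nnstreamer) are my
# guess from the NXP ML docs; adjust to whatever your meta-ml layer provides.
# On lf-5.10.y (pre-kirkstone) the override syntax uses "_append":
IMAGE_INSTALL_append = " tensorflow-lite nnstreamer"
# On kirkstone and later the same line would be:
# IMAGE_INSTALL:append = " tensorflow-lite nnstreamer"
```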
We're currently on lf-5.10.y but would ultimately like to get to kirkstone on 5.15. I have a meeting in the morning at 9:00 am PST, and I know we have some type of support contract, so I can get a support number if you need it. I spent all day trying to figure this out, to no avail.
The other info I can give you: I saw on page 5 of the machine learning document for the neural network that the imx8mp is the only one that supports the NPU, but I think we only need GPU support, which all of the imx8 processors seem to have. Good to know we have that option if we need to pursue it.

I also tried to clone onto the EVK, since I have a network connection, but I ran out of space unzipping the package. Odd that you guys provide .zip files; on Linux we usually see tarballs. One page had some info on building a cmake file for https://github.com/tensorflow/tensorflow.git, in hopes that the shared library is all I need to link with our app to add nnstreamer and tensorflow-lite.

Then I tried formatting a USB thumb drive as ext4, and it did mount on the EVK, but for some reason what was 32 GB on Ubuntu 20.04 showed up as only 1.2 GB when mounted on the EVK. Not sure what was up there; I went back and forth a few times... 32 GB on Ubuntu, 1.2 GB running on the EVK... :-/
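For reference, here's roughly what I had in mind for the cmake-based TensorFlow Lite build, based on the upstream "Build TensorFlow Lite with CMake" docs — the project name, source layout, and `main.cc` are placeholders for our app, so treat this as a sketch of the approach rather than what I actually have working:

```cmake
# CMakeLists.txt sketch -- assumes the tensorflow repo
# (https://github.com/tensorflow/tensorflow.git) is cloned next to this file.
cmake_minimum_required(VERSION 3.16)
project(ocr_app C CXX)  # "ocr_app" and main.cc are placeholders for our app

# Pull in the TFLite CMake build from the cloned tree; upstream docs say
# this exposes a "tensorflow-lite" library target we can link against.
add_subdirectory(tensorflow/tensorflow/lite tflite EXCLUDE_FROM_ALL)

add_executable(ocr_app main.cc)
target_link_libraries(ocr_app tensorflow-lite)
```

If linking against a library built this way is enough to get tensorflow-lite (and separately nnstreamer) into our app without rebuilding the whole image, that would simplify things a lot — is that a reasonable path?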