Hi NXP community.
My first post here...
I purchased the i.MX 8M evaluation kit to work on video streaming from IP cameras. I have successfully got the board to boot from a Linux BSP SD card, and I can SSH to the board and poke around. There was no GNU GCC compiler out of the box, so I spent a couple of days figuring out and setting up a cross-compiling environment on my Mac. It works by creating a Docker container from the Ubuntu 18.04 image with the GCC build tools. After getting the Docker image created and running, I followed the i.MX8 Hello World Linux post here as a guide: i.MX 8 Hello World Linux - ArmV7a 32-bit and ArmV8a 64-bit
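In case it helps anyone following the same path, here's a rough sketch of the container setup I ended up with. The package names and mount path are my own choices, not from the Hello World post, so adjust for your setup:

```shell
# Minimal Dockerfile for an aarch64 cross-compile environment on Ubuntu 18.04.
cat > Dockerfile <<'EOF'
FROM ubuntu:18.04
RUN apt-get update && apt-get install -y \
    gcc-aarch64-linux-gnu g++-aarch64-linux-gnu \
    make pkg-config && \
    rm -rf /var/lib/apt/lists/*
WORKDIR /work
EOF

# Build the image and drop into it with the source tree mounted:
#   docker build -t imx8-cross .
#   docker run --rm -it -v "$PWD":/work imx8-cross
# Inside the container, aarch64-linux-gnu-gcc is the cross compiler.
```

The bind mount keeps the sources on the Mac side, so only the compiler lives in the container.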
I am porting an ffmpeg- and OpenMAX-based video streaming platform to the i.MX8 and am at a crossroads.
I have the ffmpeg library compiled from source, and with a test program I can connect to an IP camera and get data from the camera to the i.MX8 evaluation board. Cool so far. I haven't tried any decompression or other steps yet.
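For what it's worth, before writing the C test program I found it handy to sanity-check the camera with ffprobe from the same ffmpeg build. The URL below is just a placeholder; substitute your camera's real RTSP endpoint:

```shell
# Probe an RTSP camera and print the video codec and resolution.
# CAM_URL is a placeholder address, not a real camera.
CAM_URL="${CAM_URL:-rtsp://192.168.1.64:554/stream1}"

probe_stream() {
    # -rtsp_transport tcp avoids UDP packet loss on flaky links.
    ffprobe -v error -rtsp_transport tcp \
        -select_streams v:0 \
        -show_entries stream=codec_name,width,height \
        -of default=noprint_wrappers=1 "$1"
}

# probe_stream "$CAM_URL"   # uncomment with a reachable camera on the LAN
```

If ffprobe can print the codec and resolution, the C side (avformat_open_input and friends) has a good chance of working too.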
Since any road I choose from here looks like a long one, can anyone recommend whether to keep OpenMAX, or suggest an alternative that fits the i.MX8 better? I've spent some time looking for OpenMAX downloads for the board, but finally took a step back and decided to ask the community.
The platform I'm porting from accepts streams from multiple cameras, decodes them, does some manipulation on the raw raster images, then re-encodes them and streams them back out to client viewers who periodically connect to the platform. Re-encoding is only done while client viewers are connected, and the viewers can select at will which camera to watch. That's a high-level description of my end goal; I hope it makes sense.
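To make that concrete, here's roughly what one camera's path looks like while a viewer is connected, expressed as a single ffmpeg command line. The URLs and the drawtext overlay are placeholders standing in for the real manipulation step:

```shell
# One camera's path: decode, stamp an overlay on the raw frames,
# re-encode, and publish back out over RTSP (placeholder URLs; the
# output side assumes an RTSP server is listening at OUT_URL).
IN_URL="rtsp://192.168.1.64:554/stream1"
OUT_URL="rtsp://localhost:8554/cam1"

restream() {
    ffmpeg -rtsp_transport tcp -i "$1" \
        -vf "drawtext=text='cam1':x=10:y=10:fontcolor=white" \
        -c:v libx264 -preset veryfast -tune zerolatency \
        -f rtsp "$2"
}

# restream "$IN_URL" "$OUT_URL"   # run only while a viewer is connected
```

The per-camera decode and re-encode is exactly where I'm hoping a hardware-accelerated path on the i.MX8 (OpenMAX or whatever you all recommend) plugs in instead of software libx264.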
I'd really appreciate any pointers to how-to docs, examples, or source downloads for whatever you all think will work for me.
- Doug Slattery
"Nothing is impossible if ImPossible"