This thread is about running the i.MX8M in dual-display mode with two 1080p video outputs. I have experienced screen tearing on the eLCDIF-driven display when putting the system under a reasonable load, and it seems this could be a hardware limitation. NXP BSP commit logs suggest that in dual-display mode the eLCDIF must be limited to 1280x720p60, and I would like to confirm whether that is the case.
1. Background: the two display controllers
The i.MX8M has two display controllers: the eLCDIF and the DCSS. Section 18.104.22.168 of the reference manual (IMX8MDQLQRM) states that both the eLCDIF and the DCSS support resolutions of at least 1920x1080p60. The DCSS can drive both MIPI-DSI and HDMI outputs, whereas the eLCDIF can only drive MIPI-DSI. Figure 13-1 from the reference manual gives a good picture of this:
A typical application is to use an ADV7535 MIPI-to-HDMI bridge - as in the IMX-MIPI-HDMI adapter for the i.MX8MQ EVK - to achieve two HDMI outputs. I have been testing exactly this setup and encountered issues which suggest that the eLCDIF may not be able to transmit at 1920x1080p60 while the DCSS is also in use.
2. Default NXP BSP caps the eLCDIF resolution to 720p in dual-display mode
The 4.14.98 BSP from NXP offers a device tree configuration for dual-display mode in arch/arm64/boot/dts/freescale/fsl-imx8mq-evk-dual-display.dts which has the following limit set for the eLCDIF:
status = "okay";
max-res = <1280>, <720>;
Looking in the commit history, I find an explanation in commit "MLK-18877-3: arm64: dts: imx8mq: Refactor dts files":
2. Removed the max-res binding from the main dtsi file. This is a limitation and should NOT be present in the main definition of lcdif. Use this limitation only in specific dts files which requires such limitations. For example in the dual-display.dts file, where the lcdif is used along with the dcss; lcdif has to be limited because of the low bandwidth when used in the same time with dcss.
Despite this warning - which I cannot find anywhere in NXP's documentation - it is still possible to increase the max-res value to 1920x1080, and on the NXP image Weston will then run at that resolution.
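For reference, this is the change I am describing, shown here as a sketch of the lcdif node in the dual-display dts (the `&lcdif` label comes from the BSP's fsl-imx8mq.dtsi; only the max-res line differs from the shipped file):

```dts
&lcdif {
	status = "okay";
	/* BSP default in fsl-imx8mq-evk-dual-display.dts is <1280>, <720>;
	 * raising it to 1080p is accepted, but goes against the bandwidth
	 * warning in commit MLK-18877-3. */
	max-res = <1920>, <1080>;
};
```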
3. Screen tearing on eLCDIF-driven display with a custom Linux image
While running a custom Linux image on the i.MX8M EVK with the modified dual-display device tree (max-res = 1080p) and the hardware setup described above, I encounter serious screen tearing/shuddering artifacts on the eLCDIF-driven display when running typical GStreamer pipelines which exercise the VPU and display video on the DCSS-driven display. Since these pipelines are memory-intensive, this suggests that I may be hitting the low-bandwidth problem referred to in the commit message above.
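To put rough numbers on the bandwidth argument, here is a back-of-the-envelope calculation of the sustained scan-out traffic per display, assuming 32-bit (4-byte) framebuffer pixels and ignoring VPU decode, GPU composition, and blanking overhead; these figures are illustrative and not taken from NXP documentation:

```python
def scanout_mb_per_s(width, height, bytes_per_pixel=4, fps=60):
    """Sustained DRAM read bandwidth needed just to scan out one display."""
    return width * height * bytes_per_pixel * fps / 1e6

dcss_1080p = scanout_mb_per_s(1920, 1080)   # ~497.7 MB/s
lcdif_1080p = scanout_mb_per_s(1920, 1080)  # ~497.7 MB/s
lcdif_720p = scanout_mb_per_s(1280, 720)    # ~221.2 MB/s

print(f"dual 1080p : {dcss_1080p + lcdif_1080p:.0f} MB/s")  # ~995 MB/s
print(f"1080p+720p : {dcss_1080p + lcdif_720p:.0f} MB/s")   # ~719 MB/s
```

Even before adding the VPU and GPU traffic that my pipelines generate, capping the eLCDIF at 720p removes roughly 276 MB/s of sustained scan-out load, which would be consistent with the tearing disappearing at that resolution.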
If I use max-res=720p (as is default in the BSP), I experience no screen tearing or artifacts whatsoever.
I should note that I was not able to reproduce the screen tearing issues on the NXP image, but I still hope to get some clarification on the hardware limitations.
Given the above:
- I would like to know more about the apparent limitation of the eLCDIF when it is used together with the DCSS. Can anybody provide more detail on this matter?
- I am investigating the suitability of the i.MX8M as a platform, and it is important that the eLCDIF can support 1920x1080p60 alongside the DCSS. Can NXP or somebody else confirm whether there is a way to achieve this, or will there always be a potential bandwidth issue? If the issue can be mitigated, how? Or should I conclude that the eLCDIF is limited to 1280x720p60 when both display controllers are in use?
Please let me know if anything was unclear.