We have encountered a strange problem on our customized board based on the i.MX6Q platform.
The LTIB package is L3.0.35_12.05.01.01_GA_source.tar.gz.
Steps to reproduce the problem are listed below:
1. Build the GPU driver (galcore.ko) as a loadable module.
2. Power on the board.
3. insmod the driver module (galcore.ko) into the kernel from a USB disk.
4. The driver probe process hangs forever.
After adding some debug output to the kernel source, we found that the probe process hangs in the _ResetGPU() function, which is located in
../drivers/mxc/gpu-viv/arch/XAQ2/hal/kernel/gc_hal_kernel_hardware.c
We then investigated why _ResetGPU() cannot complete: the GPU idle register and control register always read back 0x0. Below is the code.
for (;;)
    /* Disable clock gating. */
    gcmkONERROR(gckOS_WriteRegisterEx(Os,
                                      Core,
                                      Hardware->powerBaseAddress + 0x00104,
                                      0x00000000));
    ......
    gcmkONERROR(gckOS_ReadRegisterEx(Os,
                                     Core,
                                     0x00004,
                                     &idle));
    printk("@@GPU idle_reg: 0x%x\n", idle);
    gcmkONERROR(gckOS_ReadRegisterEx(Os,
                                     Core,
                                     0x00000,
                                     &control));
    printk("@@GPU control_reg: 0x%x\n", control);
    ......
You don't have to use the driver generated from the source code. Instead, use the driver built into the kernel. The only things you get from the GPU source package are the libraries... the .ko comes from the kernel.
The problem in my previous tests was that the GPU and VPU were never powered on. I mention the VPU because I was also using it, and the tests on the GPU gave me the same kind of problems.
Excellent!
Hi AndreSilva,
I am concerned about the issue reported in this link:
Low 2D graphics performance on KitKat and Lollipop
which may be the cause of my issues:
Android Lollipop MediaCodec not decoding raw H264 stream 1080P at 60 FPS
Slow H264 1080P@60fps Decoding on Android Lollipop 5.0.2 SabreSD
Can you please help me find the root cause?
Regards,
Gurtaj
OK, I made some new tests and now it's working. The problem was that I used the wrong combination of dtb, kernel, and distribution.
In the Freescale repository, three dtbs are provided (normal, ldo, and hdcp). When the GPU and VPU drivers are enabled in the kernel, it seems I need to use the ***-ldo.dtb; with that it works. I'm not completely sure of this, because the system locks up when I use the normal dtb with the GPU and VPU drivers enabled in the kernel (the board is connected via USB and I use a serial console for the interface; the system boots completely, but the keyboard stops responding).
But this is quite nice: with the official i.MX6 kernel and the right dtb, you can use any Linux distribution.
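For reference, the dtb is normally chosen from the U-Boot environment on these boards. A hedged sketch of what that looks like (the variable name `fdt_file` is what the default i.MX6 SabreSD boot scripts use; your board's boot script may differ):

```
=> setenv fdt_file imx6q-sabresd-ldo.dtb   # point the boot script at the LDO dtb
=> saveenv                                 # persist the environment
=> boot
```

If your boot script loads a hard-coded dtb path instead of reading `fdt_file`, adjust the script rather than the variable.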
Hi,
I've got the same problem with my imx6q-sabresd. It hangs at:
gcmkONERROR(gckOS_WriteRegisterEx(Os,Core,0x00000,0x00000900)); in file gpu-viv/arch/XAQ2/hal/kernel/gc_hal_kernel_hardware.c
I checked the .dtb file and tried adding status = "okay" to the GPU node, but without any good results. The GPU does not seem to be activated.
So, I've got a few questions:
Thanks in advance.
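For what it's worth, enabling the GPU nodes in a board .dts usually looks like the fragment below. The node labels `gpu_3d` and `gpu_2d` are an assumption based on common i.MX6 dtsi naming; check the actual labels in your kernel's imx6qdl.dtsi before copying this:

```
/* Board .dts overlay fragment (labels assumed, verify against imx6qdl.dtsi) */
&gpu_3d {
        status = "okay";
};

&gpu_2d {
        status = "okay";
};
```

Note that `status = "okay"` only makes the driver probe; if the regulators or clocks referenced by the node are missing or wrong (e.g. the non-ldo dtb mentioned earlier in this thread), the probe can still hang on register access.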
Hi Jerome,
Which BSP are you using? You can test whether the GPU is correctly installed by running the tiger demo app located in /opt/viv_samples.
I am not aware of the issue you are facing; by default the GPU driver is built into the kernel, not as a module.
regards,
Andre
I ran the tests with BSP 3.10.17-ga.
I'm not able to run the tiger test at the moment because I don't have an HDMI screen; I only work in console mode (via a USB connection and screen, or via ssh). But I expect the test will fail (I should be able to try it at the end of this week). Could you supply me with a .dtb for the sabresd with the GPU activated? I checked the code, and the register addresses for the GPU seem to be the problem: when the driver attempts to write to a register, the system locks up completely and I need to reset the board.
I took the sources from the official Freescale git (http://git.freescale.com/git/cgit.cgi/imx/linux-2.6-imx.git/), version 3.10.17ga, and compiled the .dtb from the .dts and .dtsi.
Are you using another kernel in your BSP? I mean, have you built the kernel from git and then used it with the BSP you downloaded from the website?
Among the different solutions I tried:
I tested with the GPU driver both built into the kernel and as a module.
Have you tested with our board with no changes? Did you change only the kernel for your customized board? Did you rebuild the GPU driver (you need the source code)? I have never worked on the dtb files the way you are doing.
regards,
Andre
Hi Andre, As it turns out, our SoloLites have the GPU disabled. Case closed! Thanks for your help. -John
Great news! Sorry for not asking that before, but since you were having that issue I assumed the GPU was already enabled =)
regards,
Andre
Issue fixed.
awesome =)
Andre,
I'm having an issue similar to what Jun Sun reported: during gpu_init() the CPU is stuck in an infinite loop in _ResetGPU().
I'm using an i.MX6SL with the jb4.3_1.1.0-ga version of the kernel. galcore is built into the kernel. I've verified that the LDOs are set up correctly.
Any help would be greatly appreciated.
-John
Hi John,
Can you build the kernel with the GPU driver configured as an external module and see if the issue still persists?
thanks,
Andre
Hi Andre, I'm trying that now. We've used the same kernel on our quad/solo designs without a problem -- the SoloLite is a bit different. One key difference between our board and the FSL reference designs is that we don't use a PMIC. Reading the GPU2D address space returns all zeroes, which could mean a lack of clock/power. I just received an EVK2 board, so I'll be looking at that as well. Thanks for your help. -John
Excellent, let me know.
regards,
Andre
I built galcore as kernel module and the galcore options are below:
Galcore version 4.6.9.9754
galcore options:
irqLine = -1
registerMemBase = 0x00000000
registerMemSize = 0x00004000
irqLine2D = 42
registerMemBase2D = 0x02200000
registerMemSize2D = 0x00004000
irqLineVG = 43
registerMemBaseVG = 0x02204000
registerMemSizeVG = 0x00004000
contiguousSize = 33554432
contiguousBase = 0x9E000000
bankSize = 0x00000000
fastClear = -1
compression = -1
signal = 48
baseAddress = 0x80000000
physSize = 0x00000000
logFileSize = 0 KB
powerManagement = 1
gpuProfiler = 0
coreClock = 156000000
gckGALDEVICE_Construct -- IrqLine: -1 RegisterMemBase: 0
gckGALDEVICE_Construct -- IrqLine2D: 42 RegisterMemBase: 2200000
gckGALDEVICE_Construct -- IrqLineVG: 43 RegisterMemBase: 2204000
gckGALDEVICE_Construct -- physical: 2200000 device->registerBases: E0878000
gckGALDEVICE_Construct -- physical: 2204000 device->registerBases: E0880000
I believe the options look okay for a SoloLite -- do they match what you've seen? The problem persists: any read from the GPU2D memory space returns zero. I've verified that the LDO values look okay. Where else do you recommend I look?
Thanks,
-John