Hi community,
I want to know how to change the memory allocation size for the i.MX6Q GPU.
In the Linux BSP (L3.0.35_4.1.0), it is defined in arch/arm/mach-mx6/board-mx6q_sabresd.c as follows, at line 1282, isn't it?
=====
static struct viv_gpu_platform_data imx6q_gpu_pdata __initdata = {
    .reserved_mem_size = SZ_128M,
};
=====
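In L3.0.35, increasing the reservation to, say, 256 MB would presumably be a change like the following (just a sketch; SZ_256M is an example value, and the reservation has to fit within the board's memory budget):
=====
/* Sketch: reserve 256 MB instead of 128 MB for the Vivante GPU driver. */
static struct viv_gpu_platform_data imx6q_gpu_pdata __initdata = {
    .reserved_mem_size = SZ_256M,
};
=====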
Then, how about in the Yocto BSP?
Which file defines the memory allocation size for the i.MX6Q GPU?
Best Regards,
Satoshi Shimoda
Hi there,
According to the i.MX6 documentation, the only way to define the amount of memory used by the GPU is via the kernel command line; there is no function in userspace to do that.
You can define the GPU memory on the kernel command line by setting the parameter vmalloc=XXXM, where XXX is the amount of memory you want.
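For example, the kernel command line could look like this (only vmalloc=400M is the relevant part here; the console and root parameters are board-specific placeholders, and 400M is just an example value):
=====
console=ttymxc0,115200 root=/dev/mmcblk0p2 rootwait rw vmalloc=400M
=====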
Hope it helps
Have a great day,
-----------------------------------------------------------------------------------------------------------------------
Note: If this post answers your question, please click the Correct Answer button. Thank you!
-----------------------------------------------------------------------------------------------------------------------
Hi Bio_TICFSL,
> According to the i.MX6 documentation,
Could you let me know which documentation you are referring to?
> setting the parameter vmalloc=XXXM, where XXX is the amount of memory you want.
Is vmalloc valid for L3.10.17_1.0.0-ga even though the Linux Release Notes (IMX6LXRNSSD, Rev. L3.10.17_1.0.0-ga) do not mention vmalloc in Table 8?
Best Regards,
Satoshi Shimoda
--- "From the i.MX6 documentation, the only way to define the amount of memory used by gpu is via kernel command line, there is no function in userspace to do that."
The documentation mentioned is Linux release notes.
---"vmalloc=XXXM"is not mentioned in the documentation, but is the way to setting the ammount of memory you want
Regards
Hi Bio_TICFSL,
Thank you for your reply.
OK, I understood.
By the way, our partner tried changing contiguousSize from 128 MB to 256 MB in drivers/mxc/gpu-viv/hal/os/linux/kernel/gc_hal_kernel_driver.c and checked the memory status with the "free" command.
Then, the free memory size decreased by about 128 MB.
So we guess that contiguousSize is the GPU memory allocation size.
Is our guess correct?
Or did the memory size decrease for some other reason?
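For reference, one rough way to cross-check this with standard tools would be to compare the reported total memory between the two contiguousSize values; if the contiguous pool is carved out of system RAM at boot, MemTotal should drop by roughly the 128 MB difference.
=====
# Run once with contiguousSize = 128 MB and once with 256 MB, then compare:
grep MemTotal /proc/meminfo
free -m
=====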
Best Regards,
Satoshi Shimoda
For kernel 3.10.17-1.0.0_ga it is in:
drivers/mxc/gpu-viv/hal/os/linux/kernel/gc_hal_kernel_driver.c
Hi Daniel Schaeffer,
Could you give me a reply?
Actually, the rendering speed is not sufficient at 1920x1080 resolution, even though it is OK at XGA (1024x768).
To investigate whether this issue can be resolved by changing the GPU memory allocation size, we would like to get the information requested above.
Best Regards,
Satoshi Shimoda
We guess that the following code sets the OpenGL memory allocation to 128 MB.
Is this right?
drivers/mxc/gpu-viv/hal/os/linux/kernel/gc_hal_kernel_driver.c
line 131
#if gcdENABLE_FSCALE_VAL_ADJUST
static ulong contiguousSize = 128 << 20;
#else
static ulong contiguousSize = 4 << 20;
#endif
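If our guess is right, I suppose the value could also be overridden from the kernel command line through the module parameter instead of editing the source (assuming galcore is built into the kernel and contiguousSize is registered with module_param(); please correct me if that is not the case), for example:
=====
galcore.contiguousSize=0x10000000
=====
(0x10000000 is 256 MB.)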
I would appreciate it if you could reply as soon as possible.
Best Regards,
Satoshi Shimoda
Could you let me know your status?
I would like to get an answer as soon as possible because about one month has passed since I created this post.
Best Regards,
Satoshi Shimoda
Bio_TICFSL, can you review this question to see if you can help with this case?
Hi Daniel Schaeffer,
Thank you for your quick reply.
I checked the file and found the following code.
According to this code, I think the memory allocation size for the GPU is 2 KB (2 << 10), right?
=====
static ulong registerMemSize = 2 << 10;
module_param(registerMemSize, ulong, 0644);
static int irqLine2D = -1;
module_param(irqLine2D, int, 0644);
static ulong registerMemBase2D = 0x00000000;
module_param(registerMemBase2D, ulong, 0644);
static ulong registerMemSize2D = 2 << 10;
module_param(registerMemSize2D, ulong, 0644);
static int irqLineVG = -1;
module_param(irqLineVG, int, 0644);
static ulong registerMemBaseVG = 0x00000000;
module_param(registerMemBaseVG, ulong, 0644);
static ulong registerMemSizeVG = 2 << 10;
module_param(registerMemSizeVG, ulong, 0644);
=====
And which one is the memory allocation setting for OpenGL (3D)?
Best Regards,
Satoshi Shimoda