Installing libGL and friends for Linux 4.9


Contributor III

This is a how-to for installing GPU binaries for X11 so it can use the Vivante drivers.

Make sure your environment variables are set correctly, as there will be lots of cross compiling :-)  See the end of this post for help on setting environment variables for cross compiling; understand that section before continuing with this post.
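Before running any of the build steps, it is worth failing fast if something is unset. This is a hypothetical pre-flight check (not part of the original setup) for the variables used throughout this post:

```shell
# Hypothetical helper: print every cross-compile variable that is not yet set.
# check_env VAR...  ->  prints "missing: VAR" per unset variable, returns non-zero
check_env() {
    rc=0
    for v in "$@"; do
        eval "val=\${$v:-}"
        if [ -z "$val" ]; then
            echo "missing: $v"
            rc=1
        fi
    done
    return $rc
}

# Example: run this before starting any of the builds below
if check_env ARCH HOST ROOTFS KERNELDIR; then
    echo "environment OK"
fi
```

If anything prints as missing, revisit the environment-variable section at the end of this post before continuing.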


My setup is currently built with an Arch Linux root file system,


# Ignore other board DTBs and mute warnings,

sudo tar xf ArchLinuxARM-armv7-latest.tar.gz \
    -C ${ROOTFS} \
    --warning=no-unknown-keyword \
    --exclude="./boot/dtbs" \
    --exclude="./boot/initramfs-linux.img"


Of course you'll want to use whichever distro works best for you (e.g. Yocto).  Arch Linux comes with a package manager that can install packages without compiling (albeit for generic ARM >< ).

The rest of this post will build and install the following required to get Vivante working with X11.

  • Kernel Vivante module 6.2.4 (kernel-module-imx-gpu-viv)
  • Proprietary libgl binaries (imx-gpu-viv)
  • Xorg Vivante extension (xserver-xorg-video-imx-viv)


The first code to compile is the Linux kernel module for galcore.  There is an existing galcore module in the kernel source tree, and it isn't obvious which is more beneficial; I've chosen to use the module code provided by NXP.  The following generates this binary,

/usr/lib/modules/$(uname -r)/kernel/drivers/mxc/gpu-viv/galcore.ko

git clone 
cd kernel-module-imx-gpu-viv
git checkout upstream/6.2.4.p1.0

Replace the kernel's version,

rm -rf ${KERNELDIR}/drivers/mxc/gpu-viv

mkdir ${KERNELDIR}/drivers/mxc/gpu-viv
cp -R kernel-module-imx-gpu-viv-src/* ${KERNELDIR}/drivers/mxc/gpu-viv/

Then make sure CONFIG_MXC_GPU_VIV is set in the Linux configuration file (.config and your project's defconfig file), then rebuild and install the kernel.  Upon booting with the new kernel, dmesg will show galcore loading with the new version,

$ dmesg | grep -i "galcore"

[ 1.139315] galcore: clk_get vg clock failed, disable vg!
[ 1.141816] Galcore version

The galcore clock failure can safely be ignored.
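As a quick sanity check, you can pull the version number out of that dmesg line and compare it against the module you built. A small sketch; the version string below is a made-up example (your build suffix will differ), and on the target you would parse live `dmesg` output instead:

```shell
# Parse the Galcore version out of a captured dmesg line.
# Example line only -- the exact version suffix on your system will differ.
line='[    1.141816] Galcore version 6.2.4.190076'
version=$(echo "$line" | sed -n 's/.*Galcore version \([0-9.]*\).*/\1/p')

case "$version" in
    6.2.4*) echo "galcore $version matches the 6.2.4 module we built" ;;
    *)      echo "unexpected galcore version: $version" ;;
esac
```

On the target, replace the captured line with `dmesg | grep -i galcore`.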

Required Packages

The following packages need to be installed before continuing (this example is for Arch Linux),

pacman -S --needed xorg-server libx11 libxext xorg-server-devel xf86-video-fbdev \
    pixman icu


Start by downloading the binary,


chmod +x imx-gpu-viv-5.0.11.p8.3-hfp.bin

./imx-gpu-viv-5.0.11.p8.3-hfp.bin --force --auto-accept

If you have a hard time reading their site, try switching to source code view in your browser.  Adding --auto-accept means you are agreeing to their EULA.  I use the following method to place the binaries into my rootfs - it could be written more compactly ><

# Exit on error

set -e

# Echo commands for verification
set -x

# Select which interface to use (options: fb, x11, dfb, wl)
_EGL=x11

# It's very important to set this to your root file system!
ROOTFS=[Path to your ARM root file system]
cd imx-gpu-viv-5.0.11.p8.3-hfp

pushd gpu-core/usr/lib

# Clean up symbolic links from a previous run
rm -f libEGL.so* libGAL.so* libGLESv2.so* libVIVANTE.so*

# Create new symbolic links pointing at our desired interface
# (link names below are the usual versioned names; adjust them to match the
#  .so versions shipped in your release of the binaries)
ln -s libEGL-${_EGL}.so libEGL.so
ln -s libEGL-${_EGL}.so libEGL.so.1
ln -s libEGL-${_EGL}.so libEGL.so.1.0
ln -s libEGL-${_EGL}.so libEGL.so.1.0.0
ln -s libGAL-${_EGL}.so libGAL.so
ln -s libGLESv2-${_EGL}.so libGLESv2.so
ln -s libGLESv2-${_EGL}.so libGLESv2.so.2
ln -s libGLESv2-${_EGL}.so libGLESv2.so.2.0.0
ln -s libVIVANTE-${_EGL}.so libVIVANTE.so

popd


# Copy over the Vivante configuration file (Vivante.icd)
sudo cp -R gpu-core/etc ${ROOTFS}

pushd ${ROOTFS}/usr/lib

# Make sure there are no stale symbolic links.  Also remove the Mesa stuff
# (globs below are assumed; remove whatever would shadow the Vivante libs)
sudo rm -f libEGL.so* libGLESv2.so* mesa/*
popd

pushd gpu-core/usr

CP="sudo cp -r --remove-destination"

${CP} lib/libVDK* ${ROOTFS}/usr/lib/
${CP} lib/libCLC* ${ROOTFS}/usr/lib/   # OpenCL compiler libs (glob assumed)
${CP} lib/libVSC* ${ROOTFS}/usr/lib/
${CP} lib/libGLSLC* ${ROOTFS}/usr/lib/
${CP} lib/libVIV* ${ROOTFS}/usr/lib/
${CP} lib/libGAL* ${ROOTFS}/usr/lib/
${CP} lib/libEGL* ${ROOTFS}/usr/lib/
${CP} lib/libOpen* ${ROOTFS}/usr/lib/
${CP} lib/libGLESv1* ${ROOTFS}/usr/lib/
${CP} lib/libGLESv2* ${ROOTFS}/usr/lib/

if [ ! -d ${ROOTFS}/usr/lib/dri ]; then
    sudo mkdir ${ROOTFS}/usr/lib/dri
fi
${CP} lib/dri/* ${ROOTFS}/usr/lib/dri/
${CP} include/EGL ${ROOTFS}/usr/include/
${CP} include/GLES2 ${ROOTFS}/usr/include/
${CP} include/HAL ${ROOTFS}/usr/include/


# Make sure the distro's own GL libraries are not used instead of our own.
# The exact file names were specific to my rootfs; remove whatever conflicting
# libGL/libGLES files your distro installed, for example:
sudo rm -f ${ROOTFS}/usr/lib/libGL.so*
sudo rm -f ${ROOTFS}/usr/lib/libGLESv1_CM.so*
popd
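As hinted above, the symbolic-link step can be written more compactly with a loop. A minimal sketch, run in a scratch directory with stand-in files so it is safe to try anywhere; on the target you would run the same loop inside gpu-core/usr/lib against the real libraries:

```shell
# Compact variant of the symlink step, exercised in a scratch directory.
_EGL=x11                      # chosen backend, as above
scratch=$(mktemp -d)
cd "$scratch"

# Stand-ins for the real backend-specific libraries shipped in the .bin
touch libEGL-${_EGL}.so libGAL-${_EGL}.so libGLESv2-${_EGL}.so libVIVANTE-${_EGL}.so

# One unversioned link per library family, all pointing at the chosen backend
for lib in libEGL libGAL libGLESv2 libVIVANTE; do
    ln -sf "${lib}-${_EGL}.so" "${lib}.so"
done

ls -l libEGL.so
```

Note this only creates the unversioned `.so` names; add the versioned links (`.so.1`, etc.) as needed for your release.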



So far, the kernel driver has been built and the proprietary binaries have been installed.  Now the hooks must be built so X11 will work correctly.  You will need some X11 headers installed; for Arch Linux these come from libx11 and libxext.  The build produces the Xorg driver module (vivante_drv.so).


Grab the source,


tar xf xserver-xorg-video-imx-viv-5.0.11.p8.3.tar.gz

I had to remove `xf86DisableRandR' (deprecated) and replace shadowUpdateRotatePackedWeak and shadowUpdatePackedWeak by simply removing the "Weak()" suffix.

I also had to wrap "#define XV 1" with guards and remove an "#if XV" and its corresponding "#endif".  This is not ideal, since XV is defined in xorg/xorg-server.h and force-defining it does not seem like the appropriate solution.  You can use the following patch (which includes the above fixes) to get it working for now,

diff --git a/EXA/src/vivante_fbdev/vivante_fbdev_driver.c b/EXA/src/vivante_fbdev/vivante_fbdev_driver.c
index 063802a..2b78773 100644
--- a/EXA/src/vivante_fbdev/vivante_fbdev_driver.c
+++ b/EXA/src/vivante_fbdev/vivante_fbdev_driver.c
@@ -59,7 +59,9 @@ static Bool gEnableFbSyncExt = FALSE;

+#if !defined(XV)
 #define XV 1
+#endif
#include "mipointer.h"

@@ -801,7 +803,7 @@ FBDevCreateScreenResources(ScreenPtr pScreen)

if(fPtr->shadowFB) {
if (!shadowAdd(pScreen, pPixmap, fPtr->rotate ?
- shadowUpdateRotatePackedWeak() : shadowUpdatePackedWeak(),
+ shadowUpdateRotatePacked : shadowUpdatePacked,
FBDevWindowLinear, fPtr->rotate, NULL)) {
return FALSE;
@@ -1048,7 +1050,7 @@ FBDevScreenInit(SCREEN_INIT_ARGS_DECL)
xf86DrvMsg(pScrn->scrnIndex, X_INFO, "display rotated; disabling DGA\n");
xf86DrvMsg(pScrn->scrnIndex, X_INFO, "using driver rotation; disabling "
- xf86DisableRandR();
+ ;
if (pScrn->bitsPerPixel == 24)
xf86DrvMsg(pScrn->scrnIndex, X_WARNING, "rotation might be broken at 24 "
"bits per pixel\n");
@@ -1130,7 +1132,6 @@ FBDevScreenInit(SCREEN_INIT_ARGS_DECL)
pScreen->CreateScreenResources = FBDevCreateScreenResources;

-#if XV
XF86VideoAdaptorPtr *ptr;

@@ -1139,7 +1140,6 @@ FBDevScreenInit(SCREEN_INIT_ARGS_DECL)
xf86XVScreenInit(pScreen, ptr, n);

if(gEnableXRandR) {
if (!imxDisplayFinishScreenInit(pScrn->scrnIndex, pScreen)) {

Then build and install,

make sysroot=${ROOTFS} prefix=${ROOTFS}/usr LFLAGS="${LDFLAGS}" && \
sudo -E make prefix=${ROOTFS}/usr install

Xorg Configurations

Make sure to configure /etc/X11/xorg.conf file with the vivante information,

Section "Device"
    Identifier "i.MX Accelerated Framebuffer Device"
    Driver "vivante"
    Option "fbdev" "/dev/fb0"
    Option "vivante_fbdev" "/dev/fb0"
EndSection

Section "Screen"
    Identifier "Default Screen"
    SubSection "Display"
        Depth 24
        Modes "720x1280"
    EndSubSection
EndSection

Obviously, make sure that Modes and Depth are set according to your needs.  You may comment out Driver "vivante" if you are not able to get the Vivante drivers to work.

Starting X

Some pointers when starting Xorg.  If you are just trying to get things started then make sure the following are done,

# Manually load the vivante module,

modprobe vivante

# Make sure the DRI card is accessible by Xorg,
chmod o+rw /dev/dri/card0
Xorg &

export DISPLAY=:0

Or write a udev rule (/etc/udev/rules.d/98-galcore.rules),

# for mxc_asrc, mxc_ipu, and mxc_vpu,
KERNEL=="mxc_*", MODE="0666"
# Frame buffers (there probably is no fb2)
KERNEL=="fb[0-9]*", MODE="0666"
SUBSYSTEM=="video", MODE="0666"
KERNEL=="galcore", MODE="0666"
# Uncomment the following if you do not intend to add video group to your user
#SUBSYSTEM=="drm", MODE="0666"

The Vivante module can be loaded on startup as well,

echo "vivante" > /etc/modules-load.d/vivante.conf

I've had no luck getting udevadm to re-trigger and set the permissions correctly, so instead restart the device and everything should look correct.

Most likely you will have some issues; don't fret.  If requested, I will show how to build xorg-server to ensure matching ABIs and firmware.

***Oddly, Xorg seems to segfault the first time but works afterwards.***

Cross Compiling Environment Variables

export ARCH=arm
export CPU=armv7l
export MARCH=armv7-a
export MTUNE=cortex-a9
export HOST=arm-linux-gnueabihf

export LINARO_PATH=[Cross compiler root path]

export ROOTFS=[Path to your ARM root file system]

export CFLAGS="-march=${MARCH} -mtune=${MTUNE} --sysroot=${ROOTFS}"

# I prefer GOLD since it is much faster
export LDFLAGS="-march=${MARCH} -mtune=${MTUNE} --sysroot=${ROOTFS} -Wl,-fuse-ld=gold"

export CONF_ARGS="\
    --host=${HOST} \
    --with-sysroot=${ROOTFS} \
    --prefix=/usr \
    --sysconfdir=/etc"

export CONFIGURE="./configure ${CONF_ARGS}"
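To see what the ${CONFIGURE} wrapper actually runs, here is a quick illustration of how it expands (the HOST and ROOTFS values below are placeholders):

```shell
# Placeholder values, just to show the expansion
HOST=arm-linux-gnueabihf
ROOTFS=/srv/arm-rootfs

CONF_ARGS="--host=${HOST} --with-sysroot=${ROOTFS} --prefix=/usr --sysconfdir=/etc"
CONFIGURE="./configure ${CONF_ARGS}"

echo "${CONFIGURE}"
# -> ./configure --host=arm-linux-gnueabihf --with-sysroot=/srv/arm-rootfs --prefix=/usr --sysconfdir=/etc
```

In a package source tree you then just run `${CONFIGURE}` followed by `make`.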

# Some projects appear to be autotools but really aren't and will need this defined,

# Required for autotools using pkg-config.  This directs them to the root file
# system.  This assumes that you have preserved the install and have not used
# some other process that cleans up /usr/lib/pkgconfig.
export PKG_CONFIG_LIBDIR=${ROOTFS}/lib/pkgconfig:${ROOTFS}/usr/lib/pkgconfig
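You can verify that pkg-config is resolving from the rootfs rather than the host. This sketch fakes a rootfs with one .pc file (the `demo` package is made up, and it assumes pkg-config is installed on the build host):

```shell
# Fake a rootfs containing a single pkg-config file
rootfs=$(mktemp -d)
mkdir -p "${rootfs}/usr/lib/pkgconfig"
cat > "${rootfs}/usr/lib/pkgconfig/demo.pc" <<'EOF'
prefix=/usr
Name: demo
Description: stand-in package for testing PKG_CONFIG_LIBDIR
Version: 1.2.3
EOF

# With PKG_CONFIG_LIBDIR pointing into the rootfs, pkg-config must find
# demo.pc there and never fall back to the host's /usr/lib/pkgconfig
v=$(PKG_CONFIG_LIBDIR=${rootfs}/lib/pkgconfig:${rootfs}/usr/lib/pkgconfig \
    pkg-config --modversion demo)
echo "resolved demo version: $v"
```

Running the same query without PKG_CONFIG_LIBDIR set should fail, which confirms the host paths are not being consulted.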

# Make sure the cross compiler is included in your environment variables.
# Notice that we've used Linaro here ;-)
export PATH=${BUILD}/$(uname -m)/bin:${LINARO_PATH}/bin:${PATH}

# Kernel source

export KERNELDIR=[Path to your kernel source]

5 Replies

NXP Employee


Can we install GPU binaries for X11 on the i.MX8 series platform? A customer required an Ubuntu filesystem, but the Vivante GPU does not support the X11 backend by default, so I have not been able to enable GNOME with the GC7000.


Gnar Fang


Contributor II

natesigrist‌ Great tutorial!

I'm trying to enable OpenGL. I have an i.MX6 Solo with a Vivante GPU. I think I followed the steps you provided, but I get the error output below at boot. I'm compiling using a defconfig with CONFIG_MXC_GPU_VIV=y (I don't have access to insmod on my custom board, so I can't use modules):

galcore: clk_get vg clock failed, disable vg!
Galcore version
Galcore options:
irqLine = 21
registerMemBase = 0x00130000
registerMemSize = 0x00004000
irqLine2D = 22
registerMemBase2D = 0x00134000
registerMemSize2D = 0x00004000
contiguousSize = 0x02000000
contiguousBase = 0x00000000
externalSize = 0x00000000
externalBase = 0x00000000
bankSize = 0x00000000
fastClear = -1
compression = 15
signal = 48
powerManagement = 1
baseAddress = 0x00000000
physSize = 0x80000000
logFileSize = 0 KB
recovery = 0
stuckDump = 0
gpuProfiler = 0
irqs = -1,
-1, -1,
-1, -1,
-1, -1,
-1, -1,
registerBases = 0x00000000,
0x00000000, 0x00000000,
0x00000000, 0x00000000,
0x00000000, 0x00000000,
0x00000000, 0x00000000,
registerSizes = 0x00000800,
0x00000800, 0x00000800,
0x00000800, 0x00000800,
0x00000800, 0x00000800,
0x00000800, 0x00000800,
chipIDs = 0xFFFFFFFF,
Build options:
gcdGPU_TIMEOUT = 20000
gcdGPU_2D_TIMEOUT = 20000
drv_init(734): Failed to create the GAL device: status=-16
galcore: probe of 130000.gpu failed with error -22

I got the kernel 4.9.88 from imx codeaurora git repo.

this is my gpu node in the dts:

gpu: gpu@00130000 {
    compatible = "fsl,imx6dl-gpu", "fsl,imx6q-gpu";
    reg = <0x00130000 0x4000>, <0x00134000 0x4000>,
          <0x0 0x0>;
    reg-names = "iobase_3d", "iobase_2d", "phys_baseaddr";
    interrupts = <0 9 0x04>, <0 10 0x04>;
    interrupt-names = "irq_3d", "irq_2d";
    clocks = <&clks 143>, <&clks 27>,
             <&clks 121>, <&clks 122>,
             <&clks 0>;
    clock-names = "gpu2d_axi_clk", "gpu3d_axi_clk",
                  "gpu2d_clk", "gpu3d_clk",
                  "gpu3d_shader_clk";
    resets = <&src 0>, <&src 3>;
    reset-names = "gpu3d", "gpu2d";
    pu-supply = <&reg_pu>;
};

I am sorry if this question is really dumb; I am new to embedded Linux and GPUs :)

Thanks again for your great tutorial!

Marcos Lopez.


Contributor I


I'm fairly new to this, so please bear with me. I have followed your procedure but am getting a segmentation fault:

[ 1016.185] (II) VIVANTE(0): Output DISP3 BG - DI1 connected
[ 1016.185] (II) VIVANTE(0): Using exact sizes for initial modes
[ 1016.185] (II) VIVANTE(0): Output DISP3 BG - DI1 using initial mode D:1920x1080p-60 +0+0
[ 1016.185] (II) VIVANTE(0): imxDisplayPreInit: virtual set 1920 x 1080, display width 1920
[ 1016.185] (II) VIVANTE(0): FBDevPreInit: adjust display width 1920
[ 1016.185] (**) VIVANTE(0): PreInit done
[ 1016.185] (--) Depth 24 pixmap format is 32 bpp
[ 1016.185] (II) VIVANTE(0): Init mode for fb device
[ 1016.268] (II) VIVANTE(0): hardware: DISP3 BG - DI1 (video memory: 16335kB)
[ 1016.269] (II) VIVANTE(0): FB Start = 0x753f9000 FB Base = 0x753f9000 FB Offset = (nil) FB PhyBase 0x25200000
[ 1016.269] (II) VIVANTE(0): reserve 8355840 bytes for on screen frame buffer; total fb memory size 16727040 bytes; offset of shadow buffer 8355840
[ 1016.324] (II) VIVANTE(0): hardware: DISP3 BG - DI1 (video memory: 16335kB)
[ 1016.334] (II) VIVANTE(0): test Initializing EXA
[ 1016.334] (II) VIVANTE(0): (driver build from: -dirty)
[ 1016.334] (EE)
[ 1016.335] (EE) Backtrace:
[ 1016.335] (EE)
[ 1016.335] (EE) Segmentation fault at address 0x1f00000
[ 1016.335] (EE)
Fatal server error:
[ 1016.335] (EE) Caught signal 11 (Segmentation fault). Server aborting
[ 1016.335] (EE)
[ 1016.335] (EE)

The fault address is different every time it runs. I have no idea where to start debugging this, any help would be greatly appreciated.

System details:

Custom board based on IMX6Solo

Kernel version 4.9.88, plucked from Yocto

u-boot, plucked from Yocto

File system: Ubuntu 18.04, created using debootstrap and qemu

I do have a Yocto image that works (hardware accelerated X) so I know it can work, but I want to use a proper distro for updates, etc.

Many thanks



Contributor I

Small typo in the initial git clone command. It should be: 

git clone


NXP TechSupport

Good job! (y)
