
glTexDirectVIVMap leads to GL_INVALID_ENUM if malloc is used

Question asked by Val Dåråsjuk on Aug 24, 2018
Latest reply on Aug 30, 2018 by Val Dåråsjuk

Hi, could somebody explain why glTexDirectVIVMap always leads to a GL_INVALID_ENUM error in the following case?

Tried different formats, sizes, etc.

 

GLuint physical = ~0U;   /* no known physical address; let the driver resolve it */
void *bits = malloc(vF.width() * vF.height());

glBindTexture(GL_TEXTURE_2D, tmpTexId);
glTexDirectVIVMap(GL_TEXTURE_2D, vF.width(), vF.height(), GL_VIV_YV12,
                  (GLvoid **)&bits, &physical);
ASSERT(glGetError() != GL_INVALID_ENUM);   /* fails: glGetError() returns GL_INVALID_ENUM */

 

Originally, the GL_INVALID_ENUM problem happened when I tried to play video/x-theora content using the following pipeline:

 

filesrc ! oggdemux ! theoradec ! qtvideorenderersink

 

Here qtvideorenderersink takes the GstBuffer, maps it with gst_video_frame_map, takes the data pointer, and uses glTexDirectVIVMap to map that data to a texture (roughly as in the sketch below).

This pipeline produces GL_INVALID_ENUM and no video playback.
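
Roughly, the mapping step in the sink looks like the following sketch (simplified by me; upload_frame, info and texId are placeholders, not the actual qtvideorenderersink code, and on the Vivante BSP the GL_VIV_direct_texture entry points come from GLES2/gl2ext.h or eglGetProcAddress):

#include <gst/video/video.h>
#define GL_GLEXT_PROTOTYPES
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>

/* Hedged sketch, not the real sink code: map the decoded GstBuffer and hand
 * its virtual address to the GPU via glTexDirectVIVMap. */
static gboolean upload_frame(GstBuffer *buffer, GstVideoInfo *info, GLuint texId)
{
    GstVideoFrame frame;

    if (!gst_video_frame_map(&frame, info, buffer, GST_MAP_READ))
        return FALSE;

    /* Only the logical (virtual) address is known; physical = ~0U asks the
     * driver to resolve the physical address itself. */
    GLvoid *logical = GST_VIDEO_FRAME_PLANE_DATA(&frame, 0);
    GLuint physical = ~0U;

    glBindTexture(GL_TEXTURE_2D, texId);
    glTexDirectVIVMap(GL_TEXTURE_2D,
                      GST_VIDEO_FRAME_WIDTH(&frame),
                      GST_VIDEO_FRAME_HEIGHT(&frame),
                      GL_VIV_YV12,
                      &logical, &physical);
    glTexDirectInvalidateVIV(GL_TEXTURE_2D);   /* make the write visible to the GPU */

    gst_video_frame_unmap(&frame);
    return glGetError() == GL_NO_ERROR;
}

With the theoradec pipeline this is where the GL_INVALID_ENUM shows up; with imxvpudec the same path works.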

 

If the format is video/x-h264, the following pipeline is used:

filesrc ! qtdemux ! h264parse ! imxvpudec ! qtvideorenderersink

Everything works and the video plays successfully.

 

So if I change the original x-theora pipeline to:

filesrc ! oggdemux ! theoradec ! imxipuvideotransform ! qtvideorenderersink

 

then glTexDirectVIVMap works correctly without any error and the video/x-theora content is shown.

 

So adding imxipuvideotransform works some magic.

Does it work because it uses imxipuallocator?

Or does glTexDirectVIVMap simply not accept plain CPU (not specially configured?) memory under some circumstances?
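
For what it is worth, one fallback I am considering (just a sketch, assuming the issue really is plain, non-contiguous CPU memory; frameData stands for the mapped frame data, vF and tmpTexId are the same as in the snippet above) is to let the driver allocate the storage itself with glTexDirectVIV and memcpy the decoded frame into it:

GLvoid *texels = NULL;

glBindTexture(GL_TEXTURE_2D, tmpTexId);
/* glTexDirectVIV allocates driver-owned storage and returns its logical address. */
glTexDirectVIV(GL_TEXTURE_2D, vF.width(), vF.height(), GL_VIV_YV12, &texels);
if (texels != NULL && glGetError() == GL_NO_ERROR) {
    /* YV12 is planar (Y, then V, then U): 3/2 bytes per pixel in total. */
    memcpy(texels, frameData, vF.width() * vF.height() * 3 / 2);   /* memcpy from <string.h> */
    glTexDirectInvalidateVIV(GL_TEXTURE_2D);   /* flush the CPU copy to the GPU */
}

But I would prefer to avoid the extra copy if glTexDirectVIVMap can be made to work.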

 

This is an i.MX6 Nitrogen board on Yocto.

 

Thanks
