GStreamer pipeline to display hardware decoded h.264 frames to qt video sink failing
Hi everybody,
I'm trying to display hardware-decoded H.264 frames on my custom Qt QVideoWidget surface using QMediaPlayer, with GStreamer 1.0 as the backend. All GStreamer plugins are installed (bad, ugly, good, omx 1.16.2). According to the Qt documentation, a video sink (qvideosink) is loaded on the fly, and GStreamer OpenGL support should be enabled by setting the environment variable QT_GSTREAMER_USE_OPENGL_PLUGIN=1, which adds a new pad capability video/x-raw(memory:GLMemory) to the element. I set everything up as specified and the whole GStreamer pipeline gets connected, but omxh264dec gives me the error "Could not configure supporting library."
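For context, here is a trimmed sketch of my setup (MyVideoSurface stands in for my custom QAbstractVideoSurface subclass, and the media path is just an example):
Code:
// Trimmed sketch of the setup described above. MyVideoSurface is a
// placeholder for my custom QAbstractVideoSurface subclass.
#include <QGuiApplication>
#include <QMediaPlayer>
#include <QUrl>

int main(int argc, char *argv[])
{
    // Both of these must happen before QGuiApplication is created
    qputenv("QT_GSTREAMER_USE_OPENGL_PLUGIN", "1");
    QCoreApplication::setAttribute(Qt::AA_ShareOpenGLContexts);

    QGuiApplication app(argc, argv);

    MyVideoSurface surface;           // custom QAbstractVideoSurface
    QMediaPlayer player;
    player.setVideoOutput(&surface);  // qvideosink gets loaded on the fly
    player.setMedia(QUrl::fromLocalFile("/home/pi/test.mp4"));
    player.play();

    return app.exec();
}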
GST_DEBUG=2 log output:
Code:
Attribute Qt::AA_ShareOpenGLContexts must be set before QCoreApplication is created.
videowidgetsurface: "creating myQAbstractVideoSurface"
videowidgetsurface: "supportedPixelFormats" GLTextureHandle
videowidgetsurface: "stop"
0:00:04.631441253 1579 0x6d251980 WARN qtdemux qtdemux.c:7790:qtdemux_parse_container:<qtdemux0> length too long (1507328 > 27)
0:00:04.632760050 1579 0x6d251980 WARN qtdemux qtdemux.c:3237:qtdemux_parse_trex:<qtdemux0> failed to find fragment defaults for stream 1
0:00:04.633326454 1579 0x6d251980 WARN qtdemux qtdemux.c:3237:qtdemux_parse_trex:<qtdemux0> failed to find fragment defaults for stream 2
0:00:04.633658068 1579 0x6d251980 WARN qtdemux qtdemux.c:9865:qtdemux_parse_segments:<qtdemux0> Segment 0 extends to 0:00:09.801455555 past the end of the declared movie duration 0:00:09.676333333 movie segment will be extended
0:00:05.765046970 1579 0x6dc08690 WARN audio-resampler audio-resampler.c:275:convert_taps_gint32_c: can't find exact taps
videowidgetsurface: "start" QVideoSurfaceFormat(Format_ABGR32, QSize(1366, 768), viewport=QRect(0,0 1366x768), pixelAspectRatio=QSize(1, 1), handleType=GLTextureHandle, yCbCrColorSpace=YCbCr_Undefined)
0:00:06.268449235 1579 0x6c807720 ERROR omx gstomx.c:3138:gst_omx_port_populate_unlocked:<omxh264dec-omxh264dec0> Populated egl_render port 221: Incorrect state operation (0x80001018)
0:00:06.268636318 1579 0x6c807720 WARN omxvideodec gstomxvideodec.c:1940:gst_omx_video_dec_loop:<omxh264dec-omxh264dec0> error: Unable to reconfigure output port
videowidgetsurface: "stop"
videowidgetsurface: "stop"
Error: "Could not configure supporting library."
appsrc: push buffer wrong state
Hardware: Raspberry Pi 3 B+
Qt: 5.14.2
Platform: eglfs
DOT GST pipeline: https://file.io/p4GI3ssP
PNG GST pipeline: https://file.io/ccRDBRpe
If someone else has faced the same problem and could give me a tip, I would be thankful.
Thanks in advance.
- 6by9
- Raspberry Pi Engineer & Forum Moderator
- Posts: 18476
- Joined: Wed Dec 04, 2013 11:27 am
Re: GStreamer pipeline to display hardware decoded h.264 frames to qt video sink failing
gst-omx tries to munge video_decode and egl_render into a single entity as omxh264dec, and doesn't seem to offer any options to deliver the raw frame as YUV. egl_render relies on the old firmware GLES driver which doesn't exist on the Pi4. In theory you could build gst-omx without egl_render, but it wasn't trivial when I tried.
We now have decode support via V4L2, so you should be able to use v4l2h264dec instead of omxh264dec. That doesn't have this weird combining of components.
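If playbin keeps picking omxh264dec, raising the rank of v4l2h264dec should make autoplugging prefer it - an untested sketch:
Code:
// Untested sketch: raise v4l2h264dec's rank above the OMX decoder so
// that autoplugging (playbin/decodebin) prefers it. Call this once
// after gst_init(), before creating the player/pipeline.
#include <gst/gst.h>

static void prefer_v4l2_decoder(void)
{
    GstPluginFeature *feature =
        gst_registry_lookup_feature(gst_registry_get(), "v4l2h264dec");
    if (feature) {
        gst_plugin_feature_set_rank(feature, GST_RANK_PRIMARY + 1);
        gst_object_unref(feature);
    }
}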
Software Engineer at Raspberry Pi Ltd. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.
Re: GStreamer pipeline to display hardware decoded h.264 frames to qt video sink failing
Hi 6by9, thanks for the fast reply.
Does the mentioned constraint also apply to the Raspberry Pi 3 Model B+?
If so, how do I enable decode support via V4L2?
Best regards.
- 6by9
- Raspberry Pi Engineer & Forum Moderator
- Posts: 18476
- Joined: Wed Dec 04, 2013 11:27 am
Re: GStreamer pipeline to display hardware decoded h.264 frames to qt video sink failing
Apologies - this has been asked a number of times for the Pi 4, so I'd missed that you said it was a 3B+.
Under 3B+ egl_render should be present, as long as you haven't set gpu_mem to 16MB, but if you did that then you wouldn't have the video_decode component either.
It's not an error I'm aware of, but Google suggests insufficient gpu_mem - https://stackoverflow.com/questions/238 ... i#23869705
Software Engineer at Raspberry Pi Ltd. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.
Re: GStreamer pipeline to display hardware decoded h.264 frames to qt video sink failing
I see, but I don't think the memory allocated to the GPU is the reason, because I can play the media using omxplayer without a problem. As soon as I get some time, I'll give Boot2Qt a try and test my application on it.
root@raspberrypi3:~# vcgencmd get_mem arm
arm=576M
root@raspberrypi3:~# vcgencmd get_mem gpu
gpu=448M
Thanks.
Re: GStreamer pipeline to display hardware decoded h.264 frames to qt video sink failing
Hi all,
in case someone else faces the same issue: I managed to get the decoded frames through the GStreamer pipeline by disabling sync on the default Qt pipeline that is hard-coded in gsttools inside the qtmultimedia module. Now I can play decoded frames. But another problem emerged from that: small-resolution videos play smoothly, while anything >= 720p takes a lot of CPU time because every frame is copied from GPU to CPU memory. So I'm looking for a way to get better performance. The qvideosink passes the OpenGL texture id to my application, so I can attach it to a framebuffer object and read the memory directly, but before that I need to correctly attach that texture as the FBO's color render buffer.
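For reference, the sync change amounts to something like this (a sketch only; I actually edited the hard-coded pipeline string in gsttools, but it is equivalent to clearing the sink's sync property):
Code:
// Sketch: "sync" is a GstBaseSink property; with it set to FALSE the
// sink renders buffers as soon as they arrive instead of waiting for
// their timestamps. videoSink stands for the pipeline's sink element.
g_object_set(G_OBJECT(videoSink), "sync", FALSE, NULL);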
My basic OpenGL code follows:
Code:
QOpenGLContext* ctx = QOpenGLContext::currentContext();
QOpenGLFunctions* f = ctx->functions();
GLuint framebuffer;
GLuint depthRenderbuffer;
GLint prevFbo;
GLenum status;
// The texture id handed over by the Qt video sink via the QVideoFrame
GLuint texture = static_cast<GLuint>( currentFrame.handle().toInt() );
int texWidth = currentFrame.width();
int texHeight = currentFrame.height();
// RGBA8888 matches the GL_RGBA / GL_UNSIGNED_BYTE read-back below
QImage image( texWidth, texHeight, QImage::Format_RGBA8888 );
// Save the currently bound FBO so it can be restored at the end
GLCHK(f->glGetIntegerv( GL_FRAMEBUFFER_BINDING, &prevFbo ));
GLCHK(f->glGenFramebuffers(1, &framebuffer));
GLCHK(f->glBindFramebuffer(GL_FRAMEBUFFER, framebuffer));
GLCHK(f->glGenRenderbuffers(1, &depthRenderbuffer));
GLCHK(f->glBindTexture(GL_TEXTURE_2D, texture));
GLCHK(f->glBindRenderbuffer(GL_RENDERBUFFER, depthRenderbuffer));
GLCHK(f->glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, texWidth, texHeight));
// Attach the decoded-frame texture as the FBO's color attachment
GLCHK(f->glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, texture, 0));
GLCHK(f->glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthRenderbuffer));
status = f->glCheckFramebufferStatus(GL_FRAMEBUFFER);
if (status != GL_FRAMEBUFFER_COMPLETE) {
    qCDebug(LOG_RESERV) << "Problem with OpenGL framebuffer" << status;
} else {
    qCDebug(LOG_RESERV) << "FBO creation succeeded";
}
// Read the color attachment back into CPU memory
GLCHK(f->glReadPixels( 0, 0, texWidth, texHeight, GL_RGBA, GL_UNSIGNED_BYTE, image.bits() ));
GLCHK(f->glBindTexture(GL_TEXTURE_2D, 0));
// Restore the previously bound FBO
GLCHK(f->glBindFramebuffer(GL_FRAMEBUFFER, prevFbo));
I'm getting an incomplete framebuffer error, GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT (36054), from the status check, despite all calls building up the FBO executing successfully. So, how can I correctly bind the texture id passed by gst-omxdec to my FBO's color render target?
Thanks in advance.
Re: GStreamer pipeline to display hardware decoded h.264 frames to qt video sink failing
Hi all, I made some progress. I identified that gst_video_frame_map is called, and for bigger frames it takes a lot of time (>50 ms), which postpones delivery of the next frame. I also looked at VideoCore Shared Memory (VCSM), but I simply can't get it to work; my VCSM buffer stays empty the whole rendering time. Is the texture id that gets passed valid, or is it empty? All OpenGL calls succeeded.
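For what it's worth, this is roughly how I measured the map time (a sketch of the instrumentation, not the exact Qt code; frame, videoInfo and buffer are the surrounding variables in that gsttools code path):
Code:
// Time gst_video_frame_map; with GLMemory, mapping for read pulls the
// frame back into CPU memory. QElapsedTimer comes from QtCore.
QElapsedTimer timer;
timer.start();
gboolean ok = gst_video_frame_map(&frame, &videoInfo, buffer, GST_MAP_READ);
qCDebug(LOG_RESERV) << "gst_video_frame_map took" << timer.elapsed() << "ms, ok =" << ok;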
The Qt video render sink passes a textureId at the following point:
Code:
// Inside qtmultimedia's gsttools: wrap the GstBuffer's GL memory and
// pass its texture id on through a QGstVideoBuffer
GstGLMemory *glmem = GST_GL_MEMORY_CAST(gst_buffer_peek_memory(buffer, 0));
guint textureId = gst_gl_memory_get_texture_id(glmem);
videoBuffer = new QGstVideoBuffer(buffer, m_videoInfo, m_format.handleType(), textureId);
...
if (!videoBuffer)
    videoBuffer = new QGstVideoBuffer(buffer, m_videoInfo);
QVideoFrame frame(
    videoBuffer,
    m_format.frameSize(),
    m_format.pixelFormat());
QGstUtils::setFrameTimeStamps(&frame, buffer);
return surface->present(frame); // here
videoBuffer is then passed to my QAbstractVideoSurface::present method indirectly, through a QVideoFrame (frame) object.
In my QAbstractVideoSurface::start method I initialize VCSM and create an EGLImageKHR:
Code:
// vcsm_info holds width/height going in; eglCreateImageKHR fills in
// vcsm_info.vcsm_handle on success. Dimensions are rounded up to
// powers of two.
int w = Util::nextPOT(size.width());
int h = Util::nextPOT(size.height());
vcsm_info.width = w;
vcsm_info.height = h;
vcsm_init();
const EGLint attrib[] = {
    EGL_IMAGE_PRESERVED_KHR, EGL_TRUE,
    EGL_NONE, EGL_NONE
};
eglFbImage = eglCreateImageKHR(eglGetCurrentDisplay(), EGL_NO_CONTEXT,
                               EGL_IMAGE_BRCM_VCSM, &vcsm_info, attrib);
if (eglFbImage == EGL_NO_IMAGE_KHR || vcsm_info.vcsm_handle == 0) {
    qCDebug(LOG_RESERV) << QString("%1: Failed to create EGL VCSM image\n").arg(VCOS_FUNCTION);
} else {
    qCDebug(LOG_RESERV) << QString(" *********** VCSM Image created with %1 %2 ").arg(w).arg(h);
}
Then, later, in my QAbstractVideoSurface::present method I try to grab the content of the passed texture, but it doesn't work at all:
Code:
QOpenGLFunctions* f = ctx->functions();
GLuint framebuffer;
GLint prevFbo;
GLenum status = GL_FRAMEBUFFER_COMPLETE;
// The texture id handed over by the Qt video sink
GLuint texture = static_cast<GLuint>( currentFrame.handle().toInt() );
int texWidth = Util::nextPOT(currentFrame.width());
int texHeight = Util::nextPOT(currentFrame.height());
// Save the currently bound FBO so it can be restored at the end
GLCHK(f->glGetIntegerv( GL_FRAMEBUFFER_BINDING, &prevFbo ));
GLCHK(f->glGenFramebuffers(1, &framebuffer));
GLCHK(f->glBindFramebuffer(GL_FRAMEBUFFER, framebuffer));
GLCHK(f->glActiveTexture(GL_TEXTURE0));
GLCHK(f->glBindTexture(GL_TEXTURE_2D, texture));
GLCHK(f->glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST));
GLCHK(f->glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST));
// Point the texture's storage at the VCSM-backed EGLImage
// (extension entry point, not in QOpenGLFunctions, so called directly)
GLCHK(glEGLImageTargetTexture2DOES(GL_TEXTURE_2D, eglFbImage));
// Attach the texture as the FBO's color render target
GLCHK(f->glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, texture, 0));
GLCHK(f->glBindTexture(GL_TEXTURE_2D, 0));
status = f->glCheckFramebufferStatus(GL_FRAMEBUFFER);
if (status != GL_FRAMEBUFFER_COMPLETE) {
    qCDebug(LOG_RESERV) << "Problem with OpenGL framebuffer after specifying color render buffer:" << status;
} else {
    qCDebug(LOG_RESERV) << "FBO creation succeeded";
}
GLCHK(f->glFinish());
// Lock the VCSM buffer and hex-dump its contents
uint8_t *vcsmBuffer;
VCSM_CACHE_TYPE_T cacheType;
vcsmBuffer = (uint8_t*)vcsm_lock_cache(vcsm_info.vcsm_handle, VCSM_CACHE_TYPE_HOST, &cacheType);
if (!vcsmBuffer) {
    qCDebug(LOG_RESERV) << "Failed to lock VCSM buffer!";
} else {
    unsigned char *line_start = (unsigned char *) vcsmBuffer;
    for (int i = 0; i < texHeight; i++) {
        QString result;
        for (int j = 0; j < texWidth; j++) {
            // two zero-padded hex digits per byte
            result.append(QString("%1").arg(line_start[j], 2, 16, QChar('0')));
        }
        qCDebug(LOG_RESERV) << result;
        line_start += texWidth;
    }
    // only unlock a buffer that was actually locked
    vcsm_unlock_ptr(vcsmBuffer);
}
....
The texture bits simply don't seem to be mapped at all. A region is allocated, but it is filled with just 0x00 and 0xFF. What am I doing wrong?