luc4
Posts: 75
Joined: Mon Nov 12, 2012 12:28 am

egl_render on the vc4 driver

Sat Feb 22, 2020 11:04 am

Hello! I read that the OpenMAX egl_render component is not going to work on the new KMS stack based on the vc4 driver. Is this correct?
Do you plan to support it in the future or is it never going to be available anymore?
What are the options now to stream frames efficiently into an OpenGL application?
Thanks.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 18477
Joined: Wed Dec 04, 2013 11:27 am

Re: egl_render on the vc4 driver

Sat Feb 29, 2020 2:55 pm

egl_render cannot be supported as there is no link from IL into the GL 3D stack.

The efficient mechanism for importing buffers into EGL is to use dma-bufs, and import them using eglCreateImageKHR, eg https://github.com/6by9/drm_mmal/blob/x ... mal.c#L479.

The preferred mechanism to get your dmabuf would be to use the V4L2 wrapper around the video codecs, where you can then call the V4L2 ioctl VIDIOC_EXPBUF. As it's a file descriptor, don't forget to close it when you're done.
It is possible to get vcsm to export dmabufs, but ideally we only want to support import there.
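Something along those lines (untested sketch, single-plane format and an already initialised EGLDisplay assumed, most error handling stripped):

Code: Select all

/* Rough sketch only: export one V4L2 capture buffer as a dmabuf and wrap it
 * in an EGLImage. Format, stride and buffer index are illustrative; use the
 * _MPLANE buffer type if the device uses the multi-planar API. */
#include <string.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <drm_fourcc.h>          /* from libdrm */

static EGLImageKHR import_v4l2_buffer(int v4l2_fd, int index, EGLDisplay dpy,
                                      int width, int height, int stride)
{
    /* Ask V4L2 to hand the buffer out as a dmabuf file descriptor. */
    struct v4l2_exportbuffer expbuf;
    memset(&expbuf, 0, sizeof(expbuf));
    expbuf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    expbuf.index = index;
    expbuf.plane = 0;
    if (ioctl(v4l2_fd, VIDIOC_EXPBUF, &expbuf) < 0)
        return EGL_NO_IMAGE_KHR;

    /* Import the dmabuf via EGL_EXT_image_dma_buf_import. */
    const EGLint attrs[] = {
        EGL_WIDTH, width,
        EGL_HEIGHT, height,
        EGL_LINUX_DRM_FOURCC_EXT, DRM_FORMAT_XRGB8888, /* placeholder format */
        EGL_DMA_BUF_PLANE0_FD_EXT, expbuf.fd,
        EGL_DMA_BUF_PLANE0_OFFSET_EXT, 0,
        EGL_DMA_BUF_PLANE0_PITCH_EXT, stride,
        EGL_NONE
    };
    PFNEGLCREATEIMAGEKHRPROC create_image =
        (PFNEGLCREATEIMAGEKHRPROC)eglGetProcAddress("eglCreateImageKHR");
    EGLImageKHR image = create_image(dpy, EGL_NO_CONTEXT,
                                     EGL_LINUX_DMA_BUF_EXT, NULL, attrs);

    close(expbuf.fd); /* it's a file descriptor - close it once imported */
    return image;
}
Bind the resulting EGLImage to a texture with glEGLImageTargetTexture2DOES and you have your zero-copy path into GL.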
Software Engineer at Raspberry Pi Ltd. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

luc4
Posts: 75
Joined: Mon Nov 12, 2012 12:28 am

Re: egl_render on the vc4 driver

Mon Mar 02, 2020 12:51 pm

There is one point that is not clear to me: do you mean I should rewrite my code entirely using V4L2 to decode and place decoded frames into dmabufs, or do you mean there is a way for me to keep my code unaltered and just ask OpenMAX to output to dmabufs, which will then be imported as EGLImages?

I have another question for you, if possible: in the past I was also able to use two different dispmanx layers to show a video (with omxplayer) while an OpenGL application was running in a layer above. As you explained to me in the past, this technique was using the HVS instead of converting every frame in the GPU, resulting in much better performance. Is anything similar possible in KMS? You already explained that only one KMS client is allowed, but what about using multiple planes? Would that use the HVS?

Thank you for the info you provided!

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 18477
Joined: Wed Dec 04, 2013 11:27 am

Re: egl_render on the vc4 driver

Mon Mar 02, 2020 1:39 pm

luc4 wrote:
Mon Mar 02, 2020 12:51 pm
There is one point that is not clear to me: do you mean I should rewrite my code entirely using V4L2 to decode and place decoded frames into dmabufs, or do you mean there is a way for me to keep my code unaltered and just ask OpenMAX to output to dmabufs, which will then be imported as EGLImages?
OpenMAX cannot support dmabufs. We have been advising against it for a fair while, mainly because it's a right pain as APIs go.
There is a route to use MMAL with dmabufs, but that would still be a rewrite.
V4L2 is the Linux kernel standard API for talking to codecs, and we are moving towards using standard APIs wherever possible on current and future models. That would be our recommendation; alternatively, abstract it away by using FFmpeg.
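To give an idea of what the V4L2 route looks like, the start of a stateful decode is roughly this (untested sketch; /dev/video10 is the decoder node on current Pi kernels, the sizes are guesses, and everything past the format negotiation is omitted):

Code: Select all

/* Rough sketch: open the stateful decoder and describe the coded input.
 * Error handling is minimal and the buffer/streaming setup is left out. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int main(void)
{
    int fd = open("/dev/video10", O_RDWR);
    if (fd < 0) {
        perror("open /dev/video10");
        return 1;
    }

    /* OUTPUT queue = compressed bitstream going *into* the codec. */
    struct v4l2_format fmt;
    memset(&fmt, 0, sizeof(fmt));
    fmt.type = V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE;
    fmt.fmt.pix_mp.pixelformat = V4L2_PIX_FMT_H264;
    fmt.fmt.pix_mp.width = 1920;                 /* hint - parsed from the stream */
    fmt.fmt.pix_mp.height = 1080;
    fmt.fmt.pix_mp.num_planes = 1;
    fmt.fmt.pix_mp.plane_fmt[0].sizeimage = 512 * 1024; /* per-buffer size guess */
    if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0)
        perror("VIDIOC_S_FMT(OUTPUT)");

    /* CAPTURE queue = decoded frames coming back out; see what we'll get. */
    memset(&fmt, 0, sizeof(fmt));
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE;
    if (ioctl(fd, VIDIOC_G_FMT, &fmt) < 0)
        perror("VIDIOC_G_FMT(CAPTURE)");
    printf("decoded format %.4s, %ux%u\n", (char *)&fmt.fmt.pix_mp.pixelformat,
           fmt.fmt.pix_mp.width, fmt.fmt.pix_mp.height);

    /* From here: REQBUFS on both queues, queue bitstream buffers, STREAMON,
     * then dequeue decoded frames (or export them with VIDIOC_EXPBUF). */
    return 0;
}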
luc4 wrote:I have another question for you, if possible: in the past I was also able to use two different dispmanx layers to show a video (with omxplayer) while an OpenGL application was running in a layer above. As you explained to me in the past, this technique was using the HVS instead of converting every frame in the GPU, resulting in much better performance. Is anything similar possible in KMS? You already explained that only one KMS client is allowed, but what about using multiple planes? Would that use the HVS?
DRM/KMS supports multiple planes, with the exact number of planes being determined by the driver (up to a max of 32 currently).
Slightly differently from DispmanX, all planes have to be created by the same application. DRM/KMS does not allow random applications to jump in and add a random layer at a random z-position.
Currently we have a hybrid driver (fkms) where the DRM/KMS implementation is sitting on top of DispmanX, so you can subvert it. The full driver is in the works.
Yes, it is all composed by the HVS.

For an example of sending video directly to DRM, look at the master branch of drm_mmal. That was written first, and Eric converted it to support X11 and EGL.
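If you want to try the plane route directly, the core legacy call is drmModeSetPlane - a bare-bones sketch, with the plane/CRTC/framebuffer IDs assumed to have already been found via drmModeGetResources/drmModeGetPlaneResources and drmModeAddFB2 (the atomic API is the more modern way to do the same thing):

Code: Select all

/* Rough sketch only: present framebuffer fb_id full-screen on one DRM plane.
 * fd is an open DRM device node; the HVS handles any scaling between the
 * source rectangle and the on-screen rectangle. */
#include <stdint.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

int show_on_plane(int fd, uint32_t plane_id, uint32_t crtc_id, uint32_t fb_id,
                  uint32_t src_w, uint32_t src_h,
                  uint32_t crtc_w, uint32_t crtc_h)
{
    /* Source coordinates are 16.16 fixed point, hence the << 16. */
    return drmModeSetPlane(fd, plane_id, crtc_id, fb_id, 0 /* flags */,
                           0, 0, crtc_w, crtc_h,           /* on-screen rect */
                           0, 0, src_w << 16, src_h << 16  /* source rect */);
}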
Software Engineer at Raspberry Pi Ltd. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

jannewmarch
Posts: 40
Joined: Thu Jan 17, 2013 12:45 am

Re: egl_render on the vc4 driver

Tue Mar 03, 2020 10:07 am

From sam_nazarko in the OSMC forum on 31 Oct 2019 at "Any update on the Pi4?" (https://discourse.osmc.tv/t/any-update-on-the-pi4/82775):
I don’t have any news at this time. As the device is new and MMAL is being deprecated from Kodi, we need to support V4L2/GBM. This is still a work in progress by Raspberry Pi

In the interim we will support software decoding. It will be a long time (at least a year) before hardware video acceleration is working well using this stack.
So it looks like OpenMAX IL is on the way out (hurray!), MMAL isn't really in favour (at least with Kodi) and you might have to wait for a long term GPU-accelerated solution.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 18477
Joined: Wed Dec 04, 2013 11:27 am

Re: egl_render on the vc4 driver

Tue Mar 03, 2020 2:07 pm

jannewmarch wrote:
Tue Mar 03, 2020 10:07 am
From sam_nazarko in the OSMC forum on 31 Oct 2019 at "Any update on the Pi4?" (https://discourse.osmc.tv/t/any-update-on-the-pi4/82775):
I don’t have any news at this time. As the device is new and MMAL is being deprecated from Kodi, we need to support V4L2/GBM. This is still a work in progress by Raspberry Pi

In the interim we will support software decoding. It will be a long time (at least a year) before hardware video acceleration is working well using this stack.
So it looks like OpenMAX IL is on the way out (hurray!), MMAL isn't really in favour (at least with Kodi) and you might have to wait for a long term GPU-accelerated solution.
Huh? OSMC post from October.

A V4L2 stateful codec API implementation is in the standard kernel builds for H264 decoding (and MPEG4, H263, MPEG2, and VC1 on earlier Pis with licence keys where appropriate) and has been since May 2019 (ie before launch of the Pi4). It's sitting on top of the old MMAL components running on the VPU.

A V4L2 stateless decoder driver for HEVC should be merged to our downstream tree within a couple of weeks. The LibreElec folk have been testing it for probably the last month.

GBM/DRM is already supported and has been since launch.
There is an improved driver for which the first revision has been sent to the dri-devel mailing lists to eventually get mainlined. It's likely to get merged to our downstream tree sooner than that.

MMAL will continue to be supported on Pi0-4, and it is nearly working on 64bit kernels and userspace too. There are no guarantees as to whether it will still exist on future platforms.
Yes, OpenMAX will be dropped at some point in the future. I certainly do not intend to try converting it to support a 64bit userspace.
Software Engineer at Raspberry Pi Ltd. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

jannewmarch
Posts: 40
Joined: Thu Jan 17, 2013 12:45 am

Re: egl_render on the vc4 driver

Fri Mar 06, 2020 12:44 am

Thanks 6by9. My understanding now is
  • OpenMAX should be deprecated and not used for new projects. According to the Wikipedia Khronos entry, OpenMAX is now an inactive group, and 6by9 says it will not be actively supported on the RPi
  • Some features of OpenMAX IL that worked on the RPi0-3 using DispmanX are not supported on the RPi4 and never will be.
    So e.g. OpenMAX rendering to an OpenGLES surface won't ever work on the RPi4
  • If you want to write new code using h/w acceleration for the RPi0-4, use MMAL with V4L2 for decoding
  • MMAL may or may not be supported in later RPis
  • In the long term, h/w acceleration will use V4L2/GBM. According to OSMC, this will require the Linux 5.2 kernel at least, and Raspbian is currently 4.19
  • I'm not clear if V4L2/GBM will only be for the RPi4 or for all the family
  • Some projects such as LibreElec are using V4L2/MMAL. Others such as OSMC are waiting for V4L2/GBM on the RPi4 and in the meantime do video decoding in s/w
  • 6by9 is working on V4L2 with M2M - is this related to V4L2/GBM?
So I guess it is time for me to learn V4L2 :)

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 18477
Joined: Wed Dec 04, 2013 11:27 am

Re: egl_render on the vc4 driver

Fri Mar 06, 2020 9:19 am

jannewmarch wrote:
Fri Mar 06, 2020 12:44 am
Thanks 6by9. My understanding now is
  • OpenMAX should be deprecated and not used for new projects. According to the Wikipedia Khronos entry, OpenMAX is now an inactive group, and 6by9 says it will not be actively supported on the RPi
  • Some features of OpenMAX IL that worked on the RPi0-3 using DispmanX are not supported on the RPi4 and never will be.
    So e.g. OpenMAX rendering to an OpenGLES surface won't ever work on the RPi4
  • If you want to write new code using h/w acceleration for the RPi0-4, use MMAL with V4L2 for decoding
V4L2 currently offers video decode (/dev/video10), encode (/dev/video11), resize/format convert (/dev/video12), and the basic camera (normally /dev/video0). There are other use cases that MMAL can offer which V4L2 currently can't, but V4L2 should be used where possible.
The place that MMAL is ahead of our V4L2 implementation is interlaced video support. At present our decoder says it only supports progressive as I haven't sorted out the plumbing required to put the correct signalling on all the frames. We then probably need to add a deinterlacing device node as well.

For the equivalent of video_render or egl_render you need to be looking at DRM and EGL respectively.
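If it's not obvious which node is which on a given system, VIDIOC_QUERYCAP will tell you - a rough sketch, with the node path just an example:

Code: Select all

/* Rough sketch: report whether a video node is a memory-to-memory device
 * (codec/resizer) or a capture/output device (camera, display). */
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/videodev2.h>

int main(void)
{
    const char *node = "/dev/video10";   /* try video10..12 and video0 */
    int fd = open(node, O_RDWR);
    if (fd < 0) {
        perror(node);
        return 1;
    }

    struct v4l2_capability cap;
    memset(&cap, 0, sizeof(cap));
    if (ioctl(fd, VIDIOC_QUERYCAP, &cap) == 0) {
        uint32_t caps = (cap.capabilities & V4L2_CAP_DEVICE_CAPS)
                            ? cap.device_caps : cap.capabilities;
        printf("%s: driver=%s card=%s -> %s\n", node,
               (char *)cap.driver, (char *)cap.card,
               (caps & (V4L2_CAP_VIDEO_M2M | V4L2_CAP_VIDEO_M2M_MPLANE))
                   ? "memory-to-memory" : "capture/output");
    }
    close(fd);
    return 0;
}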
jannewmarch wrote:
  • MMAL may or may not be supported in later RPis
We'll support it where possible, but hardware will change in the future, and we're moving towards standard APIs where possible.
jannewmarch wrote:
  • In the long term, h/w acceleration will use V4L2/GBM. According to OSMC, this will require the Linux 5.2 kernel at least, and Raspbian is currently 4.19
There are two V4L2 codec APIs - the stateful one (pass a raw bitstream in and get frames), and the stateless one (s/w has to do a load of work first to decode headers and manage reference frames).
The legacy H264 (and H263/MPEG4/MPEG2/VC1) codec block uses the stateful API, and that's been in for several years, although they've only just finished off the docs. It's available on our 4.19 branch.
The HEVC block on Pi4 will use the stateless API. The MPEG2 and H264 stateless APIs were merged in around 5.3. HEVC will require 5.6/5.7 as the relevant controls haven't been fully defined yet. We'll be backporting that to 5.4 as the next Long Term Support (LTS) kernel release. 5.4 is nearing the state where it'll be available through "BRANCH=next rpi-update" for those wanting to try it out.
jannewmarch wrote:
  • I'm not clear if V4L2/GBM will only be for the RPi4 or for all the family
Supported on all.
The 3D won't be very usable on Pi0/1 as all the preparation and config work is done on the ARM core which is really too low on performance.
DRM/KMS and GBM allocations will be supported.
V4L2 will be supported.
jannewmarch wrote:
  • Some projects such as LibreElec are using V4L2/MMAL. Others such as OSMC are waiting for V4L2/GBM on the RPi4 and in the meantime do video decoding in s/w
Too many TLAs that get intertwined.
V4L2 - codecs and cameras
DRM - Direct Rendering Manager. Puts layers on the screen.
KMS - Kernel Mode Setting. Sets up the screen resolution and timings.
GBM - Generic Buffer Management. Actually part of the Mesa 3D library, but generally sitting on top of DRM doing the allocation. AIUI 3D will almost always be using GBM for the output resources.
When passing objects from V4L2 into DRM you generally pass a dmabuf around rather than copying pixels between the subsystems. A dmabuf is a file handle type thing that can be exported from one kernel subsystem and imported into one or more other subsystems. Either V4L2 or DRM can be the allocator/exporter.
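For the DRM-as-allocator direction, the usual pattern is a dumb buffer plus drmPrimeHandleToFD - rough sketch (no error handling, 32bpp assumed), with the resulting fd being what you'd queue into V4L2 as V4L2_MEMORY_DMABUF:

Code: Select all

/* Rough sketch: allocate a dumb buffer from DRM and export it as a dmabuf fd. */
#include <stdint.h>
#include <string.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

int export_dumb_buffer(int drm_fd, uint32_t width, uint32_t height,
                       int *out_dmabuf_fd)
{
    struct drm_mode_create_dumb create;
    memset(&create, 0, sizeof(create));
    create.width = width;
    create.height = height;
    create.bpp = 32;                        /* e.g. XRGB8888 */
    if (drmIoctl(drm_fd, DRM_IOCTL_MODE_CREATE_DUMB, &create) < 0)
        return -1;

    /* Turn the GEM handle into a dmabuf file descriptor. */
    return drmPrimeHandleToFD(drm_fd, create.handle, DRM_CLOEXEC | DRM_RDWR,
                              out_dmabuf_fd);
}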

LibreElec are using GBM for 3D and their GUI. V4L2 stateful API has a few niggles still in FFmpeg (you can't seek, and dmabuf support isn't in upstream). Stateless is generally working with the Pi4 HEVC block but again the patches haven't been pushed upstream.
We're mainly working with LibreElec as dom/popcornmix is both a Pi employee and a dev on the project.
jannewmarch wrote:
  • 6by9 is working on V4L2 with M2M - is this related to V4L2/GBM?
M2M = Memory to Memory, ie something that isn't an input or output device but still talks video. Codecs, scalers/converters, and deinterlacing are the main ones.
jannewmarch wrote:So I guess it is time for me to learn V4L2 :)
FFmpeg or GStreamer are probably the better options, as they abstract you from the nitty-gritty.
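For example, on the FFmpeg side the V4L2 M2M decoder is just selected by name - rough sketch, and treat "h264_v4l2m2m" as an assumption about what your particular FFmpeg build exposes:

Code: Select all

/* Rough sketch: let FFmpeg drive the V4L2 M2M decoder rather than talking
 * V4L2 directly. The demux/decode loop itself is omitted. */
#include <stdio.h>
#include <libavcodec/avcodec.h>

int main(void)
{
    const AVCodec *codec = avcodec_find_decoder_by_name("h264_v4l2m2m");
    if (!codec) {
        fprintf(stderr, "no V4L2 M2M H264 decoder in this FFmpeg build\n");
        return 1;
    }

    AVCodecContext *ctx = avcodec_alloc_context3(codec);
    if (!ctx || avcodec_open2(ctx, codec, NULL) < 0) {
        fprintf(stderr, "failed to open decoder\n");
        return 1;
    }

    /* From here: feed AVPackets from a demuxer with avcodec_send_packet()
     * and collect decoded AVFrames with avcodec_receive_frame(). */
    avcodec_free_context(&ctx);
    return 0;
}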
Software Engineer at Raspberry Pi Ltd. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

jannewmarch
Posts: 40
Joined: Thu Jan 17, 2013 12:45 am

Re: egl_render on the vc4 driver

Sat Mar 07, 2020 1:35 am

Thanks 6by9! This kind of detailed explanation of where things are and where they are going is invaluable.

luc4
Posts: 75
Joined: Mon Nov 12, 2012 12:28 am

Re: egl_render on the vc4 driver

Sat Mar 07, 2020 11:47 am

I agree. Thanks 6by9 for the info!

drgeoffathome
Posts: 13
Joined: Wed Sep 16, 2015 5:00 am

Re: egl_render on the vc4 driver

Sat Mar 14, 2020 7:57 am

In an attempt to explore the V4L2 -> DRM via dmabuf I came across some interesting code written by 6by9. The following should replicate what I did

Code: Select all

git clone https://github.com/6by9/drm-v4l2-test.git
cd drm-v4l2-test
make
#wahoo this is going great so far
./dmabuf-sharing -M vc4 -i /dev/video0 -b 2 -S 640,480
G_FMT(start): width = 640, height = 480, 4cc = YUYV
G_FMT(final): width = 640, height = 480, 4cc = YUYV
ERROR(dmabuf-sharing.c:542) : VIDIOC_REQBUFS failed: Invalid argument
Digging into that code (I think) shows that the invalid argument is from line 539

Code: Select all

535 struct v4l2_requestbuffers rqbufs;
536 memset(&rqbufs, 0, sizeof(rqbufs));
537 rqbufs.count = s.buffer_count;
538 rqbufs.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
539 rqbufs.memory = V4L2_MEMORY_DMABUF;
540 
541 ret = ioctl(v4lfd, VIDIOC_REQBUFS, &rqbufs);
542 BYE_ON(ret < 0, "VIDIOC_REQBUFS failed: %s\n", ERRSTR);
What I'm wondering is, is this "aspirational" code that one day will work but the driver doesn't support it today?

Thanks,
Geoff

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 18477
Joined: Wed Dec 04, 2013 11:27 am

Re: egl_render on the vc4 driver

Sat Mar 14, 2020 10:43 am

So https://github.com/6by9/drm-v4l2-test is a fork of code that did V4L2->DRM and I added MMAL->DRM. I must confess to not having checked whether I broke V4L2->DRM in the process.

Seeing as you are getting YUYV as your format, I'm guessing you're trying to use a USB webcam.
Three things there:
- There is no guarantee that a specific DRM implementation supports all V4L2 formats. The Pi composition hardware does not support the YUYV family of formats (modetest is the simplest test tool, although not built by default; there's also a rough sketch of listing the plane formats at the end of this post). I haven't got a Pi running that I can copy the output from, but the supported formats list comes from https://github.com/raspberrypi/linux/bl ... kms.c#L162 for fake kms, and https://github.com/raspberrypi/linux/bl ... lane.c#L36 for full KMS (they're basically the same).

- dmabufs are a wrapper for a block of memory. Some hardware devices have restrictions on the memory they can accept, typically requiring physically contiguous buffers. This is a requirement for all DRM buffers on the Pi, and also for the V4L2 codec drivers. UVC (USB VideoClass, ie USB webcams) has no such requirements, so exporting buffers from UVC to import into DRM will fail for definite. Export from DRM and import into V4L2 (which is what this is doing) gets around this restriction.

- V4L2 drivers do not have to support importing dmabufs (rqbufs.memory = V4L2_MEMORY_DMABUF; in the call to REQBUFS), although drivers that don't are getting rarer, as it is mainly handled by videobuf2 for you. The driver fills in the supported values in the io_modes member of a struct vb2_queue. UVC didn't support dmabufs for ages, but it appears to now.

I suspect the reason for the failure is that whatever driver you're using doesn't like dmabufs, either that or it doesn't like the format/buffer sizing for some reason.
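On the first point, here's a rough sketch of listing the formats the planes actually advertise (modetest gives you the same information and more):

Code: Select all

/* Rough sketch: print the fourcc formats each DRM plane accepts, so you can
 * see whether e.g. YUYV is there before trying to feed it from V4L2. */
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

int main(void)
{
    int fd = open("/dev/dri/card1", O_RDWR);  /* card0 on some setups */
    if (fd < 0) {
        perror("open");
        return 1;
    }

    /* Needed so the kernel also reports primary/cursor planes. */
    drmSetClientCap(fd, DRM_CLIENT_CAP_UNIVERSAL_PLANES, 1);

    drmModePlaneRes *res = drmModeGetPlaneResources(fd);
    for (uint32_t i = 0; res && i < res->count_planes; i++) {
        drmModePlane *plane = drmModeGetPlane(fd, res->planes[i]);
        if (!plane)
            continue;
        printf("plane %u:", plane->plane_id);
        for (uint32_t f = 0; f < plane->count_formats; f++)
            printf(" %.4s", (char *)&plane->formats[f]);
        printf("\n");
        drmModeFreePlane(plane);
    }
    if (res)
        drmModeFreePlaneResources(res);
    return 0;
}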
Software Engineer at Raspberry Pi Ltd. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

drgeoffathome
Posts: 13
Joined: Wed Sep 16, 2015 5:00 am

Re: egl_render on the vc4 driver

Sun Mar 15, 2020 10:28 pm

I did the test on an RPi4 (4GB) with the Pi Camera V2. Since your reply suggests it stands a chance, I'll prod around a little further.

oomek
Posts: 38
Joined: Wed Oct 31, 2018 1:51 pm

Re: egl_render on the vc4 driver

Wed Oct 07, 2020 5:03 am

I had a working implementation of a video player using ilclient that I could draw as an OpenGL texture with zero-copy. Since that is now deprecated, is there a way (a working example) of how it should be done on the Pi4?
