
Thread: Displaying EGL surfaces with OpenWF

  1. #1
    Junior Member
    Join Date
    May 2008
    Posts
    23

    Displaying EGL surfaces with OpenWF

    I want to allocate and render into an EGL surface using GLES2 and then use OpenWF Display to bind that surface to display hardware. However, the OpenWF Display specification seems to leave the allocation of source images unspecified and clearly states it is outside the scope of OpenWF Display. This is fine; however, I was wondering how it was envisaged this would work and whether there are any EGL extensions being developed for this?

    As I see it, the first task is to allocate an EGL surface which is capable of being read by the display controller. I believe some display controllers are quite restrictive in this respect; for example, some require the memory they "scan out" to be physically contiguous. Such buffers must be allocated (or at least reserved) at boot time, before physical memory becomes too fragmented, and are therefore a very scarce resource. In these cases, I think a new EGL surface type and entry point would be needed to differentiate surfaces which can be used directly by display hardware from other EGL surfaces. Such a concept already exists in the EGL_MESA_screen_surface extension, which defines a new surface type (EGL_SCREEN_BIT_MESA) and a new entry point for allocating such surfaces (eglCreateScreenSurfaceMESA). While the rest of EGL_MESA_screen_surface goes on to define a different mechanism for modesetting, I think the screen surface concept could be reused with OpenWF.

    Assuming I am able to allocate an EGL surface which is compatible with my display controller, I need a mechanism to take the EGLSurface and turn it into a WFDSource. The first option, which seems to be hinted at in the OpenWF Composition spec, is to wrap the EGLSurface in a stream. I guess this kinda makes sense - every time I do eglSwapBuffers on the surface, a new image is submitted into the stream and a new back buffer is allocated (or the old front buffer is reused). However, another option would be to use an image source. This would mean creating an EGLImage for the front colour buffer and another for the back colour buffer of the surface. I'm unsure how this would work with respect to eglSwapBuffers, however. While it would be cool to have control over the swapping of buffers, I think it would take too much freedom away from the implementation to do things like triple buffering, etc. The more I think about it, the more I think a stream is the correct way to get content rendered to an EGL surface into OpenWF Display. In fact, it might be as simple as casting the EGLSurface to a WFDNativeStreamType and plugging that into wfdCreateSourceFromStream().
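    Something like the following is what I have in mind - purely speculative, since WFDNativeStreamType is implementation-defined and nothing guarantees an EGLSurface handle can be used as one:

    Code:
        #include <stdint.h>
        #include <EGL/egl.h>
        #include <WF/wfd.h>

        /* Speculative sketch only: WFDNativeStreamType is implementation-defined,
         * so reinterpreting an EGLSurface as a native stream only works if the
         * EGL and OpenWF Display implementations agree on that convention. */
        static WFDSource source_from_egl_surface(WFDDevice dev, WFDPipeline pipe,
                                                 EGLSurface surface)
        {
            WFDNativeStreamType stream = (WFDNativeStreamType)(uintptr_t)surface;
            WFDSource src = wfdCreateSourceFromStream(dev, pipe, stream, NULL);

            /* Bind the whole source to the pipeline and latch it at the next vsync. */
            wfdBindSourceToPipeline(dev, pipe, src, WFD_TRANSITION_AT_VSYNC, NULL);
            wfdDeviceCommit(dev, WFD_COMMIT_PIPELINE, pipe);
            return src;
        }

        /* In this model each eglSwapBuffers() on the surface would push a newly
         * rendered frame into the stream for the display hardware to consume. */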


    I guess another, totally different approach would be to treat OpenWF as a new EGL client API at the same level as OpenGL & OpenVG. I could create an OpenGL texture, create an EGLImage for it, and use that EGLImage to create a WFDSource. I could then use an FBO to render into the GL texture and its EGLImage siblings, including the WFDSource.
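    As a rough sketch of that idea - assuming the EGL_KHR_gl_texture_2D_image extension is available and fetching the KHR entry point via eglGetProcAddress; I haven't tried this against any implementation:

    Code:
        #include <stdint.h>
        #include <EGL/egl.h>
        #include <EGL/eglext.h>
        #include <GLES2/gl2.h>
        #include <WF/wfd.h>

        /* Create a GL texture, make an EGLImage sibling of it, and hand that image
         * to OpenWF Display. Rendering would then go through an FBO whose colour
         * attachment is the texture. */
        static WFDSource texture_as_wfd_source(EGLDisplay dpy, EGLContext ctx,
                                               WFDDevice dev, WFDPipeline pipe,
                                               GLsizei w, GLsizei h)
        {
            PFNEGLCREATEIMAGEKHRPROC p_eglCreateImageKHR =
                (PFNEGLCREATEIMAGEKHRPROC)eglGetProcAddress("eglCreateImageKHR");

            GLuint tex;
            glGenTextures(1, &tex);
            glBindTexture(GL_TEXTURE_2D, tex);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                         GL_RGBA, GL_UNSIGNED_BYTE, NULL);

            /* EGLImage sibling of the texture (EGL_KHR_gl_texture_2D_image). */
            EGLImageKHR img = p_eglCreateImageKHR(dpy, ctx, EGL_GL_TEXTURE_2D_KHR,
                                                  (EGLClientBuffer)(intptr_t)tex,
                                                  NULL);

            /* The same image becomes a WFDSource; an FBO with the texture as its
             * colour attachment is then used to render into it. */
            return wfdCreateSourceFromImage(dev, pipe, (WFDEGLImage)img, NULL);
        }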


    Personally, I think I prefer the idea of a new EGLSurface type... But I'm curious how others see this working.

  2. #2
    Junior Member
    Join Date
    May 2008
    Posts
    23

    Re: Displaying EGL surfaces with OpenWF

    bump?

  3. #3
    Senior Member
    Join Date
    May 2008
    Posts
    100

    Re: Displaying EGL surfaces with OpenWF

    You can create an EGL image from a pixmap with the EGL_KHR_image_pixmap extension. This allows you to allocate a pixmap which is displayable and complies with any display controller requirements. The buffer also needs to be allocated in accordance with any GPU requirements, such as pixel granularity.

    You can then create a texture from the EGL image with the GL_OES_EGL_image extension. Once you've created a texture from the image, you can render to the image using FBOs. Finally, you can create a WFD source from the image and bind it to a WFD pipeline to be displayed.
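    Roughly, the whole sequence would look something like this. It's untested; allocate_scanout_pixmap() is just a placeholder for whatever platform call you have for allocating a displayable pixmap, and the extension entry points are looked up through eglGetProcAddress:

    Code:
        #include <EGL/egl.h>
        #include <EGL/eglext.h>
        #include <GLES2/gl2.h>
        #include <GLES2/gl2ext.h>
        #include <WF/wfd.h>

        /* Hypothetical platform-specific allocator for a displayable pixmap. */
        EGLNativePixmapType allocate_scanout_pixmap(int width, int height);

        static WFDSource display_gles_rendering(EGLDisplay dpy, WFDDevice dev,
                                                WFDPort port, WFDPipeline pipe,
                                                int width, int height)
        {
            PFNEGLCREATEIMAGEKHRPROC p_eglCreateImageKHR =
                (PFNEGLCREATEIMAGEKHRPROC)eglGetProcAddress("eglCreateImageKHR");
            PFNGLEGLIMAGETARGETTEXTURE2DOESPROC p_glEGLImageTargetTexture2DOES =
                (PFNGLEGLIMAGETARGETTEXTURE2DOESPROC)
                    eglGetProcAddress("glEGLImageTargetTexture2DOES");

            /* 1. Allocate a pixmap the display controller can scan out. */
            EGLNativePixmapType pixmap = allocate_scanout_pixmap(width, height);

            /* 2. Wrap it in an EGLImage (EGL_KHR_image_pixmap). */
            EGLImageKHR img = p_eglCreateImageKHR(dpy, EGL_NO_CONTEXT,
                                                  EGL_NATIVE_PIXMAP_KHR,
                                                  (EGLClientBuffer)pixmap, NULL);

            /* 3. Make a texture from the image (GL_OES_EGL_image) and attach it to
             *    an FBO so GLES2 renders straight into the pixmap's memory. */
            GLuint tex, fbo;
            glGenTextures(1, &tex);
            glBindTexture(GL_TEXTURE_2D, tex);
            p_glEGLImageTargetTexture2DOES(GL_TEXTURE_2D, (GLeglImageOES)img);
            glGenFramebuffers(1, &fbo);
            glBindFramebuffer(GL_FRAMEBUFFER, fbo);
            glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                   GL_TEXTURE_2D, tex, 0);

            /* 4. Hand the same image to OpenWF Display and put it on screen. */
            WFDSource src = wfdCreateSourceFromImage(dev, pipe,
                                                     (WFDEGLImage)img, NULL);
            wfdBindSourceToPipeline(dev, pipe, src, WFD_TRANSITION_AT_VSYNC, NULL);
            wfdBindPipelineToPort(dev, port, pipe);
            wfdDeviceCommit(dev, WFD_COMMIT_ENTIRE_PORT, port);
            return src;
        }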

    I don't think EGL surfaces will ever be used directly by WFD. The plan for streams is for them to be containers for EGL images.

  4. #4
    Junior Member
    Join Date
    May 2008
    Posts
    23

    Re: Displaying EGL surfaces with OpenWF

    So allocating the actual buffer which will be displayed is out of scope for both EGL and OpenWF? It is left to the window system to provide a function which can allocate a buffer suitable for the display controller. This feels like a bit of a chicken-and-egg situation: OpenWF is designed to allow window systems to be implemented, yet it can't be used without an API provided by the window system?

    Assuming I do have a magic "allocate a pixmap which can be addressed by the display controller" function available, why not just use eglCreatePixmapSurface to create an EGL surface wrapping that pixmap? Creating an EGLImage sibling and a GL texture sibling for the pixmap, then attaching that GL texture to an FBO's colour attachment, seems awfully convoluted. It doesn't provide any facility to render using multisampling (how do you allocate the sample buffer, and when do you trigger the resolve?). It also requires an EGL context to be current - how is a client supposed to do that? I guess it could create an EGL pbuffer, but that would still allocate a colour buffer which will probably never be used and just waste memory (which is probably pinned).
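    What I have in mind is something as simple as this, reusing the same hypothetical allocate_scanout_pixmap() stand-in from above:

    Code:
        #include <EGL/egl.h>

        /* Hypothetical platform call that returns a displayable native pixmap. */
        EGLNativePixmapType allocate_scanout_pixmap(int width, int height);

        static EGLSurface make_displayable_surface(EGLDisplay dpy, EGLConfig config,
                                                   EGLContext ctx, int w, int h)
        {
            EGLNativePixmapType pixmap = allocate_scanout_pixmap(w, h);

            /* Wrap the pixmap directly in an EGL surface and render to it via the
             * ordinary GLES2 path; no FBO or EGLImage/texture siblings required. */
            EGLSurface surf = eglCreatePixmapSurface(dpy, config, pixmap, NULL);
            eglMakeCurrent(dpy, surf, surf, ctx);
            return surf;
        }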

    Am I missing the point of OpenWF somehow? It feels like no one has given any thought to how a client would render to a buffer which is then displayed by OpenWF - I would have thought this was the primary use case of OpenWF?

  5. #5
    Junior Member
    Join Date
    Feb 2010
    Posts
    6

    Re: Displaying EGL surfaces with OpenWF

    Tom,

    There were indeed discussions around how to provide input data to the OpenWF APIs, but it was thought that EGL was the better place to define these mechanisms, since it is sort of the "glue" layer that allows passing data between the different APIs. EGL images are certainly one means of doing this. There is an EGL streaming concept being worked on, but it is not something I have been actively following, so it is probably better that I don't comment, else I'll just pass on my own confusion.

    I'll see if I can get someone else to respond here.

    Steve.

  6. #6
    Junior Member
    Join Date
    May 2008
    Posts
    23

    Re: Displaying EGL surfaces with OpenWF

    I agree the "glue" API which is missing probably belongs in EGL rather than OpenWF.

    Looking through the code, Wayland actually handles this in a similar way to that described by jpilon. It calls eglMakeCurrent with a valid context, but passes EGL_NO_SURFACE for the draw/read surfaces. I believe this is against the EGL spec as currently worded in 1.4. However, from a GL perspective it does kinda make sense - you just lose the unnamed default "0" framebuffer object. To actually render, Wayland creates and binds a GL FBO, attaches colour and depth renderbuffers, and sets the colour renderbuffer's storage to an EGLImage using glEGLImageTargetRenderbufferStorageOES (from the GL_OES_EGL_image extension). While Wayland then goes on to use the KMS APIs to display the EGLImage, the same mechanism could be used to render to an EGLImage which is then handed to OpenWF Display. The disadvantage of this approach (other than being a bit convoluted) is that the process of buffer swapping is far more verbose, although it also gives the compositor more control over the whole process.
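    For reference, the sequence looks roughly like this (error handling omitted, the OES entry point fetched through eglGetProcAddress, and glossing over how the EGLImage itself was created):

    Code:
        #include <EGL/egl.h>
        #include <EGL/eglext.h>
        #include <GLES2/gl2.h>
        #include <GLES2/gl2ext.h>

        /* Build an FBO whose colour storage is an existing EGLImage, in the style
         * used by Wayland's compositor. Making a context current with no surfaces
         * is, as noted above, outside what EGL 1.4 strictly allows. */
        static GLuint fbo_targeting_image(EGLDisplay dpy, EGLContext ctx,
                                          EGLImageKHR image, GLsizei w, GLsizei h)
        {
            PFNGLEGLIMAGETARGETRENDERBUFFERSTORAGEOESPROC p_rbStorage =
                (PFNGLEGLIMAGETARGETRENDERBUFFERSTORAGEOESPROC)
                    eglGetProcAddress("glEGLImageTargetRenderbufferStorageOES");

            eglMakeCurrent(dpy, EGL_NO_SURFACE, EGL_NO_SURFACE, ctx);

            GLuint fbo, colour_rb, depth_rb;
            glGenFramebuffers(1, &fbo);
            glBindFramebuffer(GL_FRAMEBUFFER, fbo);

            /* The colour renderbuffer takes its storage from the EGLImage... */
            glGenRenderbuffers(1, &colour_rb);
            glBindRenderbuffer(GL_RENDERBUFFER, colour_rb);
            p_rbStorage(GL_RENDERBUFFER, (GLeglImageOES)image);
            glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                      GL_RENDERBUFFER, colour_rb);

            /* ...while the depth renderbuffer is ordinary GLES2 storage. */
            glGenRenderbuffers(1, &depth_rb);
            glBindRenderbuffer(GL_RENDERBUFFER, depth_rb);
            glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, w, h);
            glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                                      GL_RENDERBUFFER, depth_rb);

            /* Render into the FBO, then hand the image to OpenWF Display (for
             * example via wfdCreateSourceFromImage) instead of Wayland's KMS path. */
            return fbo;
        }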

    The only real alternative to this approach I can see at the moment is to add a new eglCreateScreenSurface and treat that surface as a stream of EGLImages. Calling eglSwapBuffers takes the current back buffer and adds it to the stream. API-wise this feels much cleaner, though I wonder if the compositor really does need more control than this. Only one way to find out.

    Anyway, I think both options would work and I have a much better understanding of how this should all hang together now, thanks. I certainly have enough of an idea to get on with some experiments.

  7. #7
    Senior Member
    Join Date
    May 2008
    Posts
    100

    Re: Displaying EGL surfaces with OpenWF

    Having two backends to EGL isn't a bad idea: one composited and one driving the display controller. In a composited environment, creating EGL window surfaces would consist of creating EGL images and putting them in an EGL stream. The client would be a producer of frames, and posting in the composited environment could consist of inserting EGL images as front buffers, to be consumed by the compositor. The compositor could be EGL based as well; however, it would use the display controller backend to EGL, which would handle swapping framebuffers and displaying them.

  8. #8
    Junior Member
    Join Date
    Nov 2008
    Posts
    8

    Re: Displaying EGL surfaces with OpenWF

    The answer depends on what you are doing.

    If you are writing an application program then you would simply create a window (using your window system's function to do so) and call eglCreateWindowSurface(). The window system should take care of all OpenWF and compositing details when you call eglSwapBuffers().
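    In other words, from the application's point of view it is just the usual EGL sequence, something like the following, where win is whatever handle your window system's create-window call returns:

    Code:
        #include <EGL/egl.h>
        #include <GLES2/gl2.h>

        /* Ordinary client-side EGL setup: any OpenWF or compositing details are
         * hidden behind eglCreateWindowSurface and eglSwapBuffers. */
        void run_app(EGLNativeWindowType win)
        {
            EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
            eglInitialize(dpy, NULL, NULL);

            EGLint cfg_attribs[] = { EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
                                     EGL_NONE };
            EGLConfig config;
            EGLint num_config;
            eglChooseConfig(dpy, cfg_attribs, &config, 1, &num_config);

            EGLSurface surf = eglCreateWindowSurface(dpy, config, win, NULL);
            EGLint ctx_attribs[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };
            EGLContext ctx = eglCreateContext(dpy, config, EGL_NO_CONTEXT,
                                              ctx_attribs);
            eglMakeCurrent(dpy, surf, surf, ctx);

            for (;;) {
                /* ... GLES2 rendering ... */
                eglSwapBuffers(dpy, surf);  /* window system handles the rest */
            }
        }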

    If, on the other hand, you are implementing the window system itself, then it is more complicated.

    I will assume you are implementing the compositing window system itself. I do not know how this works with OpenWF, but in other windowing systems I have seen that use GLES to do compositing, there is generally a call to create a 'fullscreen window'. Typically this call is only exposed to the window system itself, and a regular app would never see it. The function returns an object which acts very much like a window system window - that is, you can call eglCreateWindowSurface() on it to create an EGLSurface. The window system then draws into this EGLSurface (e.g. using the application window contents as textures) and uses eglSwapBuffers() to display the results on the screen. Typically the implementer of EGL and the implementer of the windowing system cooperate to provide this 'create fullscreen window' function.

    I think OpenWF Display is attempting to standardize this, but I do not know the details. I would expect OpenWF to supply some function that returns a 'display' object or 'fullscreen-window' object which can then be passed to eglCreateWindowSurface(), but I do not know what that function is called, or even if that is how OpenWF actually works.

    (I realize this does not really help if you are actually trying to use OpenWF to implement a window system - apologies if the above is already obvious to you!)

    -Acorn
    (Opinions are my own, and not necessarily shared by my company or others.)
