Re: [Public WebGL] video/tv production issues
On Tue, Mar 12, 2013 at 9:50 AM, Florian Bösch <firstname.lastname@example.org> wrote:
> That's OK. If this gig plays out, there's a possibility that there will be
> patches for those features. However, it also requires a willingness by this
> committee to standardize them and a willingness of UA vendors to accept
> the patches.
> On Tue, Mar 12, 2013 at 5:42 PM, Gregg Tavares <email@example.com> wrote:
>> I'm not trying to be snarky there but I don't know what to say except
>> "patches welcome".
>> There's nothing stopping those features from making it into WebGL. It's
>> just a matter of time / people / effort. For example, the Web Audio API has
>> recently been adding support for pro-audio features like multi-channel
>> output (i.e., more than two channels).
>> On top of finding people with the time to implement the features, the
>> specs have to be worked out, hopefully in a portable, cross-platform way.
>> For example, I would personally like to see 3D display support added, though
>> that seems more appropriate as a CSS attribute first, with WebGL support following.
>> On Tue, Mar 12, 2013 at 4:06 AM, Florian Bösch <firstname.lastname@example.org> wrote:
>>> I'm getting some interest from the video/television industry in using
>>> WebGL.
>>> For realtime production a typical setup would grab video frames off a
>>> monitor's output at the monitor's refresh frequency. The signal is then
>>> often converted to interlaced formats and the like. There are some
>>> difficulties in such a setup.
>>> Frame Tearing
>>> 1) When the page is being composited out of sync with the WebGL drawing
>>> 2) When the page is being composited out of sync with the monitor's
>>> refresh frequency
>>> I think (though I'm not entirely sure) that the first source of frame
>>> tearing is being addressed with canvas double buffering and improvements in
>>> requestAnimationFrame (correct me if I'm wrong).
>>> As far as I'm aware, the second source of frame tearing isn't being
>>> addressed right now (the browser's recompositing isn't synced to the
>>> monitor's refresh frequency).
>>> Is the above an accurate description of the issue, and is any effort
>>> being made yet to solve it?
Browsers attempt to avoid tearing in general. When accelerated
compositing is on in Chrome (basically all the time now, and
definitely when WebGL is in use), a swap interval of 1 is used when
presenting to the screen. I think Firefox does something similar.
There may be problems when displaying on an external monitor. I've
noticed that the frame rate slows down and tearing occurs with Chrome
on Mac OS when presenting to an external monitor. More investigation
is needed there.
>>> Browsers implement a maximum framerate at which the WebGL canvas can be
>>> shown (the frequency at which they recomposite). This is usually capped at
>>> 60 Hz.
>>> For realtime video production it is desirable to exactly match the
>>> output's refresh rate, for instance for these use cases:
>>> To render time-interleaved stereoscopic video (usually each frame
>>> contains a marker). For instance, for 60 Hz stereoscopic video, a
>>> framerate of 120 Hz would be required.
>>> To provide output intended for interlaced composition
>>> To avoid frame tearing
>>> Would it be possible to have the browser match the output's refresh
>>> rate?
This is already the goal. Work is simply needed to test in various
scenarios and fix bugs.
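To make the stereo use case above concrete, here is a sketch of the frame bookkeeping involved; the even-frame-left / odd-frame-right convention is just an assumption for illustration, as the actual eye assignment comes from the per-frame marker the signal carries:

```javascript
// For time-interleaved stereo at 120 Hz carrying a 60 Hz-per-eye signal,
// the renderer must know which eye each output frame belongs to.
// Assumed convention for this sketch: even frames = left, odd = right.
function eyeForFrame(frameIndex) {
  return frameIndex % 2 === 0 ? 'left' : 'right';
}

// Which 60 Hz source frame a given 120 Hz output frame corresponds to:
// each source frame is shown twice, once per eye.
function sourceFrameFor(outputFrameIndex) {
  return Math.floor(outputFrameIndex / 2);
}

console.log(eyeForFrame(0), eyeForFrame(1), sourceFrameFor(5)); // left right 2
```

In a rAF loop the frame index would be a running counter, which is exactly why the browser's compositing rate must match the output's refresh rate: a dropped or duplicated frame swaps the eyes.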
>>> Currently WebGL does not allow fine-grained control over antialiasing.
>>> Mostly one gets 4x MSAA (and sometimes no AA at all, as on OS X).
>>> There are a number of advanced AA options exposed by GPU drivers (such
>>> as NVIDIA's TXAA) that deliver fully hardware-accelerated, movie-like
>>> antialiasing at a fraction of the cost of brute-force supersampling.
>>> Would it be possible to get finer-grained control over AA?
Yes, certainly. Ideally the control would apply equally well to all
kinds of devices. Proposals welcome for how to incorporate this.
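As a strawman for discussion -- none of the attribute names below exist in the WebGL spec today, only the boolean `antialias` does -- context creation attributes could grow a method and sample-count hint, with the implementation clamping the request to what the driver supports:

```javascript
// Purely hypothetical extension of WebGL context creation attributes.
const requestedAttributes = {
  antialias: true,          // existing boolean attribute
  antialiasMethod: 'msaa',  // hypothetical: 'msaa' | 'txaa' | ...
  antialiasSamples: 8       // hypothetical sample-count hint
};

// Sketch of how an implementation might resolve the request against the
// driver's capabilities, falling back rather than failing.
function resolveAAAttributes(requested, supportedMethods, maxSamples) {
  const method = supportedMethods.includes(requested.antialiasMethod)
    ? requested.antialiasMethod
    : 'msaa';
  const samples = Math.min(requested.antialiasSamples || 4, maxSamples);
  return { method, samples };
}

console.log(resolveAAAttributes(requestedAttributes, ['msaa'], 4));
// { method: 'msaa', samples: 4 }
```

The fallback-not-fail behavior mirrors how the existing `antialias: true` attribute is already only a hint.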
>>> Video production workstation video input textures
>>> Some of the GPUs used in these environments support direct video input ->
>>> texture (they usually sport multiple external video-in ports).
>>> As far as I could ascertain, this capability is exposed via proprietary
>>> extensions in these GPUs' drivers.
>>> Would there be any possibility of gaining access to these extensions?
Ideally there would be some better integration with the browser in
general (i.e., feeding the video through a video element), but
certainly let's get it working first and figure out how to best
specify it afterward.
>>> Dedicated multi-head GL output
>>> A common use case for hardware-accelerated video production is to dedicate
>>> one of the multi-head outputs of these GPUs to outputting the production
>>> signal on a dedicated head.
>>> This is usually achieved through interaction with GLX/WGL/etc.
>>> Would there be any possibility of getting this functionality for WebGL?
I think ideally this would be done for the entire browser somehow
rather than just the WebGL canvas in isolation.
>>> Streaming video textures
>>> For on-line video production it would be useful to be able to present
>>> full-HD video in WebGL. I know that there is an extension in the works
>>> (dynamic textures) for this in WebGL.
>>> Is there any progress on that?
Let's work with Mark Callow to move his WEBGL_dynamic_texture
extension to draft status and get initial implementations in place.
>>> Streaming video output
>>> For on-line production it would be useful to be able to grab frames of a
>>> WebGL framebuffer and encode them into a video stream to put onto a
>>> WebSocket/WebRTC connection etc. For off-line production this would
>>> probably just be raw frames, so no encoding would be required.
>>> Is there any possibility of introducing a video encoder to HTML that
>>> could perform the function of encoding video from a WebGL drawing buffer
>>> using specified codec/container/quality parameters?
The WebRTC spec is probably the way to go here -- either that or a
combination of WebGL and browser extensions.
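Whatever encoder API emerges, a frame grabber will have to deal with the fact that gl.readPixels returns rows bottom-to-top while encoders generally expect frames top-to-bottom. A minimal sketch of that flip (the helper name is mine):

```javascript
// Flip an RGBA pixel buffer vertically, e.g. to convert the bottom-up row
// order of gl.readPixels into the top-down order most encoders expect.
function flipRowsRGBA(pixels, width, height) {
  const rowBytes = width * 4;
  const flipped = new Uint8Array(pixels.length);
  for (let y = 0; y < height; y++) {
    const src = y * rowBytes;
    const dst = (height - 1 - y) * rowBytes;
    flipped.set(pixels.subarray(src, src + rowBytes), dst);
  }
  return flipped;
}

// 1x2 image: bottom row red, top row green -> green row first after flip.
const frame = new Uint8Array([255, 0, 0, 255,  0, 255, 0, 255]);
console.log(Array.from(flipRowsRGBA(frame, 1, 2)));
// [ 0, 255, 0, 255, 255, 0, 0, 255 ]
```

For off-line raw-frame output this flip (plus a pixel-format conversion, if needed) is essentially the whole job; for on-line use the encoder step would sit after it.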
>>> Video Timing
>>> It is often desirable to match a given rendering to a video. I know that
>>> there's some timing information on videos, but I also know it's not
>>> terribly precise.
>>> Is there any possibility of getting more precise video timing?
Before adding new APIs here we should see whether it's possible to
achieve the use cases with WEBGL_dynamic_texture plus possibly
HTMLVideoElement's existing time information plus DOMHighResTimeStamp.
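As a sketch of what the existing time information already allows: deriving a source frame index from HTMLVideoElement.currentTime (seconds) and a known frame rate. The epsilon papers over limited floating-point precision at frame boundaries, which is exactly where today's timing falls short:

```javascript
// Map a video playback time to a frame index, assuming constant frame
// rate. The epsilon guards against currentTime landing a hair below a
// frame boundary; its value here is an arbitrary choice for illustration.
function frameIndexAt(currentTimeSec, fps, epsilon = 1e-6) {
  return Math.floor(currentTimeSec * fps + epsilon);
}

console.log(frameIndexAt(0.5, 60));    // 30
console.log(frameIndexAt(1 / 30, 30)); // 1
```

If this turns out to be off by a frame in practice, that's the signal that a new API (or WEBGL_dynamic_texture's per-frame timestamps) really is needed.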
There is absolutely interest from browser vendors in making WebGL, and
the web platform in general, address more, and higher-end, use cases.