
Re: [Public WebGL] video/tv production issues

Here's a peanut gallery vote in support of the features. They might be a little marginal and vertical but they would be incredible proof of browser-as-platform.


On Tue, Mar 12, 2013 at 9:50 AM, Florian Bösch <pyalot@gmail.com> wrote:
That's OK. If this gig plays out, there's a possibility of patches toward those features. However, it also depends on this committee's willingness to standardize them and on UA vendors' willingness to accept the patches.

On Tue, Mar 12, 2013 at 5:42 PM, Gregg Tavares <gman@google.com> wrote:
I'm not trying to be snarky here, but I don't know what to say except "patches welcome".

There's nothing stopping those features from making it into WebGL. It's just a matter of time/people/effort. For example, the Web Audio API has recently been adding support for pro-audio features like multi-channel output (i.e., more than 2 channels).

On top of finding people with the time to implement the features, the specs have to be worked out, hopefully in a portable, cross-platform way.

For example, I would personally like to see 3D display support added, though that seems more appropriate as a CSS attribute first and WebGL support second.

On Tue, Mar 12, 2013 at 4:06 AM, Florian Bösch <pyalot@gmail.com> wrote:
I'm seeing some interest from the video/television industry in using WebGL.

For realtime production a typical setup would grab video frames off a monitor's output at the monitor's refresh frequency. The signal is then often converted to interlaced formats and the like. There are some difficulties with such a setup.

Frame Tearing
1) When the page is composited out of sync with the WebGL drawing
2) When the page is composited out of sync with the monitor's refresh frequency

I think (though I'm not entirely sure) that the first source of frame tearing is being addressed by canvas double buffering and improvements in requestAnimationFrame (correct me if I'm wrong).

As far as I'm aware, the second source of frame tearing isn't being addressed right now (the browser's recompositing isn't synced to the monitor's refresh frequency).

Is the above an accurate description of the issue, and is there any effort underway to solve it?
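To illustrate the second tearing source, here's a minimal sketch of a requestAnimationFrame-driven loop that at least detects when the compositor skips a refresh interval. The `framesMissed` helper and `refreshHz` parameter are my own illustrative names, and this only measures the symptom; it doesn't fix the sync problem.

```javascript
// Count whole refresh intervals skipped between two rAF callback timestamps.
// Pure logic: timestamps are in milliseconds, refreshHz is the display rate.
function framesMissed(prevTimestamp, timestamp, refreshHz) {
  const interval = 1000 / refreshHz;
  return Math.max(0, Math.round((timestamp - prevTimestamp) / interval) - 1);
}

// Sketch of a render loop using the helper (browser-only, not a spec'd API).
function startLoop(drawScene, refreshHz) {
  let prev = null;
  function tick(timestamp) {
    if (prev !== null && framesMissed(prev, timestamp, refreshHz) > 0) {
      console.warn('compositor skipped at least one frame');
    }
    drawScene(timestamp);
    prev = timestamp;
    requestAnimationFrame(tick);
  }
  requestAnimationFrame(tick);
}
```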

Refresh rate matching
Browsers implement a maximum framerate at which the WebGL canvas can be shown (the frequency at which they recomposite). This is usually capped at 60 Hz.

For realtime video production it is desirable to exactly match the output's refresh rate, for instance in these use cases:
  • To render time-interleaved stereoscopic video (usually each frame contains a marker). For 60 Hz stereoscopic video, say, a framerate of 120 Hz would be required.
  • To provide output intended for interlaced composition
  • To avoid frame tearing
Would it be possible to have the browser match the output's refresh rate exactly?
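Assuming the browser did composite at exactly 120 Hz, the stereo interleaving itself is trivial bookkeeping; the sketch below (all names are illustrative, nothing here is a spec'd API) shows the frame-to-eye mapping that such a mode would enable.

```javascript
// Map a display frame index to the eye it should show in time-interleaved
// stereo output: even frames left, odd frames right.
function eyeForFrame(frameIndex) {
  return frameIndex % 2 === 0 ? 'left' : 'right';
}

// At 120 Hz display output, each 60 Hz stereo source frame spans two
// display frames.
function sourceFrameForDisplayFrame(displayFrame) {
  return Math.floor(displayFrame / 2);
}
```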

Antialiasing control
Currently WebGL does not allow fine-grained control over antialiasing. Mostly one gets 4x MSAA (and sometimes no AA at all, as on OS X).

There are a number of advanced AA options exposed by GPU drivers (such as NVIDIA's TXAA) that deliver fully hardware-accelerated, movie-like antialiasing at a fraction of the cost of brute-force supersampling.

Would it be possible to get finer-grained control over AA?
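For context, the only portable knob today is the boolean `antialias` context attribute, which is a hint the implementation may ignore. A minimal sketch (helper names are mine) of requesting it and checking what was actually granted:

```javascript
// Build WebGL context attributes; `antialias` is the only AA control
// currently available, and it's a boolean hint, not a sample-count request.
function makeContextAttributes(wantAA) {
  return { antialias: wantAA, preserveDrawingBuffer: false };
}

// Browser-only: create a context and report whether AA was granted.
function createContext(canvas, wantAA) {
  const gl = canvas.getContext('webgl', makeContextAttributes(wantAA));
  if (gl) {
    console.log('antialias granted:', gl.getContextAttributes().antialias);
  }
  return gl;
}
```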

Video production workstation video input textures
Some of the GPUs used in these environments support direct video-input-to-texture transfer (they usually sport multiple external video-in ports).

As far as I could ascertain, this capability is exposed via proprietary extensions in these GPUs' drivers.

Would there be any possibility to gain access to these extensions?
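If such capabilities were ever surfaced, the natural entry point would be WebGL's standard `getExtension` mechanism. To be clear: no such extensions exist today, and any extension names one would probe for here are entirely hypothetical; only the probing pattern below is real.

```javascript
// Probe a list of (hypothetical) extension names and return the first one
// the implementation supports, or null. `gl` is any object exposing the
// standard getExtension(name) method.
function firstSupportedExtension(gl, names) {
  for (const name of names) {
    const ext = gl.getExtension(name);
    if (ext) return { name, ext };
  }
  return null;
}
```

Usage would look like `firstSupportedExtension(gl, [...vendor names...])`, with the actual names supplied by whatever vendor ever standardizes this.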

Dedicated multi-head GL output
A common use case in hardware-accelerated video production is to dedicate one of the GPU's multi-head outputs to the production signal.

This is usually achieved through interaction with GLX/WGL/etc.

Would there be any possibility to get this functionality for WebGL?

Streaming video textures
For on-line video production it would be useful to be able to present Full HD video in WebGL. I know that there is an extension in the works (dynamic textures) for this in WebGL.

Is there any progress on that?
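For reference, what already works (with per-frame copy overhead rather than true streaming) is the `texImage2D` overload that accepts an HTMLVideoElement. A hedged sketch, where `shouldUpload`/`uploadVideoFrame` are my own helper names:

```javascript
// HTMLMediaElement.HAVE_CURRENT_DATA: data for the current frame is available.
const HAVE_CURRENT_DATA = 2;

// Pure check: only upload once the video has decoded the current frame.
function shouldUpload(video) {
  return video.readyState >= HAVE_CURRENT_DATA;
}

// Browser-only: copy the video's current frame into a WebGL texture.
function uploadVideoFrame(gl, texture, video) {
  if (!shouldUpload(video)) return false;
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, video);
  return true;
}
```

A dynamic-texture extension would presumably avoid this per-frame copy.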

Streaming video output
For on-line production it would be useful to be able to grab frames from a WebGL framebuffer and encode them into a video stream to send over a WebSocket/WebRTC connection etc. For off-line production these would probably just be raw frames, so no encoding would be required.

Is there any possibility of introducing a video encoder to HTML that could encode video from a WebGL drawing buffer using specified codec/container/quality parameters?
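The raw-frame half of this is already possible via `readPixels`, as in the sketch below (helper names are mine); it's the encoding half that has no API today.

```javascript
// Bytes needed for one raw RGBA frame, one byte per channel.
function frameByteLength(width, height) {
  return width * height * 4;
}

// Browser-only: read the current framebuffer contents into a typed array,
// which could then be sent as a binary WebSocket message for off-line use.
function grabFrame(gl, width, height) {
  const pixels = new Uint8Array(frameByteLength(width, height));
  gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, pixels);
  return pixels;
}
```

At 1920x1080 that's about 8.3 MB per raw frame, which is why an in-browser encoder would matter for the on-line case.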

Video Timing
It is often desirable to match a given rendering to a video. I know that there's some timing information on videos, but I also know it's not terribly accurate.

Is there any possibility of getting more precise video timing?
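With only `video.currentTime` (seconds, limited accuracy) to work from, about the best one can do today is round to the nearest source frame, assuming the frame rate is known out of band. A sketch, with an illustrative helper name:

```javascript
// Derive the nearest source-frame index from a media timestamp in seconds.
// `fps` must be known out of band; currentTime's accuracy limits the result.
function frameIndexForTime(currentTimeSeconds, fps) {
  return Math.round(currentTimeSeconds * fps);
}
```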

Tony Parisi                             tparisi@gmail.com
CTO at Large                         415.902.8002
Skype                                     auradeluxe
Follow me on Twitter!             http://twitter.com/auradeluxe
Read my blog at                     http://www.tonyparisi.com/

Read my book! WebGL, Up and Running