I'm seeing some interest from the video/television industry in using WebGL.
For realtime production a typical setup grabs video frames off a monitor's output at the monitor's refresh frequency. The signal is then often converted to interlaced formats and the like. There are some difficulties in such a setup:
1) When the page is being composited out of sync with the WebGL drawing
2) When the page is being composited out of sync with the monitor's refresh frequency
I think (though I'm not entirely sure) that the first source of frame tearing is being addressed with canvas double buffering and improvements in requestAnimationFrame (correct me if I'm wrong).
As far as I'm aware the second source of frame tearing isn't being addressed right now (the browser's recompositing isn't synced to the monitor's refresh frequency).
Is the above an accurate description of the issue, and is there being any effort yet to solve it?
Browsers impose a maximum framerate at which the WebGL canvas can be shown (the frequency at which they recomposite). This is usually capped at 60 Hz.
For realtime video production it is desirable to exactly match the output's refresh rate, for instance for these use cases:
- To render time-interleaved stereoscopic video (usually each frame contains a marker). For, say, 60 Hz stereoscopic video, a framerate of 120 Hz would be required.
- To provide output intended for interlaced composition
- To avoid frame tearing
Would it be possible to have the browser match the output's refresh rate exactly?
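To illustrate the mismatch, here is a sketch that estimates the rate the browser actually composites at, using requestAnimationFrame timestamps. The helper function and the sample count of 120 are my own (hypothetical) choices; the median is used so dropped frames don't skew the estimate.

```javascript
// Pure helper: median of successive timestamp deltas, robust to dropped frames.
function estimateRefreshInterval(timestamps) {
  const deltas = [];
  for (let i = 1; i < timestamps.length; i++) {
    deltas.push(timestamps[i] - timestamps[i - 1]);
  }
  deltas.sort((a, b) => a - b);
  return deltas[Math.floor(deltas.length / 2)];
}

// Browser glue (only runs where requestAnimationFrame exists):
if (typeof requestAnimationFrame !== 'undefined') {
  const samples = [];
  function tick(now) {
    samples.push(now);
    if (samples.length < 120) {
      requestAnimationFrame(tick);
    } else {
      // ~16.7 ms on a 60 Hz composite; there is no way to request 120 Hz.
      console.log('compositing interval (ms):', estimateRefreshInterval(samples));
    }
  }
  requestAnimationFrame(tick);
}
```

Note this only measures the rate; nothing in the current API lets a page ask for a different one.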
Currently WebGL does not allow fine-grained control over antialiasing. Mostly one gets 4x MSAA (and sometimes no AA at all, as on OS X).
There are a number of advanced AA options exposed by GPU drivers (such as NVIDIA's TXAA) that deliver fully hardware-accelerated, movie-like antialiasing at a fraction of the cost of brute-force supersampling.
Would it be possible to get finer-grained control over AA?
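For comparison, the only standard knob today is the boolean `antialias` context-creation attribute; the browser picks the sample count (if any), and the usual fallback is manual supersampling into a larger framebuffer. A sketch, assuming nothing beyond the WebGL 1 context-attribute API:

```javascript
// The single standard AA knob: a boolean hint at context creation.
const attrs = { antialias: true };

// Pure helper for the manual-supersampling fallback: render at factor x the
// canvas size into an FBO, then downsample. Expensive compared to driver AA.
function supersampledSize(width, height, factor) {
  return { width: width * factor, height: height * factor };
}

// Browser glue (only runs where a DOM exists):
if (typeof document !== 'undefined') {
  const canvas = document.createElement('canvas');
  const gl = canvas.getContext('webgl', attrs) ||
             canvas.getContext('experimental-webgl', attrs);
  // gl.getContextAttributes().antialias reports what was actually granted;
  // there is no way to request a sample count or a specific AA algorithm.
}
```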
Video production workstation video input textures
Some of the GPUs used in these environments support direct video-input-to-texture transfer (they usually sport multiple external video-in ports).
As far as I could ascertain, this capability is exposed via proprietary extensions in these GPUs' drivers.
Would there be any possibility to gain access to these extensions?
Dedicated multi-head GL output
A common use case for hardware-accelerated video production is to dedicate one of these GPUs' multi-head outputs to the production signal on its own head.
This is usually achieved through interaction with GLX/WGL/etc.
Would there be any possibility to get this functionality for WebGL?
Streaming video textures
For on-line video production it would be useful to be able to present Full-HD video in WebGL. I know that there is an extension in the works for this (dynamic textures).
Is there any progress on that?
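In the meantime, the usual workaround is re-uploading the current frame each tick via the `texImage2D(HTMLVideoElement)` overload that WebGL 1 already has. A sketch; the function names and the `gl`/`texture`/`video` parameters are my own, and the power-of-two helper matters because WebGL 1 restricts mipmapping and REPEAT wrapping to power-of-two textures, which video frames (e.g. 1920x1080) rarely are:

```javascript
// Pure helper: WebGL 1 NPOT check (1920 and 1080 both fail it).
function isPowerOfTwo(n) {
  return n > 0 && (n & (n - 1)) === 0;
}

// Browser glue (hypothetical gl, texture, video objects):
function uploadVideoFrame(gl, texture, video) {
  gl.bindTexture(gl.TEXTURE_2D, texture);
  // NPOT video frames: clamp, linear filtering, no mipmaps.
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
  // Full copy of the current video frame every call -- the cost a dynamic
  // textures extension would presumably avoid.
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, video);
}
```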
Streaming video output
For on-line production it would be useful to be able to grab frames from a WebGL framebuffer and encode them into a video stream to put onto a WebSocket/WebRTC connection etc. For off-line production these would probably just be raw frames, so no encoding would be required.
Is there any possibility of introducing a video encoder to HTML that could encode video from a WebGL drawing buffer using specified codec/container/quality parameters?
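The raw-frame half of this is already possible, if slow, with `readPixels`; a sketch of what such an encoder would sit behind. One wrinkle worth noting: `readPixels` returns rows bottom-up, so most encoders need a vertical flip first. The function names and the hypothetical `gl` parameter are my own:

```javascript
// Pure helper: flip RGBA rows vertically (readPixels returns bottom-up rows).
function flipRowsY(pixels, width, height) {
  const rowBytes = width * 4; // RGBA, one byte per channel
  const out = new Uint8Array(pixels.length);
  for (let y = 0; y < height; y++) {
    const src = y * rowBytes;
    const dst = (height - 1 - y) * rowBytes;
    out.set(pixels.subarray(src, src + rowBytes), dst);
  }
  return out;
}

// Browser glue (hypothetical gl): grab one raw frame; encoding to a
// codec/container would have to happen after this, which is the missing piece.
function grabFrame(gl) {
  const w = gl.drawingBufferWidth, h = gl.drawingBufferHeight;
  const pixels = new Uint8Array(w * h * 4);
  gl.readPixels(0, 0, w, h, gl.RGBA, gl.UNSIGNED_BYTE, pixels);
  return flipRowsY(pixels, w, h);
}
```

For off-line production the returned buffer could go straight onto a WebSocket as raw frames; for on-line use the encoding step is what's missing.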
It is often desirable to match a given rendering to a video. I know that there's some timing information on videos, but I also know it's not terribly accurate.
Is there any possibility of getting more precise video timing?
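To make the problem concrete: about the best one can do today is snap `video.currentTime` to the nearest frame boundary at a known framerate, and hope the reported time isn't off by more than half a frame. A sketch, with the helper name and the `fps` parameter assumed rather than taken from any API:

```javascript
// Map a reported currentTime (seconds) to a frame index at a known fps.
// Only as accurate as currentTime itself, which is the whole problem:
// if currentTime is off by more than 1/(2*fps), this picks the wrong frame.
function frameIndexAt(currentTime, fps) {
  return Math.round(currentTime * fps);
}
```

A per-frame callback or presentation timestamp exposed by the video element would make this guesswork unnecessary.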