
Re: [Public WebGL] WEBGL_dynamic_texture extension proposal




On 19/07/2012 04:46, David Sheets wrote:
On Thu, Jul 12, 2012 at 12:54 AM, Mark Callow <callow_mark@hicorp.co.jp> wrote:

Ahh! I understand now. Would making this interface public be of use to
anything bar WebGL apps?
Is <http://www.khronos.org/registry/webgl/extensions/proposals/WEBGL_dynamic_texture/>
of use to anything except WebGL apps?
What I meant was: is it of use to anything but WEBGL_dynamic_texture?
The browser has a handful of metrics useful to the page for estimating
time-to-render latency. The page ultimately knows best, though,
especially with more elaborate texture pipelines and shader program
changes. Is passing a time offset in milliseconds to Acquire
insufficient?

Exposing machine-dependent time-to-render latency heuristics seems
like a separate interface (host profile) from dynamic texture binding.
Do the browsers currently expose these metrics to JS apps? If so, where is the API documented?
"webglDynamicTextureOnAcquire(WebGLRenderingContext, WebGLTexture)" is
probably not going to cause name collisions or other heartache. Why
not ask for forgiveness rather than permission?
I've done something like this in the latest draft. I've added dynamicTexture{Set,Get}ConsumerLatencyUsec methods that should ideally be on HTMLVideoElement or HTMLMediaElement. Currently they take an HTMLVideoElement as a parameter.
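As a rough sketch, this is how I imagine a page using them. Only the latency methods are from the draft; the acquire/release names and the element ids here are placeholders, not the draft's final API:

    var canvas = document.getElementById("c");           // placeholder canvas id
    var gl = canvas.getContext("webgl");
    var dt = gl.getExtension("WEBGL_dynamic_texture");
    var video = document.getElementById("v");            // placeholder <video> id

    // Tell the producer how long we expect between acquiring a frame and that
    // frame reaching the screen, so it can latch the frame that will look right.
    dt.dynamicTextureSetConsumerLatencyUsec(video, 33000);  // ~2 frames at 60Hz

    function drawFrame() {
      dt.dynamicTextureAcquireImage(video);   // placeholder name for the acquire call
      gl.drawArrays(gl.TRIANGLES, 0, 6);      // assumes program & buffers already set up
      dt.dynamicTextureReleaseImage(video);   // placeholder name for the matching release
      window.requestAnimationFrame(drawFrame);
    }
    window.requestAnimationFrame(drawFrame);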
If we have different sampler types (RGB and YUV), we have different
sampler types. The present 'samplerExternalOES' type conflates two
separate aspects of external textures: lack of mipmap/LOD support and
colorspace conversion.

Perhaps "samplerExternalYUV" should be introduced if you want to
expose YUV colorspace to shaders? A function 'convertColorspaceRGB'
could be provided to produce 'samplerExternalRGB' from
'samplerExternalYUV' or 'samplerExternalRGB' (or 'sampler2D' from
'sampler2D' (identity)).

Consider: what if I have two videos with two different colorspaces
that I alternately bind to the same sampler? What if an author wishes
to operate on the raw YUV data (or YIQ, HSL, HSV, xvYCC, YPbPr...)? If
HTMLVideoElement decodes into a number of different colorspaces and
the conversion functions are pushed into user shaders, the conversion
permutation issue is still present if the sampler types are not
disambiguated and different HTMLVideoElement source media are bound.
This is deliberate.

We don't want to expose the YUV colorspace to shaders because the fastest way to get video data into textures is hardware dependent. Web applications would not be able to specify what they get. On some platforms it might not even be possible to access the YUV data. Applications would become responsible for querying the colorspace of the external texture and providing the correct shader, which is not a trivial exercise. It is better for implementers to do it once than for every author to have to do it.

Also the raw YUV data is not currently exposed to JS applications and I don't think WebGL should expose it.

Lastly, if YUV were exposed in the shaders, people would push for YUV as the format and internal format for textures, which is not possible with current hardware.
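To make the authoring model concrete, here is a sketch of the shader an application would write: the sampler is opaque, so the shader never sees the colorspace. The samplerExternalOES type is from OES_EGL_image_external; the exact #extension spelling for WebGL is discussed further down, so treat the directive here as illustrative:

    var fragmentSrc = [
      "#extension GL_OES_EGL_image_external : require",  // spelling illustrative, see below
      "precision mediump float;",
      "uniform samplerExternalOES uVideo;  // YUV vs. RGB is the implementation's business",
      "varying vec2 vTexCoord;",
      "void main() {",
      "  gl_FragColor = texture2D(uVideo, vTexCoord);    // shader always receives RGBA",
      "}"
    ].join("\n");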

Is this (lack of) copy functionally observable? Why is a TEXTURE_2D
not equivalent to a paused video? 
A separate sampler type was chosen so that an implementation which wishes to insert code that does run-time selection of a shader branch to handle an external texture format can do so without burdening all texture accesses with that extra code.
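Purely as an illustration of the cost being avoided (no implementation is obliged to work this way, and every name below is made up): if external images were reachable through plain sampler2D, an implementation might have to rewrite every texture fetch into something like

    var injectedHelper = [
      "uniform bool uTexture0IsExternal;              // hypothetical per-unit flag",
      "vec4 sampleMaybeExternal(sampler2D s, vec2 uv) {",
      "  if (uTexture0IsExternal) {",
      "    return externalFetchAndConvert(s, uv);     // hypothetical external-format path",
      "  }",
      "  return texture2D(s, uv);                     // ordinary textures pay for the branch too",
      "}"
    ].join("\n");

whereas the separate sampler type confines any such code to external texture accesses.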
I suggest hyperlinking into the OES_EGL_image_external specification
that amends the OpenGL ES 2.0 API.
I did that in the latest draft.
Is conversion of all specifications into a standard hypertext format
on Khronos' agenda?
I don't think anyone has much enthusiasm for converting the roughly 700-page OpenGL specification from TeX to whatever you mean by "standard hypertext format." We did once plan to move all the specifications to DocBook format but the idea did not gain traction. From my own experience of using DocBook for a relatively simple document I can understand why. It can be an absolute nightmare to change even seemingly simple things.

Perhaps some specific generated text that explicitly states the usable
extension names in JS and in GLSL? OES_standard_derivatives has a
similar but less severe discrepancy between OES_standard_derivatives
and GL_OES_standard_derivatives.
This "discrepancy" is standard practice for Open GL {,ES} extensions. The
GL_ prefix is only added to C API names in lieu of namespaces or objects. It
is normal that the extension string returned in getExtensions has "GL_"
while the name used with #extension does not.  I don't think we should
change that practice.
What you describe appears to be the opposite of the "standard
practice" WebGL has adopted. Extension names in the JS API do not have
"GL_" while OES_standard_derivatives is called
"GL_OES_standard_derivatives" in GLSL
<http://www.khronos.org/registry/webgl/extensions/OES_standard_derivatives/>.
The proposal under discussion does not use the "GL_" prefix in GLSL.
You are quite correct about the standard practice. I was incorrect. The specifications I was looking at that did not have GL_ in the #extension strings were incorrect and have now been fixed.

I'm using #extension WEBGL_dynamic_texture with added aliases in the latest draft.
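For comparison, the two spellings as they would appear in WebGL shader source (the derivatives line is per the published WebGL extension; the dynamic-texture line follows the latest draft, with the usual GLSL behaviour token assumed):

    var derivativesDirective    = "#extension GL_OES_standard_derivatives : enable";
    var dynamicTextureDirective = "#extension WEBGL_dynamic_texture : enable";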

Regards

-Mark

--