
Re: [Public WebGL] WebGL Extensions



On Jun 3, 2010, at 11:16 AM, Gregg Tavares wrote:

> 
> 
> On Thu, Jun 3, 2010 at 11:07 AM, Oliver Hunt <oliver@apple.com> wrote:
> 
> On Jun 3, 2010, at 9:24 AM, Gregg Tavares wrote:
> 
>> 
>> 
>> On Thu, Jun 3, 2010 at 5:12 AM, Oliver Hunt <oliver@apple.com> wrote:
>> 
>> On Jun 3, 2010, at 4:47 AM, Steve Baker wrote:
>> 
>> > Vladimir Vukicevic wrote:
>> >> Hm, I thought we didn't have this which is where the confusion came from, but looks like Chris put it in a while ago:
>> >>
>> >> https://cvs.khronos.org/svn/repos/registry/trunk/public/webgl/doc/spec/WebGL-spec.html#5.14.14
>> >>
>> >> Maybe it's just missing a sentence or two at the end explaining that extensions are WebGL specific, and if WebGL is built on top of an underlying OpenGL driver, that driver's extensions will not necessarily be exposed?
>> >>
>> > IMHO, it is essential that WebGL does NOT expose underlying driver
>> > extensions by default.  The reason being one of security.
>> 
>> For what it's worth, the way that the JS/DOM bindings work in most (all?) browsers requires every exposed function to be defined and implemented explicitly -- it would not be possible for an implementation to automate the exposure of an arbitrary set of unknown extensions.
>> 
>> Plenty of extensions only enable new enums and new features in GLSL. No changes to the API are required, so it's very possible to automate the exposure of an arbitrary set of unknown extensions unless WebGL specifically prevents that exposure.
> 
> How does the runtime _know_ that an extension is only exposing a new enum? How would the GLSL validator _know_ what GLSL features existed due to an arbitrary extension? The implementation needs to handle every extension itself -- it can't do it automatically through the glGetExtensions API (or whatever it's called).
> 
> 
> In real GL the features just work if they exist. Querying is only there for your benefit.
> 
> In WebGL, the WebGL implementation has to do checking to make sure no extensions get passed through to the system's GL unless the user has explicitly called ctx.getExtension (a WebGL function, not a GL function) for that extension.
> 
> So in other words, passing GL_FLOAT to texImage2D fails in WebGL. The implementation explicitly checks for that.
> 
> If we add a floating point texture extension, then once the user calls ctx.getExtension("floating-point-textures") the WebGL code starts allowing GL_FLOAT to be passed to texImage2D.
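The gating Gregg describes can be sketched as a thin validation layer in front of the real GL. This is a hypothetical mock, not real WebGL implementation code: the extension string "floating-point-textures" is the placeholder name used in this thread, and the context here is a plain object standing in for a WebGL context.

```javascript
// Minimal sketch of the gating described above: the WebGL layer
// rejects GL_FLOAT for texImage2D until the page has explicitly
// called getExtension() for the (hypothetical) extension name.
// Enum values mirror the real GL constants.
const GL_UNSIGNED_BYTE = 0x1401;
const GL_FLOAT = 0x1406;

function makeContext() {
  // Types texImage2D accepts by default (core WebGL only).
  const allowedTexTypes = new Set([GL_UNSIGNED_BYTE]);
  return {
    getExtension(name) {
      if (name === "floating-point-textures") {
        allowedTexTypes.add(GL_FLOAT); // unlock the enum
        return {};                     // extension object
      }
      return null;                     // unknown or unsupported
    },
    texImage2D(type /* , ...pixel data, etc. */) {
      if (!allowedTexTypes.has(type)) {
        throw new Error("INVALID_ENUM: type not enabled");
      }
      // ...a real implementation would forward to the system GL here...
      return "ok";
    },
  };
}
```

Before getExtension is called, texImage2D(GL_FLOAT) throws; afterward the same call is forwarded, which matches the behavior described in the message above.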
> 
> The same thing goes for the validator. If we add support for GLSL 2.0, then you'll have to call ctx.getExtension("glsl-2.0"), and internally some new flags will be sent to the validator to allow GLSL 2.0 features to compile.

If an implementation supported, say, 3D textures, an author might want to simply supply the numeric value of TEXTURE_3D to avoid enabling the extension and getting an object which contains the enum. I hope such a thing is explicitly prohibited. I hope it will not be possible by any means to use an extension that has not first been enabled.
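Chris's concern is addressed as long as the implementation validates the numeric enum value itself, so hand-supplying the raw number cannot bypass getExtension. A sketch under the same hypothetical-mock assumptions as above ("3d-textures" is an invented extension name; 0x806F is TEXTURE_3D's value in desktop GL):

```javascript
// Sketch of the point above: because the WebGL layer checks the
// numeric enum value, passing the raw number for TEXTURE_3D
// (0x806F in desktop GL) fails exactly like the named constant
// would, until the extension has been enabled.
const GL_TEXTURE_2D = 0x0de1;
const GL_TEXTURE_3D = 0x806f;

function makeStrictContext() {
  // Targets bindTexture accepts by default (core WebGL only).
  const allowedTargets = new Set([GL_TEXTURE_2D]);
  return {
    getExtension(name) {
      if (name === "3d-textures") {        // hypothetical name
        allowedTargets.add(GL_TEXTURE_3D); // now the value is legal
        return {};
      }
      return null;
    },
    bindTexture(target, texture) {
      if (!allowedTargets.has(target)) {
        throw new Error("INVALID_ENUM: target not enabled");
      }
      return "bound";
    },
  };
}
```

Since the check is on the value, not on which identifier the script used to produce it, supplying 0x806F directly is rejected just the same.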

-----
~Chris
cmarrin@apple.com




