
Re: [Public WebGL] WEBGL_debug_shader_precision extension proposal



A specification's purpose is to describe a novel behavior of a piece of software/hardware, and how a user has to use those new capabilities. I believe that every specification for OpenGL, OpenGL ES and WebGL follows this idea, and I believe every extension does too (in any case, there is no functional difference between an extension and the core specification, since an extension modifies the specification). It follows that any extension has to pass the same muster as if it were to be included in the core specification, because that is where it might end up in due time.

The fixation on the 5-6-5 format has primarily one motivation: depth data happens to be 16-bit for arbitrary reasons, and 5-6-5 happens to be a 16-bit internal format, so the first idea anyone would have is to mash them together and call it a day. No wait, draft an extension for it, too. It's an "it happens to work" hack dressed up as an extension.

There are in fact other 16-bit formats that depth could be packed into:

- LUMINANCE_ALPHA with UNSIGNED_BYTE (8+8 bits)
- RGBA with UNSIGNED_SHORT_4_4_4_4
- RGBA with UNSIGNED_SHORT_5_5_5_1

And many more are coming with WebGL 2.0.

Why 5-6-5 should be superior to any of the others is a mystery to me. Personally, I think luminance alpha is more convenient, because conversion to a 0-1 scaled depth can then be done much more simply:

vec2 texel = texture2D(mydepth, texcoord).xw; // .x = luminance (high byte), .w = alpha (low byte)
float depth = texel.x + texel.y/255.0;        // high byte as coarse value, low byte as the fraction
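
As an aside, the upload side is just as simple. A minimal JavaScript sketch, assuming the depth values are already available as 16-bit unsigned integers (the function and variable names here are mine, purely for illustration):

function uploadDepthAsLuminanceAlpha(gl, depth16, width, height) {
    // Pack each 16-bit depth value into the two bytes of a LUMINANCE_ALPHA texel:
    // high byte -> luminance (.x in the shader), low byte -> alpha (.w).
    var bytes = new Uint8Array(width * height * 2);
    for (var i = 0; i < depth16.length; i++) {
        bytes[i * 2]     = depth16[i] >> 8;   // coarse value
        bytes[i * 2 + 1] = depth16[i] & 0xff; // fraction
    }
    var tex = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, tex);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.LUMINANCE_ALPHA, width, height, 0,
                  gl.LUMINANCE_ALPHA, gl.UNSIGNED_BYTE, bytes);
    return tex;
}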

The real problem is that this isn't a technical specification of a behavior. It is a specification of how the USER has to behave. It's a leaky abstraction. To my knowledge, there isn't any piece of specification or extension that resorts to a similar hack. It would set a precedent.

A WebGL (or OpenGL or ES) extension modifies the specification. So let's suppose you were to encode that behavior in the core specification (that would never fly). What would a core functionality likely do? Well, let's assume that, for whatever reason, you're lacking an appropriate internal format for the kind of data you'd like to store.

Introduce a new internal format (gl.DEPTH) and a new external format (gl.UNSIGNED_SHORT_DEPTH), and modify the gl.tex(Sub)Image2D calls to accept these parameters, such that you've fully described a useful internal format (a mipmappable, interpolatable, mixable, blendable, coverage-capable, renderable gl.DEPTH) and fully specified a transfer format (an unsigned short, 16 bits per pixel and per channel). You might introduce a new GLSL sampler type (samplerDepth) and a new texturing function (textureDepth), although that's a bit frowned upon I think; instead the behavior would probably be that texture2D just returns the depth on all channels of the returned vec4. Afaik, this is the kind of thing that could pass muster for inclusion in the core specification. And in fact, it has: all of those things have passed muster previously.
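
To make that concrete, usage would look something like the following JavaScript sketch. To be clear, gl.DEPTH and gl.UNSIGNED_SHORT_DEPTH are hypothetical enums that exist only in this thought experiment:

// Hypothetical core-style API; neither enum exists in WebGL today.
var tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.DEPTH, width, height, 0,
              gl.DEPTH, gl.UNSIGNED_SHORT_DEPTH, depth16Data);
// In the shader, texture2D(sampler, uv) would then return the depth value
// replicated across all four channels of the returned vec4.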

So I'm strictly against an extension that encodes an "it happens to work" behavior, that is less a technical specification than an external format description plus rules for how a user has to behave. That's not an extension, that's a hack.


On Wed, Nov 12, 2014 at 3:26 AM, Jeff Gilbert <jgilbert@mozilla.com> wrote:
I agree with Gregg.

I will add that if it's something that we feel is important enough as a working group, we could canonize the library and maintain it as part of our github repo.

-Jeff

----- Original Message -----
From: "Gregg Tavares" <khronos@greggman.com>
To: "Mark Callow" <khronos@callow.im>
Cc: "Florian Bösch" <pyalot@gmail.com>, "Jeff Gilbert" <jgilbert@mozilla.com>, "Olli Etuaho" <oetuaho@nvidia.com>, "Kenneth Russell" <kbr@google.com>, "public webgl" <public_webgl@khronos.org>
Sent: Tuesday, November 11, 2014 6:16:19 PM
Subject: Re: [Public WebGL] WEBGL_debug_shader_precision extension proposal

If this works just fine as a JavaScript library, why add it as an extension?

As an extension, what it does has to be precisely specified.
As an extension, it can't be upgraded without drafting and proposing a new
extension.
As an extension, it pushes all the work onto the browser vendors, who each
need to implement it.

As a library, it can be updated and extended whenever needed.
As a library, it only needs one implementation and everyone can use it.
As a library, it can do whatever it wants; no spec needed.

From the discussion above, it doesn't seem like it needs to be an
extension. It doesn't seem like there is some specific OpenGL functionality
that needs to be exposed to make it possible. It also doesn't sound like a
speed issue, given that the resulting shaders are up to 10x slower.

Also, as a library it should be easy to patch itself in the same way the
WebGL Inspector does, or the way various other libraries patch things
like WebGLRenderingContext.prototype.compileShader.
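
For reference, that patching technique is only a few lines. A minimal
sketch, where rewriteShaderSource is a placeholder for whatever
transformation the library applies:

var originalCompileShader = WebGLRenderingContext.prototype.compileShader;
WebGLRenderingContext.prototype.compileShader = function(shader) {
    // Fetch the source the app set, rewrite it, then compile the result.
    var source = this.getShaderSource(shader);
    this.shaderSource(shader, rewriteShaderSource(source)); // placeholder rewriter
    originalCompileShader.call(this, shader);
};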




On Tue, Nov 11, 2014 at 2:23 PM, Mark Callow <khronos@callow.im> wrote:

>
>
> > On Nov 12, 2014, at 7:19 AM, Florian Bösch <pyalot@gmail.com> wrote:
> >
> > What's wrong with it is that it does not allow you to isolate an issue
> > with any of your shader code buried somewhere in your application.
> >
>
> You have to find either the buried shader code or the buried call to
> compileShader for that shader. These efforts may or may not be much
> different, depending on the structure of your code. I would not object to
> supporting both an API toggle and a pragma, getting the best of both worlds.
>
> Regards
>
>     -Mark
>
>