A specification's purpose is to describe a novel behavior of a piece of software or hardware, and how a user can make use of those new capabilities. Every specification for OpenGL, OpenGL ES and WebGL follows this idea, and I believe every extension does too (in any case, there is no functional difference between an extension and the core specification, since an extension modifies the specification). It follows that any extension has to pass the same muster as if it were to be included in the core specification, because that is where it might end up in due time.
The fixation on the 5-6-5 format has primarily one motivation: depth data happens to be 16-bit for arbitrary reasons, and 5-6-5 happens to be a 16-bit internal format, so the first idea anyone would have is to mash them together and call it a day. No wait, draft an extension for it, too. It's an "it happens to work" extension.
There are in fact other 16-bit formats that depth could be packed into:
- unsigned byte and luminance alpha
- rgba and unsigned short 4-4-4-4
- rgba and unsigned short 5-5-5-1
And many more are coming with WebGL 2.0.
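To make the equivalence concrete, here is a minimal Python sketch of how a 16-bit depth value could be split across each of the 16-bit texel layouts listed above (function names are mine, purely for illustration):

```python
def pack_565(d16):
    """Split a 16-bit depth into 5-6-5 fields (r, g, b)."""
    return (d16 >> 11) & 0x1F, (d16 >> 5) & 0x3F, d16 & 0x1F

def pack_4444(d16):
    """Split a 16-bit depth into 4-4-4-4 fields (r, g, b, a)."""
    return (d16 >> 12) & 0xF, (d16 >> 8) & 0xF, (d16 >> 4) & 0xF, d16 & 0xF

def pack_la(d16):
    """Split a 16-bit depth into luminance (high byte) and alpha (low byte)."""
    return (d16 >> 8) & 0xFF, d16 & 0xFF
```

Every one of these is lossless for a 16-bit value; none is technically privileged over the others.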
Why 5-6-5 should be superior to any of the others is a mystery to me. Personally, I think luminance alpha is more convenient, because conversion to a 0-1 scaled depth can then be done much more simply:
vec2 texel = texture2D(mydepth, texcoord).xw;
float depth = texel.x + texel.y/255.0;
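The decode above implies a base-255 packing on the CPU side (since `texel.x` and `texel.y` are already scaled by 1/255 on sampling). A minimal Python sketch of the matching encode; the function names and the base-255 convention are my inference from the snippet, not part of any spec:

```python
def encode_depth_la(depth):
    """Encode a normalized depth in [0, 1] into (luminance, alpha) bytes
    such that the shader decode `texel.x + texel.y/255.0` recovers it."""
    v = min(round(depth * 65025), 65025)   # 65025 = 255 * 255
    return v // 255, v % 255               # high / low base-255 digits

def decode_depth_la(lum, alpha):
    """Mirror of the GLSL decode, with byte values scaled back to [0, 1]."""
    return lum / 255.0 + (alpha / 255.0) / 255.0
```

The round trip is exact to within one part in 65025, which is roughly the precision a 16-bit depth buffer offers anyway.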
The real problem is that this isn't a technical specification of a behavior. It is a specification of how the USER has to behave. It's a leaky abstraction. To my knowledge, no other piece of specification or extension resorts to a similar hack. It sets a precedent.
A WebGL (or OpenGL or ES) extension modifies the specification. Suppose you were to encode that behavior in the core specification (that would never fly). What would core functionality likely look like? Assume that, for whatever reason, you lack an appropriate internal format for the kind of data you'd like to store.
You would introduce a new internal format (gl.DEPTH) and a new external format (gl.UNSIGNED_SHORT_DEPTH), and modify the gl.tex(Sub)Image2D calls to accept these parameters, so that you've fully described a useful internal format (a mipmappable, interpolatable, mixable, blendable, coverage-capable, renderable gl.DEPTH) and fully specified a transfer format (an unsigned short, 16 bits per pixel and per channel). You might introduce a new GLSL sampler type (samplerDepth) and a new texturing function (textureDepth), although that's a bit frowned upon I think; more likely, texture2D would simply return the depth on all channels of the returned vec4. Afaik, this is the kind of thing that could pass muster for inclusion in the core specification. And in fact, it has: all of those things have passed muster previously.
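Under those hypothetical names (to be clear: gl.DEPTH and gl.UNSIGNED_SHORT_DEPTH are the invented enums from the paragraph above, not constants in any shipped spec), the upload would look like any ordinary texture upload, sketched here in pseudocode:

```
// Hypothetical enums, otherwise the standard WebGL 1 texImage2D signature:
// (target, level, internalformat, width, height, border, format, type, pixels)
gl.texImage2D(gl.TEXTURE_2D, 0, gl.DEPTH,
              width, height, 0,
              gl.DEPTH, gl.UNSIGNED_SHORT_DEPTH,
              new Uint16Array(width * height));
```

The point being: the user writes plain depth values and the driver owns the representation, instead of the user hand-packing bits to match an arbitrary color format.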
So I'm strictly against an extension that encodes an "it happens to work" behavior, and that is less a technical specification than an external format description plus a prescription for how the user has to behave. That's not an extension, that's a hack.