Apparently this pass-through is what happens today, but the default render buffers are supposed to be linear. That is the expectation/understanding behind the original fixed-function lighting equations and the math for the blending operations.

The problem with attempting to support sRGB as a drawing buffer in WebGL arises from the sRGB texturing specification, which states that a value read from an sRGB texture (as in texture2D) is converted to linear space, precisely because the default render buffer is supposed to be linear. If browsers are indeed passing the data through, effectively turning the buffer into an sRGB buffer, then what you have proposed is indeed implementable.
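To make the texturing behavior concrete, here is a minimal sketch (in Python rather than GLSL, just for illustration) of the piecewise sRGB-to-linear decode that the sRGB texturing specification mandates on fetch, using the standard IEC 61966-2-1 constants:

```python
def srgb_to_linear(c: float) -> float:
    """Decode one sRGB-encoded channel value in [0, 1] to linear space.

    This is the conversion the spec says happens automatically when a
    shader samples (texture2D) from an sRGB texture.
    """
    if c <= 0.04045:
        # Linear segment near black avoids an infinite slope at zero.
        return c / 12.92
    # Power-curve segment for the rest of the range.
    return ((c + 0.055) / 1.055) ** 2.4

# Encoded mid-gray 0.5 decodes to roughly 0.214 in linear light,
# i.e. about 21% of full intensity, not 50%.
mid = srgb_to_linear(0.5)
```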
The question then is how many browsers would return something other than an sRGB curve from the getGammaRamp query. Whether it is a worthwhile addition for WebGL 2 depends on the answer; for WebGL 1 it seems like a worthwhile addition.
Browsers do not composite and operate in linear space. Evidence: http://www.4p8.com/eric.brasseur/gamma.html (neither Photoshop nor GIMP passes this test either).

Hmm. I need to do some more research, including finding a copy of “A ghost in a snowstorm”, but it is my understanding that all blending should be done in a physically linear space. NVIDIA’s recent GL_NV_blend_equation_advanced extension, which provides the blending functions specified by vector-graphics standards such as PDF and SVG, does all of its blending in linear space.
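The gamma test above can be sketched numerically. The following hedged example (a sketch, not anyone's actual compositor code) shows why blending directly on sRGB-encoded values and blending in linear light disagree, using the standard IEC 61966-2-1 transfer functions:

```python
def srgb_to_linear(c: float) -> float:
    """Decode an sRGB-encoded channel in [0, 1] to linear light."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c: float) -> float:
    """Encode a linear-light channel in [0, 1] back to sRGB."""
    if c <= 0.0031308:
        return c * 12.92
    return 1.055 * c ** (1 / 2.4) - 0.055

black, white = 0.0, 1.0

# Blending directly on encoded values (what a non-linear-aware
# compositor does): a 50/50 mix of black and white gives encoded 0.5,
# which emits only ~21% of full intensity when displayed.
naive = 0.5 * black + 0.5 * white

# Physically correct blend: decode to linear, mix, re-encode.
# The result (~0.735 encoded) is the gray that actually emits
# half the light, which is what passes the checkerboard test.
correct = linear_to_srgb(0.5 * srgb_to_linear(black) +
                         0.5 * srgb_to_linear(white))
```

The gap between the two results (0.5 vs. roughly 0.735) is exactly what the checkerboard-downscaling test on the linked page makes visible.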