
Re: [Public WebGL] WEBGL_compressed_texture_s3tc_srgb



And I need to add: a true solution to the entire problem would be to pass linear-space 16-bit-per-channel luminance values to the GPU, and have the GPU negotiate with the display device how they are to be reproduced. It is a true solution because an LDR 16-bit-per-channel value is sufficiently fine-grained to avoid banding issues, and the GPU negotiating reproduction with the display is an abstraction transparent to the application programmer, so it is ideal for reducing complexity.
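To put rough numbers on the banding claim, here is a back-of-the-envelope sketch. The ~1% just-noticeable contrast threshold and the 1% shadow luminance are illustrative assumptions, and relativeStep is a name of my choosing:

```javascript
// With N quantization levels in *linear* space, banding is visible wherever
// the relative step size (Weber fraction) at a given luminance exceeds the
// contrast the eye can detect, commonly taken to be around 1%.
function relativeStep(bits, luminance) {
  const step = 1 / (Math.pow(2, bits) - 1); // size of one quantization step
  return step / luminance;                  // Weber fraction at that luminance
}

const shadow = 0.01; // a dark tone at 1% of full scale (assumed)
const jnd = 0.01;    // ~1% just-noticeable contrast difference (assumed)

console.log(relativeStep(8, shadow));  // ~0.39, far above the JND: visible banding
console.log(relativeStep(16, shadow)); // ~0.0015, well below the JND: no banding
```

This is why 8 bits per channel only works in a non-linear (gamma-encoded) space, whereas 16 bits per channel is fine-grained enough to stay linear end to end.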

On Wed, Jun 15, 2016 at 12:00 PM, Florian Bösch <pyalot@gmail.com> wrote:
On Wed, Jun 15, 2016 at 11:45 AM, Mark Callow <khronos@callow.im> wrote:
On Jun 15, 2016, at 5:23 PM, Florian Bösch <pyalot@gmail.com> wrote:

 
If the value of FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING is SRGB then the values will be compressed to non-linear before being written to the framebuffer.
Otherwise they are written as is.
This is incorrect. You need to distinguish between framebuffers and framebuffer objects.

It is correct and you do not need to distinguish between framebuffers and FBOs. Both default framebuffers created with an EGL 1.5 setting for EGL_GL_COLORSPACE of EGL_GL_COLORSPACE_SRGB and FBOs having an sRGB texture as color attachment will have a FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING of sRGB.
It is incorrect to throw them together, for a simple reason. EGL specifies that values in an sRGB framebuffer are read out in sRGB space. This would extend to a browser as well, the browser being a "compositor". However, WebGL 1/2 have no explicit framebuffer setup, so the point is moot: to the browser, the front framebuffer is never sRGB. An sRGB texture (such as you would attach to a framebuffer object), however, is always read out as linear on lookup. So even if you somehow managed to smuggle an sRGB texture to the browser as the frontbuffer (which you can't), the browser would use texture2D to look up that texture upon compositing and would get linear values. The browser, being colorspace agnostic, would then pipe those linear values straight into everything else it composites with, which is in non-linear space, which is incorrect.
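For illustration, the decode the GPU applies on every sRGB texture fetch follows the standard sRGB transfer function, which can be sketched in plain JavaScript (the function name is my own; the constants are those of the sRGB specification):

```javascript
// sRGB -> linear: what texture2D effectively does per channel when sampling
// an sRGB texture. Input and output are normalized to [0, 1].
function srgbToLinear(c) {
  return c <= 0.04045
    ? c / 12.92                              // linear toe segment
    : Math.pow((c + 0.055) / 1.055, 2.4);    // power-law segment
}

console.log(srgbToLinear(0.5)); // ~0.214: sRGB mid-gray is ~21% linear light
```

So a compositor that samples such a texture and blits the result unmodified next to non-linear content has, by construction, mixed two incompatible spaces.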

If you ever were to support sRGB front buffers in WebGL, the browser would have to manually re-encode into sRGB space the linearly read-out values from the sRGB texture it uses as a stand-in for a frontbuffer for the WebGL context (the OS isn't going to help it any with that). As it stands, that capability does not exist, so the application programmer has to do that job, and the job is identical: re-encode the linear values read out from the sRGB texture to sRGB and blit them as non-linear values onto the WebGL front buffer. This is why the distinction matters: the browser (or the application programmer) can only manually emulate EGL's colorspace-aware behavior with a colorspace-agnostic bitmap surface. This is a problem you do not have in native EGL, because EGL is specified to handle this, so a native programmer doesn't have to care.
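The re-encode job described above is the inverse of the sRGB texture decode; a sketch of the standard linear-to-sRGB encoding an application would apply per channel in its final blit pass (function name is my own; the constants are those of the sRGB specification):

```javascript
// linear -> sRGB: the manual re-encode the application must perform before
// blitting to the (non-linear) WebGL front buffer. Values in [0, 1].
function linearToSrgb(c) {
  return c <= 0.0031308
    ? 12.92 * c                              // linear toe segment
    : 1.055 * Math.pow(c, 1 / 2.4) - 0.055;  // power-law segment
}

console.log(linearToSrgb(0.214)); // ~0.5: 21% linear light back to sRGB mid-gray
```

In practice this would sit in the fragment shader of the final fullscreen blit, applied to each color channel before the value lands in the front buffer.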

If you do not make this distinction clearly, and correctly, you will end up with garbage in your WebGL frontbuffer.

 
The canvas color space proposal together with sRGB rendering support is intended to resolve these and other color space issues.
It will not solve the underlying issue that everything sent to a display is sent in sRGB, because the display is forced to accept only sRGB; that is simply how it came into existence.

One of the drivers for the proposal is support of HDR and wide gamut displays. It is no longer true that everything sent to a display is sRGB.
To my knowledge this is still the case throughout the entire stack (display IC, wire protocol, GPU, driver, OS and application). That does not mean wide-gamut displays do not exist, but what gets shunted to them always goes through the limited straw of sRGB at 8 bits per channel.

Any application that stretches its contrast across the range of available values (0 - 255) always uses the maximum available gamut of the display device, regardless of what that gamut is. The reproduction just doesn't match, of course (but wide-gamut displays are advertised to the consumer as "prettier", and that is in fact how they look; whether it is a faithful reproduction, consumers don't seem to care much about, even though we do).