Re: [Public WebGL] gamma/color profile and sRGB

I should perhaps better explain the problem.


Support sRGB well. Don't make sRGB mandatory. Support unmodified output. Support getGammaRamp/onGammaRamp.

deficiencies and history of sRGB, and why it should be supported well but not made mandatory

sRGB was created at a time when most displays were crap. By that I mean they were dim, had poor contrast and a miserable gamut. Consequently it represents a colorspace that attempts to make things look good on the population of miserable displays that existed at the time, accepting that better displays were getting the worst end of the deal. That was nearly 20 years ago. Displays have since gotten a good deal better, and it starts to show. sRGB isn't a bad choice for an output colorspace, but it's by no means a terribly good one either.

A particularly problematic area for sRGB is that its prime motivation was consumer-grade CRTs from 20 years ago. The assumptions it made about the inevitable gamma ramp of those displays no longer hold for most displays in use (because they're LCDs or OLEDs). In many of these cases, the "gamma ramp" of a display is purely emulated by the display driver (by which I mean a piece of hardware embedded in the display that performs the conversion).

So having established that sRGB isn't the best there is, it's hopefully easy to see why there should be support for alternative gamma ramps/colorspaces, and why there should be a way to pass colors that are not assumed to be sRGB out to the display. In fact, no OpenGL standard, despite supporting sRGB backbuffers, ever made them mandatory, for precisely that reason. sRGB is basically legacy. Everybody knows it's going to go the way of the dodo eventually.

However, sRGB does have a lot of momentum, and it's a reasonable default choice. It's a choice that doesn't require any extra thought on the developer's side, and for that reason, supporting it well is important. WebGL currently doesn't support it well (because it can't indicate that its output is sRGB). The proverbial best choice to indicate this would be an sRGB drawing buffer, simply because an sRGB drawing buffer supports appropriate blending and anti-aliasing (in linear space) and stores the result in sRGB appropriately. An alternative would be for application developers to output sRGB themselves, but there really isn't much point to that.
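To make the blending point concrete, here is a small sketch in plain JavaScript. The srgbEncode/srgbDecode helpers are the standard IEC 61966-2-1 transfer functions, and the 50/50 average stands in for what the GPU does during blending and anti-aliasing; everything else here is illustrative, not any real API.

```javascript
// Standard sRGB transfer functions (IEC 61966-2-1).
function srgbEncode(linear) {
  return linear <= 0.0031308
    ? linear * 12.92
    : 1.055 * Math.pow(linear, 1 / 2.4) - 0.055;
}

function srgbDecode(srgb) {
  return srgb <= 0.04045
    ? srgb / 12.92
    : Math.pow((srgb + 0.055) / 1.055, 2.4);
}

// Blend black and white the wrong way: averaging the sRGB-encoded
// values directly, as happens when blending into a plain buffer
// that holds sRGB data.
const a = srgbEncode(0.0), b = srgbEncode(1.0);
const wrong = (a + b) / 2;  // 0.5 in encoded space

// The right way: decode to linear light, blend there, re-encode.
// This is what an sRGB drawing buffer does in hardware.
const right = srgbEncode((srgbDecode(a) + srgbDecode(b)) / 2);

console.log(wrong.toFixed(3));  // 0.500
console.log(right.toFixed(3));  // 0.735 -- noticeably brighter
```

The two results differ by almost a quarter of the output range, which is why blending and anti-aliasing need to happen in linear space.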

alternative color space acquisition

There are situations in which a better colorspace can be discovered automatically. I don't know exactly what those are right now, but I'm confident they exist. For this reason alone, a good way to hand an application developer an alternative colorspace to use is important. This could be done by introducing a WhateverRGBDrawingBuffer or something, but that would require support inside of OpenGL, as blending and anti-aliasing would need to account for it. It's unlikely support for such a scheme would arrive on any GPU anytime soon, so an alternative to that idea is needed. One alternative is to pass the application developer an appropriate parametrization of a colorspace to use for encoding colors. That is the getGammaRamp/onGammaRamp idea. It sidesteps the issues of sRGB without requiring some fancy new GPU capability. And obviously, if no better colorspace can be discovered, the returned gamma ramp should be sRGB.

Of course this requires a little more work on the application developer's side, because they have to use the ramp texture for color translation. It's not a lot more work, but it's some, and for that reason this shouldn't be the only choice, or even the default one.
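What that could look like on the application side, sketched in plain JavaScript. To be clear about the assumptions: getGammaRamp is the proposed API and does not exist anywhere, so it is faked here with a plain gamma-2.2 curve, and applyRamp models on the CPU what would really be a 256x1 ramp-texture lookup in the fragment shader.

```javascript
// Hypothetical sketch of the getGammaRamp idea; no real WebGL API here.
function getGammaRamp() {
  // Pretend the browser discovered a gamma-2.2 display response.
  const ramp = new Float32Array(256);
  for (let i = 0; i < 256; i++) {
    ramp[i] = Math.pow(i / 255, 1 / 2.2);  // linear index -> encoded value
  }
  return ramp;
}

// Translate one linear color channel through the ramp, the way a
// shader would with a texture fetch on a 256x1 ramp texture.
function applyRamp(ramp, linear) {
  const i = Math.min(255, Math.max(0, Math.round(linear * 255)));
  return ramp[i];
}

const ramp = getGammaRamp();
console.log(applyRamp(ramp, 0.5).toFixed(2));  // 0.73, i.e. pow(0.5, 1/2.2)
```

If no better colorspace is discovered, the same ramp array would simply contain the sRGB encode, and the application code would not change at all.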

color calibration issues and user settings

Displays as they leave the factory are often not calibrated very well, and users might change the brightness, contrast, color temperature or any other setting that's exposed to them. For these reasons, the quality of rendering, even with a perfect colorspace, would still be suboptimal in some cases.

A common way to deal with this problem is to offer the user some adjustment parameters inside the application; things such as gamma, brightness and contrast are often found. It's easy to see why: even a high-end system like the PS4 cannot guarantee it has the right colorspace for a given display, and the user always has the power to change the display's settings anyway.

So if you made sRGB mandatory, user adjustments would first be mangled into the ill-fitting sRGB colorspace, only to be mangled through the ill-fitting user/factory settings, hopefully coming out at the display more or less as the application developer intended. That's not a good way to do it: you'd be computing gamma(gamma), and it's nigh impossible to get anything useful out of that. To get a "clean" user setting for gamma, you'd have to reverse encode to sRGB (such that the sRGB conversion would cancel out). Of course this would introduce banding, and mess with blending and anti-aliasing.
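The banding from that reverse-encode trick can be put in rough numbers with a plain JavaScript sketch. Assumptions: the standard sRGB decode function, and an 8-bit drawing buffer that quantizes whatever the application writes into it.

```javascript
// Sketch of why "reverse encode so the sRGB step cancels" bands.
function srgbDecode(s) {
  return s <= 0.04045 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}
const quantize8 = (v) => Math.round(v * 255) / 255;

// The app wants level i/255 to reach the display unchanged, so it
// pre-applies srgbDecode (the inverse of the mandatory sRGB encode)
// before the 8-bit store. Count how many distinct stored codes survive.
const survivors = new Set();
for (let i = 0; i < 256; i++) {
  survivors.add(quantize8(srgbDecode(i / 255)));
}
console.log(survivors.size);  // well under 256: many levels collapse

// The collapse is worst in the darks: the seven darkest input levels
// all land on stored code 0.
console.log(quantize8(srgbDecode(6 / 255)));  // 0
```

Dozens of the 256 input levels merge before they ever reach the display, and since the pre-inverted values are what blending and anti-aliasing operate on, those are mangled too.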

If you want to support your own colorspace for whatever reason (because a user thought it necessary to drag some sliders around, or whatever), it is quite useful not to heap any more colorspace conversions on top of it, for the sake of sanity/consistency/quality.

On Sat, Nov 22, 2014 at 3:59 AM, Mark Callow <khronos@callow.im> wrote:

Florian missed one important factor in scenes looking different on different displays: the viewing conditions. Some high-end displays have sensors they use to account for this. For this and other reasons, I do not think application developers should be trying to second-guess the display calibration. If the factory can’t get it right (as Florian postulates), how on earth can you expect a developer, who has never even seen the display in question, to get it right? BTW, display response curves in this age of LCD and LED displays are largely artificial, so variations are either small or deliberate. The latter would be accounted for in the color profiles.

GLFW does not support every platform. I suspect there will be some platforms where WebGL is supported that do not have an equivalent to getGammaRamp in the underlying OS. In that case, what to do, return a standard sRGB response curve?

I think the solution to this is to be able to create sRGB back-buffers and let the rest of the system’s color management do its stuff to map that standard sRGB data to the actual display. Yes, this can’t be done for WebGL 1.0 but once WebGL 2 is out, WebGL 1.0 will only linger on older mobile devices that cannot be updated.

I suggested in another e-mail thread that WebGL 2 should support only sRGB back-buffers. I can’t see any reason to support physically linear back-buffers as all OpenGL ES 3 hardware supports sRGB rendering and textures and most displays are close to sRGB. All you succeed in doing when rendering to a linear back-buffer is waste bits on data people can’t see and introduce banding. It is better to let OpenGL {,ES} perform the linear calculations in higher precision and compress the result to sRGB when storing in the frame buffer.
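The precision argument in that last paragraph can be illustrated with a quick plain-JavaScript sketch (the sRGB encode is the standard transfer function; the 1% threshold is just a convenient cutoff for "dark"): count how many 8-bit codes each storage format spends on the darkest 1% of linear light, where the eye is most sensitive to banding.

```javascript
// How many 8-bit codes cover linear light below 1%?
function srgbEncode(l) {
  return l <= 0.0031308 ? l * 12.92 : 1.055 * Math.pow(l, 1 / 2.4) - 0.055;
}

const linearCodes = Math.round(0.01 * 255);              // linear 8-bit buffer
const srgbCodes   = Math.round(srgbEncode(0.01) * 255);  // sRGB 8-bit buffer

console.log(linearCodes, srgbCodes);  // 3 vs 25
```

A linear 8-bit buffer gives the darkest shades only a handful of codes (hence the banding), while sRGB storage spends roughly an eighth of its codes there, which is the point of computing in higher precision and compressing to sRGB on store.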