
Re: [Public WebGL] WEBGL_compressed_texture_s3tc_srgb



I know what gamma correction is and why it exists :) But one thing is missing from your argument: between the buffer a WebGL app is rendering to and the screen a user looks at, there's a browser or, to be more accurate, a compositor. And it may expect the WebGL buffer to be linear in order to do, for example, blending or scaling. Also, it isn't specified whether the compositor itself gamma-corrects its output.
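For instance, a 50/50 blend of black and white comes out differently depending on which space the blend is done in. Here's a minimal TypeScript sketch using the standard sRGB transfer functions (the 50/50 blend is just a stand-in for whatever blending or scaling the compositor actually does):

function srgbToLinear(c: number): number {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

function linearToSrgb(c: number): number {
  return c <= 0.0031308 ? c * 12.92 : 1.055 * Math.pow(c, 1 / 2.4) - 0.055;
}

// Two sRGB-encoded values the compositor might blend 50/50, e.g. when
// scaling the canvas or compositing it against the page behind it.
const a = 0.0; // black
const b = 1.0; // white

// Blending the stored values directly, i.e. treating the buffer as linear:
const blendedAsStored = (a + b) / 2; // 0.5

// Decoding to linear light, blending, and re-encoding:
const blendedInLinear = linearToSrgb((srgbToLinear(a) + srgbToLinear(b)) / 2); // ~0.735

console.log(blendedAsStored, blendedInLinear); // visibly different greys

So whether the result looks right depends entirely on which of the two the compositor assumes.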


14.06.2016, 13:08, "Florian Bösch" <pyalot@gmail.com>:
On Tue, Jun 14, 2016 at 11:20 AM, Kirill Dmitrenko <dmikis@yandex-team.ru> wrote: 
If you're referring to WebGL (and WebGL 2, for that matter), is there any info backing this statement? Because it doesn't seem to be specified in either spec, and, moreover, browsers seem to treat image data differently (both data received from a server and data rendered with WebGL). There was an endeavour to specify browsers' behaviour regarding gamma correction when decoding images for textures and when compositing WebGL content into a page. But as far as I can tell, it has never gotten anywhere.

Gamma exists for two related, but ultimately independent reasons.
  1. CRT display technology (mankind's first display technology) used phosphorescence to produce light. To produce a picture, an electron beam emitted by what is known as the electron gun was electromagnetically guided over the display panel. The electron source was the cathode ray tube, a form of amplifying vacuum tube. These tubes (also known as triodes) have a nonlinear relationship between the voltage applied and the electrons emitted. 
  2. The human eye, under normal lighting conditions, has a response to light that distinguishes shades of dark more easily than shades of bright.
These two facts combined in a lucky union to create what is known as "gamma space". Lucky, because it allows a form of compression of the signal (be it analog or digital) where a smaller portion of the signal is used to express differences in higher luminances and more is used to express differences in lower luminances, hence improving the quality of the display, while simultaneously building the necessary "decoding" into the physics of the display device at no additional cost.
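As a rough illustration of how much of the encoded signal goes to the dark end (using the sRGB curve defined below as a stand-in for a typical display gamma; the 5% cutoff is an arbitrary example):

function linearToSrgb(c: number): number {
  return c <= 0.0031308 ? c * 12.92 : 1.055 * Math.pow(c, 1 / 2.4) - 0.055;
}

// With an 8-bit encoded signal, see how many code values cover the
// darkest 5% of linear luminance.
const code = Math.round(linearToSrgb(0.05) * 255);
console.log(code); // ~63, i.e. roughly a quarter of the 256 codes are
                   // spent on just the darkest 5% of linear light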

Variance in gamma curves (and display gamuts) became a problem, however. And by 1996, Microsoft and HP, in collaboration with many other industry partners, introduced sRGB: a combined colorspace and gammaspace designed to fit the CRTs of the period. This became the de-facto standard for color exchange between devices and CRTs.

Due to a complete lack of colorspace and gamma-curve management between devices and displays, this later turned out to be a problem for alternative displays such as TFTs, which did not share the CRT-inherent technical reasons for gamma and gamut, and which in fact considerably surpassed CRTs in luminance response, luminance accuracy and gamut reproduction.

However, because TFTs entered a field dominated by CRTs, all serving a de-facto standard below the TFTs' technical abilities, TFTs had no choice but to adopt an (electronic) emulation of sRGB, because that was the signal they received from devices.

To this day there does not exist any way for displays and devices to interoperate to discover gamma curves, gamut coverage and precision. Consequently, there does not exist any way for applications to communicate those properties to the host operating system.

For this reason, graphics APIs leave this property wholly unspecified, and it is implied that what you output has to be in gamma space, which most probably means sRGB.
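In practice that means something like the sketch below: do the shading math in linear space and apply the sRGB encoding yourself at the end of the fragment shader (assuming a plain WebGL 1 default framebuffer with no sRGB render-target support; the varying name is made up):

const fragmentShaderSource = `
  precision mediump float;
  varying vec3 v_linearColor; // shading result in linear light

  // Standard sRGB encoding curve, applied per channel.
  vec3 linearToSrgb(vec3 c) {
    return mix(c * 12.92,
               1.055 * pow(c, vec3(1.0 / 2.4)) - 0.055,
               step(vec3(0.0031308), c));
  }

  void main() {
    // Whatever consumes the canvas is assumed to treat these bytes as
    // gamma-encoded (sRGB), so encode before writing them out.
    gl_FragColor = vec4(linearToSrgb(v_linearColor), 1.0);
  }
`;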


--
Kirill Dmitrenko
Yandex Maps Team