
Re: [Public WebGL] Gamma correction and texImage2D/texSubImage2D

>> 2. What should the default value of this flag be? If it were false,
>> then for images uploaded from the browser to WebGL, the default
>> behavior would be for the pixels to be completely untouched. However,
>> this might be surprising behavior to applications displaying images on
>> screen with very simple shader code (no lighting) and expecting them
>> to look the same as surrounding browser content. Still, I am inclined
>> to suggest that the default for the new flag be false to match most
>> other OpenGL behavior, where anything other than a pass-through of
>> data is optional and disabled by default.
> I think it would be useful to have the unlit case behave the same as
> rendering to a 2D canvas, which would gamma correct. I believe the
> differences in the lit case would be subtle and it's only if authors are
> trying to be very precise that they will care. In that case, they can turn
> it off. But my opinion on this is not strong.

Look, I hate to keep going on about this - but this is REALLY important and
you're going down completely the wrong track.  I have to make yet another
impassioned plea NOT to fuck this up for web rendering for all time.

If you gamma-correct in the compositor, as I recommend, then everything
works right - if you do it ANYWHERE else, then you've provided a default
behavior that is mathematically and visually incorrect.  That would be a
ridiculous decision.  I've seen some bad misunderstandings and
mis-implementations of gamma correction in the 25 years I've been doing 3D
- but the idea of deliberately building incorrect gamma math into a major
new graphics standard is quite indefensible - particularly since the
"right" solution is easier and more efficient!

If you "correct" the gamma on the input images - then that implies that
you're NOT going to correct it on the output canvas.  If that's the case,
then even correctly written software that turns off that input processing
will fail to have the right gamma on output... to fix that, I'd have to do
an additional post-effect rendering pass on my final output.  But that's a
total waste, because the browser is going to do a final-final pass anyway
when it composites my image into the final screen.  THAT is where the
gamma should be corrected.  It's mathematically the right place - it's the
right place for performance, because I don't have to run another pass over
the screen to correct the gamma - and it means that every application
(even the badly written ones by uncaring developers) will get nicely
corrected gamma.

The differences are far from subtle.  Go play any video game with a dark
gloomy setting without corrected gamma and you'll immediately see the
difference because you'll pretty much be staring at a black screen!

You seem to be under the misapprehension that lighting is the only issue
here.  That couldn't be further from the truth.

Even in the simplest, "unlit" case - if you specify a 50% blend of two
textures (A and B) or a 50% alpha-blend overlay of one pre-corrected image
over another - the answer will be flat out WRONG because:

    pow(A/2 + B/2, gamma)  !=  pow(A,gamma)/2 + pow(B,gamma)/2

In what way is that "subtle"?  It's basic arithmetic - and if you plug in
a realistic set of values for A, B and gamma, you'll see that the
numerical errors are extreme over some ranges of A and B.  If you want,
I'll do the analysis for you - but to be honest it's pretty damned
obvious.
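To save anyone the trouble, here is a quick numeric sketch of that
inequality (Python; gamma = 2.2 and the sample values A = 0, B = 0.5 are
my own illustrative choices - the helper names aren't from any API):

```python
GAMMA = 2.2

def correct(x):
    """Gamma-encode a linear value for display (linear -> encoded)."""
    return x ** (1.0 / GAMMA)

def blend_then_correct(a, b):
    # Mathematically right: average in linear space, encode once at the end.
    return correct((a + b) / 2.0)

def correct_then_blend(a, b):
    # What pre-correcting the inputs gives you: an average of encoded values.
    return (correct(a) + correct(b)) / 2.0

a, b = 0.0, 0.5          # a black texel blended with a mid-grey one
right = blend_then_correct(a, b)
wrong = correct_then_blend(a, b)
print(right, wrong, abs(right - wrong))
```

For a black texel blended with a mid-grey one, the two orderings disagree
by nearly 17% of full scale - visually enormous in a dark scene.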

And it's not just to do with lighting.  ANY linear operator (add,
subtract, multiply, divide, lerp, texture interpolation, etc.) has to
happen in linear space, before gamma encoding, because that's the space
the hardware operates in and we can't change that.

Pick an even simpler example: if your WebGL code does nothing more than
rotate your image 10 degrees or stretch it by 10% before drawing it, then
- because the bilinear texel blend that the hardware does operates in
linear color space - a pre-gamma'd image will alias.
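A back-of-the-envelope version of that filtering problem (Python; gamma =
2.2 and the black/white texel pair are illustrative assumptions on my
part):

```python
GAMMA = 2.2
encode = lambda x: x ** (1.0 / GAMMA)   # linear -> gamma-encoded
decode = lambda x: x ** GAMMA           # gamma-encoded -> linear (display)

texels_linear = [0.0, 1.0]              # adjacent black and white texels

# Correct: filter in linear space, encode once for display.
right = encode(sum(texels_linear) / 2.0)

# Pre-encoded texture: the hardware's filter averages the encoded values,
# and the result goes to the display as-is.
texels_encoded = [encode(t) for t in texels_linear]
wrong = sum(texels_encoded) / 2.0

# Perceived linear intensity of each result after the display's gamma.
print(decode(right), decode(wrong))     # 0.5 vs. roughly 0.22
```

A sample taken exactly halfway between a black and a white texel should
read as 50% linear intensity; filter the pre-encoded values instead and
the display shows roughly 22% - which is exactly the kind of error that
shows up as aliasing along filtered edges.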

Since both WebKit and Firefox are moving to use OpenGL for compositing,
sticking the gamma fix into the compositing shader is a freebie.  Now you
don't have to mess with the incoming textures, and all of the math works
correctly in both the simple case AND the lit case.
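For clarity, the pipeline I'm arguing for looks like this in sketch form
(Python stand-ins for the real shader stages - app_render() and
composite() are names I've made up, and the toy compositor just averages
its layers):

```python
GAMMA = 2.2   # could be a per-device or user setting, e.g. different for print

def app_render(a, b):
    # WebGL content does all of its math in linear space (here, a 50% blend).
    return (a + b) / 2.0

def composite(layers):
    # Browser compositor: combine linear layers (toy version: average them),
    # then gamma-encode exactly once, at the very end.
    linear = sum(layers) / len(layers)
    return linear ** (1.0 / GAMMA)

out = composite([app_render(0.2, 0.4), 0.6])
print(out)
```

The only place the exponent appears is the compositor, so swapping 2.2
for a printer's curve touches nothing in the application.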

Suppose we're rendering to a device that needs a very different gamma (an
inkjet printer, for example).  The JavaScript code has no way to find out
what the correct gamma for the device is - and even if it could, with the
proposed 'fix' it would have to re-download all of the textures and have
WebGL re-convert them.  But doing it in compositing means that the browser
can provide a user option to set the desired gamma - defaulting to 2.2 for
a typical display and to whatever it needs to be for printing.

Doing it in the compositor is pretty much free, mathematically correct,
convenient for printing - and it applies to every application by default.
Doing what you are proposing is just bad mathematics, inept graphics and
an incredibly poor standardization decision.

We can allow fucked up math as an option - but there is no way it should
be the default.

Please, let's do this right.

  -- Steve
