
Re: [Public WebGL] Gamma correction and texImage2D/texSubImage2D

Kenneth Russell wrote:
> On Fri, Sep 3, 2010 at 11:39 AM,  <steve@sjbaker.org> wrote:
>>> El 3 de septiembre de 2010 08:47, Chris Marrin <cmarrin@apple.com>
>>> escribió:
>>>> I think it would be useful to have the unlit case behave the same as
>>>> rendering to a 2D canvas, which would gamma correct. I believe the
>>>> differences in the lit case would be subtle and it's only if authors are
>>>> trying to be very precise that they will care. In that case, they can
>>>> turn it off. But my opinion on this is not strong.
>>> I agree that the unlit case should ideally behave the same as
>>> rendering to a 2D canvas. However, as Steve points out, this would be
>>> much better implemented as a context creation attribute that the
>>> compositor could respect.  It could default to having gamma correction
>>> turned on.
>>> Additionally, if you need a packing flag for texture loads, I think
>>> the most useful operation is the opposite of the one proposed--to
>>> transform non-linear input textures into the appropriate linear space
>>> for lighting.  Using non-linear textures as storage and input arguably
>>> gives you more color resolution in the dark part of the spectrum, so
>>> it might be useful to support that.  D3DSAMP_SRGBTEXTURE is an example
>>> of this sort of texture load flag.
>>> -enne
>> Yes - the reverse operation (to turn a pre-gamma-corrected image into a
>> linear color space texture) is much more useful - especially in an
>> environment where JPEG images are common and we might wish to take as
>> input other things that the browser has generated that might already be
>> gamma corrected.
>> At first sight, fixing pre-gamma'd images back to linear seems do-able in
>> the shader.
>> (Since the gamma operation is  Vout=pow(Vin,1.0/2.2) - the inverse of that
>> is Vin = pow(Vout,2.2)...which you can approximate as pow(Vout,2.0) -
>> which is Vin=Vout*Vout).
>> However, you can only do that after texture-lookup - and because that
>> entails a bunch of linear interpolations, you shouldn't really be doing
>> that in gamma-space.  So there is certainly justification for reversing
>> the gamma correction as the texture is loaded.  Moreover, many image file
>> formats actually tell you what gamma they were stored with - so the loader
>> could do a really excellent job by honoring that number.
> Based on your above descriptions and the above discussion I'm well
> convinced that the default behavior should not be to apply gamma
> correction to images uploaded via tex{Sub}Image2D. I don't yet
> understand what we'll need to do in order to support this though.
> For RGB(A) PNGs, is the desired behavior to simply pass through the
> pixel values in the file without regard to any gamma information in
> the file or the screen gamma? Or is the conversion to a linear color
> space more complex?
> I don't know where all of the places are in WebKit code which may end
> up modifying pixel values during image loading. Here's the code from
> WebKit's PNGImageDecoder.cpp that sets up the gamma in the PNG reader.
>     // Gamma constants.
>     const double cMaxGamma = 21474.83;
>     const double cDefaultGamma = 2.2;
>     const double cInverseGamma = 0.45455;
>     // Deal with gamma and keep it under our control.
>     double gamma;
>     if (png_get_gAMA(png, info, &gamma)) {
>         if ((gamma <= 0.0) || (gamma > cMaxGamma)) {
>             gamma = cInverseGamma;
>             png_set_gAMA(png, info, gamma);
>         }
>         png_set_gamma(png, cDefaultGamma, gamma);
>     } else
>         png_set_gamma(png, cDefaultGamma, cInverseGamma);
> If we want to pass through the data unmodified, would we want to call
> png_set_gamma(png, 1.0, 1.0)? Similarly, to convert to linear space,
> would we want to pass 1.0 instead of cDefaultGamma?
> I see nothing in the JPEGImageDecoder related to gamma. Is anything
> needed for this file format? I suspect people will not use JPEGs for
> anything they expect to be passed through verbatim to WebGL, such as
> encoding non-color information in the color channels of a texture.
> Do we need three values for this pixel storage attribute (pass through
> data verbatim, convert to linear space, and perform gamma correction)?
> Similarly, it sounds like we need another context creation attribute
> to optionally gamma correct WebGL's rendering results before placing
> them on the page?
> -Ken
> P.S. Steve, your earlier email is my favorite ever.
Well the current rules are something like this:

* The PNG file format stores things in linear color space.  If you plan
to display them on a NON-gamma-corrected medium, then you need to apply
gamma correction...which (I presume) is what that snippet of code you
presented actually does.

* The JPEG format stores things in gamma space (because that allows
denser lossy compression).  When you simply display a JPEG (as is
typically the case in a browser), you don't do anything more to
it...which is why you can't find anything in the JPEG decoder.

However, that's only true when you're going to do NOTHING whatsoever to
the image on its way to the display.  If you plan to do (linear) math on
it (blending, MIPmapping, lighting, etc.) then you have to have
everything in linear color space, because our hardware can't do that
stuff in gamma space.

So what we need to do is pass things in linear color space to the
shaders - let the graphics pipeline do its thing in linear color space
- and then, at the very end of the process, perform gamma correction.
In the case of WebGL, doing gamma correction in the compositor is
virtually a freebie (providing everyone carries through with plans to do
compositing using the GPU).

Hence, for 3D rendering (or anything else that requires processing of
the image data) we have to reverse those two rules:

* PNG files will not need any processing...they are in linear color
space, they go through the shader as linear...then are gamma corrected
(I propose) in the final compositor stage.  Very clean and efficient,
with minimal roundoff issues and mathematically correct gamma handling.

* Sadly, JPEG files are now a problem...they are stored in gamma space -
so we must REVERSE-gamma-correct them as we load them: 
Vout=pow(Vin,2.2); ...to turn them back into linear space - then pass
them through the shader in linear space - and finally, gamma correct the
result in the compositor using Vout=pow(Vin,1/2.2);

This preserves the existing behavior - but does the gamma correction
AFTER all of the linear processing and not before...HOORAY!!  That's a
massive quality win for PNG (and most other image file formats) - but
it's not so great for JPEG.  Doing reverse-gamma then rendering then
doing forward-gamma is not nice.  But (roundoff error aside) this does
preserve the current "canvas" behavior.

However, vanilla JPEG is highly ill-suited for use as 3D texture maps -
for many other reasons.  (My ancient rant on this subject is here:

Other file formats are trickier.  GIF (IIRC) makes no comment about
gamma - you can't tell whether it's gamma-corrected or not.  BMP is just
an unholy mess - a BMP can be just a wrapper for a JPEG or a PNG or some
Microsoft-specific mess.  Without a lot of messy decoding it could be
just about anything...a typical Microsoftian pig's breakfast!  TGA files
are normally in linear space - but there is an extension that supports
gamma-corrected files.  I don't think I've ever seen one that used it.

I think if you reverse-gamma JPEG files and leave everything else alone,
you'll be OK.

  -- Steve
