Re: [Public WebGL] Gamma correction and texImage2D/texSubImage2D



On Sat, Sep 4, 2010 at 9:45 PM, Steve Baker <steve@sjbaker.org> wrote:
>
> I agree that there are undoubtedly going to be issues with roundoff
> error and precision.  In the short term, I'll do what I've always done:
> instruct my artists to paint textures in 'linear' color space and to
> have their monitors gamma color-calibrated every six months to ensure
> that they're seeing those textures optimally.

What format & depth are your artists saving their work in?  And then
what texture format(s) do you convert to?  How are you planning to get
this data into WebGL?

There are some relevant constraints on WebGL:

* browsers (currently) work with 8-bit sRGB_Alpha color buffers, so
that's the format that WebGL output ends up in.  I don't think WebGL
1.0 can realistically spec anything else for output.  (In the future,
perhaps browsers will be able to handle linear color spaces at higher
bit depths, but I don't know if anybody is seriously considering that
yet, so I doubt there's any point in spec'ing anything in WebGL 1.0.)

* WebGL has texImage2D calls that take raw buffers of data.  The
current (default) behavior passes this data straight through to the
texture, so any automatic gamma treatment would be a change.
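
For concreteness, a minimal sketch of that raw-buffer path (assuming
gl, width and height are set up elsewhere); the bytes in the array
reach the texture untouched, with no gamma interpretation applied:

    var pixels = new Uint8Array(width * height * 4);  // app-supplied RGBA
    gl.bindTexture(gl.TEXTURE_2D, gl.createTexture());
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
                  gl.RGBA, gl.UNSIGNED_BYTE, pixels);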

* browsers can load images in PNG and JPEG format, which are most
typically 8-bit sRGB.  WebGL's behavior when using these formats is
definitely worth spec'ing.
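
For comparison, the image-element path looks like this; the browser
decodes the file first, so whatever color handling it applies happens
before WebGL ever sees the data ("texture.png" is a stand-in for a
real asset):

    var img = new Image();
    img.onload = function () {
      gl.bindTexture(gl.TEXTURE_2D, gl.createTexture());
      gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA,
                    gl.RGBA, gl.UNSIGNED_BYTE, img);
    };
    img.src = "texture.png";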

* PNG has the ability to specify a per-image gamma value (the gAMA
chunk referenced earlier).  Browsers appear to handle this
differently; see this reference page, at "Images with gamma chunks":
http://www.libpng.org/pub/png/pngsuite.html  On the Mac I'm using
right now, Chrome and Safari do not do gamma correction, while Firefox
does.  You can also clearly see the quantization errors in the Firefox
images with the lower gamma values.  The Chrome and Safari behavior is
(arguably) a bug.
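
If you want to check a given browser yourself, one quick probe is to
draw one of those gamma test images into a 2D canvas and inspect the
decoded bytes (g03n2c08.png below is one of the low-gamma images from
that page):

    var img = new Image();
    img.onload = function () {
      var canvas = document.createElement("canvas");
      canvas.width = img.width;
      canvas.height = img.height;
      var ctx = canvas.getContext("2d");
      ctx.drawImage(img, 0, 0);
      // If the browser honored the gAMA chunk, these bytes will
      // differ from the values stored in the file.
      var px = ctx.getImageData(0, 0, 1, 1).data;
      console.log(px[0], px[1], px[2], px[3]);
    };
    img.src = "g03n2c08.png";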

* PNG can also store 16 bits per color component.  However, my
understanding is that current browsers take all input images
(including PNG images with 16-bit color depth) and convert them to
sRGB_Alpha internally, before WebGL has a chance to see the data.
Also, the WebGL spec does not appear to have any texture formats that
have more than 8 bits per color component.  This would be a great
thing to improve, post WebGL 1.0, since hi-fi WebGL apps could make
good use of it.
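
In the meantime, one workaround for 16-bit data uploaded through the
raw-buffer path (images won't work, since they get flattened to 8 bits
as described above) is to split each value into high and low bytes
across two 8-bit channels and recombine in the shader.  A sketch, with
the caveats that this forfeits hardware filtering and that highp
fragment-shader support is optional:

    var fragmentSource =
      "precision highp float;\n" +
      "uniform sampler2D u_tex;\n" +
      "varying vec2 v_uv;\n" +
      "void main() {\n" +
      "  vec4 t = texture2D(u_tex, v_uv);\n" +
      "  // R holds the high byte, G the low byte\n" +
      "  float v = (t.r * 255.0 * 256.0 + t.g * 255.0) / 65535.0;\n" +
      "  gl_FragColor = vec4(vec3(v), 1.0);\n" +
      "}\n";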

It seems to me there are two unresolved questions for WebGL 1.0:

1) Should WebGL attempt to nail down how browsers are supposed to
handle PNGs with a non-default gAMA value?  Viable options here are:

  a) leave it up to the browser (status quo, but behavior may differ
among browsers; in practice apps will have to supply sRGB data and do
any additional conversions themselves).

  b) demand conversion to sRGB_Alpha based on PNG metadata (i.e.
converge on current Firefox behavior; however, non-sRGB behavior will
be a corner case for browsers, and smart WebGL developers may opt to
always supply sRGB data and do any conversions themselves).

  c) demand passing raw data straight through, with 16-bit components
rounded or truncated to 8 bits (i.e. converge on current WebKit
behavior, with caveats similar to option b).

2) Should WebGL add a PixelStore option that does some kind of gamma
conversion?  (This is where the thread started.)  IMO the status quo
(do nothing) is pretty much fine.  Apps that want a specific
interpretation of their image data can either pre-process it so the
raw data matches what they want in texture RAM, or else do custom
processing on the GPU with the existing facilities.
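
As an example of the GPU route, a rough sRGB-to-linear decode can be
done at sampling time with the common 2.2 power-curve approximation
(the exact piecewise sRGB function is sketched further down):

    var fs =
      "precision mediump float;\n" +
      "uniform sampler2D u_tex;\n" +
      "varying vec2 v_uv;\n" +
      "void main() {\n" +
      "  vec4 t = texture2D(u_tex, v_uv);\n" +
      "  // approximate sRGB -> linear; alpha is left alone\n" +
      "  gl_FragColor = vec4(pow(t.rgb, vec3(2.2)), t.a);\n" +
      "}\n";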

>> I believe EXT_texture_sRGB is designed to make it easier to be "linear
>> correct" under these circumstances.  It basically allows you to tell
>> OpenGL that your 8-bit texture data is in sRGB format, and that you
>> want OpenGL to do an 8-bit-sRGB to higher-precision-linear-RGB conversion
>> when sampling the data from a shader.  Your shader will operate on
>> linear data (in floating point format), and the frame buffer is still
>> your shader's problem.
>>
> Yes - but it's an extension that we can't rely on always being
> implemented (I doubt ANGLE could emulate it either).

It can be trivially implemented on any hardware that has an internal
texture format with at least 12 bits per component: just convert the
data to linear and store it in the higher-depth format.  The practical
problem is that it wastes texture RAM, hence the preference to have
lookup tables in the GPU.
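
To make "convert the data to linear" concrete, the decode is the usual
piecewise sRGB function; here's a sketch of the per-component math,
scaled to a hypothetical 16-bit-per-component internal format:

    function srgbByteToLinear16(b) {
      var c = b / 255.0;
      var lin = (c <= 0.04045) ? c / 12.92
                               : Math.pow((c + 0.055) / 1.055, 2.4);
      return Math.round(lin * 65535.0);  // 16-bit linear component
    }

Done once at upload time, this keeps enough precision in the dark end
of the curve that 8-bit quantization artifacts don't reappear.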

-T
