
Re: [Public WebGL] Gamma correction and texImage2D/texSubImage2D

Thatcher Ulrich wrote:
> On Sat, Sep 4, 2010 at 2:09 PM, Steve Baker <steve@sjbaker.org> wrote:
>> So can we agree on this?
>> 1) The WebGL color space shall be clearly defined to be a linear color
>> space.
>> This is essential for things like cross-platform shader code
> compatibility - and it's what all GPUs do anyway so it's no extra
>> imposition.
> Hm.  I'm uneasy about this.  AFAIK OpenGL doesn't mandate a color
> space.  By default it defines its filtering operations *as if*
> everything is linear, but in practice, for framebuffer formats with
> 8-bit color components, the output is generally treated as sRGB.  That
> means the typical colorspace-ignorant app is taking sRGB input data,
> filtering it as if it is linear, and putting the results in an sRGB
> framebuffer to be displayed on an sRGB monitor.  Status quo.
> If you care about doing your lighting and filtering in true linear
> space, then you need to take some special measures (i.e. figure out
> how to get linear data out of textures with acceptable fidelity, and
> do a final linear-to-sRGB step when outputting to a framebuffer with
> 8-bit components).  WebGL should not diverge from OpenGL here -- the
> same special measures should apply to both WebGL and OpenGL.
OpenGL can afford not to explicitly state the color space because it
steadfastly does not involve itself with how files are loaded or how the
windowing system treats the output.  It does such-and-such arithmetic on
such-and-such data without ever saying what the data is.  (Although some
parts of the spec make a very strong implication that it's linear - and
> most of the examples in the Red Book and Orange Book rely on that being true).


WebGL includes file loading operations and ties itself very firmly to
one particular output mechanism (the canvas system) which in turn ties
down the nature of output operations.  By adding that into the OpenGL
spec, we've opened this particular can of worms...hence this entire issue.

What I'm advocating doesn't prevent people from being blissfully
ignorant of all of the gamma/color-space issues - they can turn off all
of the file loader color space conversion and the consequences will be
very much what they are if you don't take care to do it right in OpenGL
on the desktop.  You get kinda crappy graphics that you can sorta get
away with unless you're trying to make a really accurate simulation or a
gorgeous-looking AAA game title.   It's the difference between winding
up with something that looks like Mario64 - or something that looks like
Red Dead Redemption.
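To put some arithmetic behind that: here's a small sketch (plain
JavaScript, not WebGL - the transfer functions follow the IEC 61966-2-1
sRGB definition) of what colorspace-ignorant filtering actually does at
a black-to-white texel boundary:

```javascript
// sRGB transfer functions per IEC 61966-2-1 (components in 0..1).
function srgbToLinear(c) {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}
function linearToSrgb(c) {
  return c <= 0.0031308 ? c * 12.92 : 1.055 * Math.pow(c, 1 / 2.4) - 0.055;
}

// Colorspace-ignorant filtering: average the raw sRGB values of a black
// and a white texel, then let the sRGB monitor decode the result...
const naive = srgbToLinear((0.0 + 1.0) / 2);   // ~0.214 in linear light

// ...versus correct filtering: decode first, average in linear space.
const correct = (srgbToLinear(0.0) + srgbToLinear(1.0)) / 2;   // 0.5
```

The naive midpoint comes out less than half as bright as it should be -
which is exactly the "kinda crappy" darkened-edge look described above.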

I agree that there are undoubtedly going to be issues with roundoff
error and precision.  In the short term, I'll do what I've always done -
instruct my artists to paint textures in 'linear' color space and to
have their monitors gamma-calibrated every six months to ensure
that they're seeing those textures optimally.  That's what professional
graphics organizations do.  One day, we'll finally see the end of 16-bit
color and less use of 24-bit color - and start to use 36-bit, 48-bit or even
floating-point color for high dynamic range rendering (I do this already
in some applications).  Since WebGL will still be around then, we need
to get the specification right now.   HDR lighting shows up these kinds
of problems in sharp relief - and almost all 'realistic' 3D games use it
- it's the way of the future for sure.  When colors lie outside the 0..1
range, you really can't do things with sRGB.
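A quick sketch of why (plain JavaScript; the encode function is the
standard IEC 61966-2-1 one, with the clamp that an 8-bit sRGB target
forces on you):

```javascript
// linear-to-sRGB encode is only defined on 0..1, so an 8-bit sRGB
// channel has to clamp anything brighter than white.
function linearToSrgb(c) {
  c = Math.min(Math.max(c, 0.0), 1.0);   // sRGB has no HDR headroom
  return c <= 0.0031308 ? c * 12.92 : 1.055 * Math.pow(c, 1 / 2.4) - 0.055;
}

// An HDR highlight four times brighter than white...
const highlight = 4.0;
// ...stored through an 8-bit sRGB channel is indistinguishable from white:
const stored = Math.round(linearToSrgb(highlight) * 255);
const white  = Math.round(linearToSrgb(1.0) * 255);
```

Both come out as byte 255 - the highlight is simply gone, which is why
HDR pipelines keep linear floating-point data all the way to the final
tone-mapping step.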

So this is important and we can't just dismiss the matter by saying -
"Well, OpenGL doesn't do it".  OpenGL doesn't have to do it...we do.
> WebGL definitely should not do any automatic or default color space
> conversion that OpenGL doesn't do.  In particular, converting from
> 8-bit-sRGB to 8-bit-linearRGB loses fidelity, so it's not something
> that should ever be done without somebody asking for it specifically.
OpenGL doesn't do it because it doesn't load files.  WebGL must do it
because it provides file loaders to the application.  I agree that we
mustn't force people to accept conversions - I've been very careful to
say that they must be optional.

I would actually prefer that they were disabled by default - but I can
live with having them enabled by default if that's the consensus here. 
Perhaps the best thing is not to have a default at all - to require a
non-optional token in the file loader command that says either "I want
color-space conversions" or "I don't want conversions".   If there is no
default, everyone has to think carefully about what they want.  But I
don't particularly care.   So long as I can choose to turn them off, I'm happy.
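To illustrate the "no default" idea, here's a hypothetical loader
wrapper (the names loadTexturePixels and convertToLinear are invented
for this sketch - they're not from any WebGL draft):

```javascript
// Hypothetical "no default" design: the caller must state an explicit
// colorspace policy, or the call fails.  Nothing here is real WebGL API.
function srgbToLinear(c) {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

function loadTexturePixels(srgbBytes, options) {
  if (!options || typeof options.convertToLinear !== "boolean") {
    throw new Error("caller must pass {convertToLinear: true|false}");
  }
  if (!options.convertToLinear) {
    return Array.from(srgbBytes);   // pass the file data through untouched
  }
  // Convert each 8-bit sRGB value to 8-bit linear (lossy at this depth).
  return Array.from(srgbBytes, b => Math.round(srgbToLinear(b / 255) * 255));
}
```

The point of the mandatory flag is just that forgetting it is a hard
error rather than a silent colorspace decision made on your behalf.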
> I believe EXT_texture_sRGB is designed to make it easier to be "linear
> correct" under these circumstances.  It basically allows you to tell
> OpenGL that your 8-bit texture data is in sRGB format, and that you
> want OpenGL to do an 8-bit-sRGB to higher-precision-linear-RGB conversion
> when sampling the data from a shader.  Your shader will operate on
> linear data (in floating point format), and the frame buffer is still
> your shader's problem.
Yes - but it's an extension that we can't rely on always being
implemented (I doubt ANGLE could emulate it either).   Also the
extension is poorly written - it gives the implementation the choice to
filter the raw sRGB values directly instead of decoding to linear first
('correct' sRGB filtering) - and provides no way for the application to
tell which behavior it got.
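That said, the two permitted behaviors are far enough apart that an
application could probe for them at startup - render a two-texel
black/white sRGB texture, sample its midpoint, and see which value comes
back.  This is just the arithmetic of such a probe (plain JavaScript),
not a published conformance test:

```javascript
function srgbToLinear(c) {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Implementation decodes texels to linear, then interpolates (correct):
const decodedThenFiltered = (srgbToLinear(0.0) + srgbToLinear(1.0)) / 2; // 0.5
// Implementation interpolates the raw sRGB values, then decodes (permitted):
const filteredThenDecoded = srgbToLinear((0.0 + 1.0) / 2);               // ~0.214

// A readback threshold halfway between the two tells them apart:
function filtersCorrectly(sampledMidpoint) {
  return sampledMidpoint > (0.5 + 0.214) / 2;
}
```

The gap (0.5 vs. roughly 0.214 in linear light) is enormous compared to
8-bit quantization error, so the probe is robust even with a low-precision
readback.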

    -- Steve

You are currently subscribed to public_webgl@khronos.org.