Re: [Public WebGL] Gamma correction and texImage2D/texSubImage2D
Chris Marrin wrote:
> On Sep 5, 2010, at 4:51 AM, stephen white wrote:
>> On 05/09/2010, at 8:26 PM, Thatcher Ulrich wrote:
>>> It seems to me there are two unresolved questions for WebGL 1.0
>> I'm not sure that my point was understood, so I'll try again...
>> When a browser composites an image onto the page, that image is the entire block of pixels. The browser can have simple rules because of that.
>> When WebGL draws to its canvas, it is using a number of images within the block of pixels, and those images may have different colour spaces and/or gammas.
>> The additional complexity is coming from multiple images used together, which is not a problem that simply displaying an image had to worry about.
>> As far as I can work this out, Steve Baker's suggestion of an option to reverse-map back to a linear colour space is the only mathematically valid option.
>> In practical terms, this would work out to a "WEBGL_MAP_TO_LINEAR" loading time option that doesn't do any work for linear PNGs but does fix up colour space JPGs.
>> If the option isn't set, then the JPGs are not touched and it's up to the programmer to do what they think is best.
> It's true that native OpenGL apps can make any decisions they want about the color space of incoming images, the represented color space of the drawing buffer, and the conversions done to the drawing buffer on display. But we have to be well defined on all aspects of color space management. That doesn't mean we have to be "correct" (if there is such a thing). We just have to be consistent.
We have to be both consistent AND correct if we want pretty
graphics...which (forgive me if I'm wrong!) is the ultimate goal here!
3D graphics is unforgiving of mathematical laxity. Most of the modern
advances in graphics technology are in the removal of mathematical
kludges made necessary by the primitive hardware that was around when
OpenGL and D3D were first specified. (Consider, for example, doing
per-pixel Phong lighting instead of per-vertex Gouraud shading - or
doing HDR.)
Sadly, we can't mandate correctness because some of WebGL's low-end
client hardware is still pretty primitive - but we can (and I would
argue "must") write a specification that doesn't mandate incorrectness.
We hope that this specification will still be in active use 10 or 20
years from now...the core principles have to be mathematically sound or
we'll be patching and kludging stuff well past my retirement date!
> Given the text in the http://www.opengl.org/registry/specs/EXT/texture_sRGB.txt, it seems as though (at least most recently) OpenGL is assuming today's images are coming in linear. I think that's a bad assumption since, as Ollie points out, WebKit assumes JPEG images without a color profile to be sRGB. But I'm ready to accept that we should break with the 2D Canvas' tradition of representing the canvas as sRGB and maintain the WebGL canvas as linear.
> I'd like to get the discussion more focused. We have to decide several things:
> 1) What color space is the WebGL canvas?
> As I said, I agree that the drawing buffer should be considered to be linear. This implies that the drawing buffer will have to be color corrected when composited with the rest of the HTML page. This also implies that, when using a WebGL canvas as the source image for a 2D Canvas drawImage() operation, it has to be color corrected into sRGB space, since that's what 2D Canvas expects.
I don't see where the canvas spec says that canvases are sRGB. I see
that it says that you have to convert your canvas to "device color
space" (which is likely to be sRGB) on output - but that's exactly what
we're proposing here. So as far as I can tell, doing what you (and I)
want here - isn't "breaking with the 2D canvas' tradition" - it's
"following the canvas spec to the letter".
Despite the sRGB extension - the GPU is still a linear-color-space
engine. I boldly predict that it'll never be otherwise. GLSL has
built-in operations like mix() (a lerp) that would need some god-awful
piece of mathematics to replace in order to do the equivalent thing in
sRGB space.
There is no arguing that point - there are not, nor have there ever
been, sRGB color-space GPUs. The best you could argue is "The errors in
kludging sRGB through a linear hardware engine are acceptable" - and
that's something that I'd fight tooth and nail because it's demonstrably
untrue.
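To make the lerp point concrete, here's a standalone JavaScript sketch (none of these function names come from GLSL or WebGL; the transfer functions are the standard sRGB ones from IEC 61966-2-1) comparing a 50% blend done naively on sRGB values against the same blend done correctly in linear space:

```javascript
// Standard sRGB <-> linear transfer functions (IEC 61966-2-1).
function srgbToLinear(c) {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}
function linearToSrgb(c) {
  return c <= 0.0031308 ? c * 12.92 : 1.055 * Math.pow(c, 1 / 2.4) - 0.055;
}
// Linear interpolation, like GLSL's mix().
function lerp(a, b, t) {
  return a + (b - a) * t;
}

// Blend 50% between black (0.0) and white (1.0):
const naive = lerp(0.0, 1.0, 0.5);          // interpolating raw sRGB values
const correct = linearToSrgb(
  lerp(srgbToLinear(0.0), srgbToLinear(1.0), 0.5)
);
// naive is 0.5; correct is ~0.735 - two visibly different mid-grays.
```

The hardware's blending and interpolation units all compute the `naive` version; only if the values they operate on are linear does that arithmetic come out right.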
> 2) What is the default incoming color space of images as textures, and what are the options to change that?
> I believe the choices are: unchanged (I will call that raw), linear and sRGB. I think we need to support raw and linear. If and when we support EXT_texture_sRGB, we will get the ability to support incoming sRGB images.
> This decision really concerns me. WebKit considers images without a color profile to be sRGB, but OpenGL (by statements in the EXT_texture_sRGB spec) assumes images without a profile to be linear. Which assumption do we make? Do we color correct all incoming images into linear space (assuming they have all been color corrected into sRGB already)? Or do we bring in images without a color profile unchanged, and only correct images that have a defined color profile. If we do the former, we will have different visual results than the corresponding native OpenGL program. If we do the latter, we will get results that are inconsistent with the rest of the HTML page.
Why don't we just let the application writer choose? When you load an
image into a texture, you say "RAW" (don't mess with my texels!) or
"AUTOMATIC" (please convert my texels to linear color space, if
necessary, according to whatever the file header says)...and in some
future release, "sRGB" (please convert my texels to sRGB color space,
if necessary, according to whatever the file header says, because I'm
going to use the sRGB extension).
I believe that all of the image file formats that we support have some
kind of gamma value.
We have all of the bases covered that way.
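A minimal sketch of what that loading-time choice might look like - the "RAW"/"AUTOMATIC" names are just the proposal from this thread, not any WebGL API, and the transfer function is the standard sRGB one:

```javascript
// sRGB -> linear transfer function (IEC 61966-2-1).
function srgbToLinear(c) {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// pixels: array of 8-bit channel values as decoded from the image file.
// mode: "RAW" or "AUTOMATIC" (hypothetical names from this thread).
function decodeTexels(pixels, mode) {
  if (mode === "RAW") {
    return pixels.slice();            // don't mess with my texels
  }
  // AUTOMATIC: map each 8-bit sRGB channel into linear space before upload.
  return pixels.map(p => Math.round(srgbToLinear(p / 255) * 255));
}
```

Note the precision hazard that makes RAW worth having: squeezing linear values back into 8 bits collapses many distinct dark sRGB values onto the same linear value, which is exactly the kind of roundoff ugliness an application may prefer to manage itself.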
> 3) What is the color space of pixels coming out of the drawing buffer via toDataURL() and readPixels()?
> For consistency, I think toDataURL() should be color corrected into sRGB. But it seems more reasonable to leave the results of readPixels() in the same linear color space of the drawing buffer. This would have to be clearly spelled out.
I think we should follow the same rules as for file loading: "RAW" -
don't convert please. "AUTOMATIC" - please convert as necessary...let
the application decide.
My reasoning is that the specification should allow applications to be
mathematically correct - which implies a color space conversion.
However, if you were to do something like this on a cellphone with 5/6/5
textures and frame buffer - then the results would be horrible...also,
there are situations where this operation might be in a
performance-critical pathway. So the pragmatic option is to say "don't
mess with my pixels".
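For the output direction, the same idea runs in reverse - again a hypothetical sketch (these names are not in the spec), gamma-encoding the drawing buffer's linear values back to sRGB for toDataURL()-style consumers while RAW leaves them untouched:

```javascript
// Linear -> sRGB transfer function (IEC 61966-2-1).
function linearToSrgb(c) {
  return c <= 0.0031308 ? c * 12.92 : 1.055 * Math.pow(c, 1 / 2.4) - 0.055;
}

// linearPixels: 8-bit channel values read back from a linear drawing buffer.
// mode: "RAW" or "AUTOMATIC" (hypothetical names from this thread).
function readBackPixels(linearPixels, mode) {
  if (mode === "RAW") {
    return linearPixels.slice();      // don't mess with my pixels
  }
  // AUTOMATIC: gamma-encode each channel for sRGB consumers.
  return linearPixels.map(p => Math.round(linearToSrgb(p / 255) * 255));
}
```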
* Since I strongly disagree with forcing mathematical incorrectness into
the specification, we need to be able to convert between color spaces.
* Since applications need to be able to kludge around ugly roundoff
problems, to fix performance issues, and to do "non-traditional"
rendering (like doing physics and collision detection on the GPU!), I
need to be able to turn that conversion off in my application.
A good specification 'allows' elegant, automatic correctness - but
doesn't mandate it where an individual application has performance or
quality needs that might suffer as a result.