Re: [Public WebGL] WebGL support for high bit depth rendering & display
I'd like to follow up on a few interesting points made:
I think it was implied that supporting more than 8 bits per channel of output in WebGL would encompass all of the following:
1) Accept 30-bit color into a 30-bit color buffer
2) Deliver 24-bit color to GDI/the compositor
3) Analyze the GDI output for the presence of your downsampled color buffer
4) Stitch the 24-bit GDI output back together, cutting out the regions for which you have better bits to sample from your 30-bit buffer
5) Deliver it over a special cable to the monitor
Uh-hum, that's gonna be fun trying to support in WebGL :)
Steps 2 through 5 are handled by the display driver and graphics HW, are already solved by AMD/nVidia (not to mention industry standards bodies such as VESA), and are out of scope for WebGL. I've seen several comments in this thread emphasizing the complicated setup that has to be in place to actually visualize 30 bits on the screen, but the truth is many people are already doing this today. And it's not hard. I have personally tried out 30-bit solutions from AMD, nVidia, and Matrox, each providing 30-bit color out to the display inside a window on the desktop. Chrome and Firefox are just more applications on the desktop that can choose to take advantage of the 30-bit capability provided by the graphics drivers, or leave it alone. My hope is that it doesn't get left alone for too much longer.
Firefox at least uses 8-bpc RGB/RGBA throughout
I can easily imagine that providing 10 bpc or higher throughout all of Firefox's display pipeline would be a lot of work. But perhaps we don't need to go nearly that far to cover 90% of the use cases. I don't care to have my 2D lines drawn in 30-bit color, I don't care if my text gets anti-aliased with 30-bit precision, and I don't care if some obscure method of alpha-blending or overlaying other non-WebGL content could result in losing the additional bits -- I just want a way to ask the browser to please create a 30-bit backbuffer, after a simple check that the underlying driver/HW supports this. With that, I can draw into that backbuffer however I see fit (with 30-bit source content if I have it), and I can take the responsibility of ensuring my driver, display, cable, etc. are configured correctly to see 30 bits. And in the event they are not, the result is just truncated to 24 bits.
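To make "just truncated to 24 bits" concrete: per channel, dropping from 10 bits to 8 bits is nothing more than discarding the two least-significant bits. A minimal sketch (the function name is illustrative, not any real API):

```javascript
// Truncate one 10-bit channel value (0..1023) to 8 bits (0..255) by
// dropping the two least-significant bits -- this is all that happens
// when a 30-bit backbuffer lands in a 24-bit scan-out path.
function truncate10to8(v10) {
  return v10 >> 2;
}

// Four adjacent 10-bit levels collapse into a single 8-bit level,
// which is exactly where banding in smooth gradients comes from.
const collapsed = [4, 5, 6, 7].map(truncate10to8);
```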
It's also extremely hard to test for if you don't have a rig like this.
Do Khronos, Chrome, Firefox, or any other browser developers care to prove that 24-bit color is accurately transmitted to the display and that all 8 bits per channel are perceptually unique to the eye? Of course not. That 24-bit image on a laptop screen is often down-converted to 6 bits per channel, and the browser doesn't care. Neither should the WebGL app or browser care how the 30 bits are treated. Note that there do exist means of reading back the desktop content (albeit slowly) if you want to validate that the browser is putting the appropriate pixel bits into the desktop -- so this kind of testing/validation could be automated.
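An automated check along those lines could be very simple: render a fine gradient, read the pixels back through whatever desktop-readback path is available, and count how many distinct channel levels survived. More than 256 distinct levels means more than 8 bits of precision made it through. A sketch with simulated readback data (the names here are illustrative, not a real API):

```javascript
// Count the distinct channel levels present in a readback buffer
// (one number per pixel channel). >256 distinct levels implies the
// pipeline preserved more than 8 bits of precision.
function countDistinctLevels(channelValues) {
  return new Set(channelValues).size;
}

// Simulated readbacks of a 1024-step gradient: an 8-bit pipeline
// quantizes it to at most 256 levels; a 10-bit pipeline keeps all 1024.
const gradient10 = Array.from({ length: 1024 }, (_, i) => i);
const gradient8 = gradient10.map(v => v >> 2);
```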
Also, for anyone who doubts the benefit of more than 8 bits per channel, consider this link:
See slide 48 in this presentation on just-noticeable differences vs. luminance from WinHEC 2008. You can see that 8 bits was sufficient at the low luminance levels of early displays, but as luminance increases, people can discern steps down to 9 and even 10 bits of precision. I have looked at the difference between 8-, 9-, and 10-bit gradients myself many times, and can vouch for how visible the difference is on a good display.
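The reason the difference is visible comes down to the step size between adjacent representable levels: at 10 bits the steps are roughly a quarter the size of the 8-bit steps, so banding on a bright display drops below the just-noticeable-difference threshold. A quick sketch of the arithmetic (illustrative helper names):

```javascript
// Quantize a normalized intensity x in [0,1] to n bits of precision.
function quantize(x, bits) {
  const levels = (1 << bits) - 1;
  return Math.round(x * levels) / levels;
}

// Size of one step between adjacent representable levels at n bits;
// smaller steps mean less visible banding in smooth gradients.
function stepSize(bits) {
  return 1 / ((1 << bits) - 1);
}
```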
In summary, I propose defining a (browser-specific if necessary) WebGL extension, say WEBGL_deep_color_context, which, if available, would indicate to the app that the browser and graphics driver support certain deep-color (more than 8 bits per channel) backbuffer pixel formats, and which, if used, would cause the browser to create its WebGL context and backbuffer in a particular deep-color format specified when the context is requested.
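From the app's point of view, the flow might look like the sketch below. To be clear, WEBGL_deep_color_context is the hypothetical extension proposed above, and the supportedFormats field and format names are my own assumptions about what such an extension could expose -- nothing here is a shipping API. A mock context object stands in for a real WebGLRenderingContext so the flow can be demonstrated outside a browser:

```javascript
// Sketch of feature-detecting the proposed (hypothetical)
// WEBGL_deep_color_context extension and choosing a backbuffer format.
function tryDeepColor(gl) {
  const ext = gl.getExtension('WEBGL_deep_color_context');
  if (!ext) {
    // Extension absent: fall back to the ordinary 8-bpc backbuffer.
    return 'RGBA8';
  }
  // Ask for a 10-10-10-2 backbuffer if the driver reports support.
  // (supportedFormats is an assumed field of the hypothetical extension.)
  if (ext.supportedFormats.indexOf('RGB10_A2') !== -1) {
    return 'RGB10_A2';
  }
  return 'RGBA8';
}

// Mock standing in for a real context on a deep-color-capable setup.
const mockGl = {
  getExtension: name =>
    name === 'WEBGL_deep_color_context'
      ? { supportedFormats: ['RGB10_A2'] }
      : null,
};
```

In practice the chosen format would be passed as a context-creation attribute rather than selected after the fact, but the detect-then-fall-back shape would be the same: apps that don't ask get today's 24-bit behavior, unchanged.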
Thanks for your consideration and feedback!