
Re: [Public WebGL] WebGL support for high bit depth rendering & display



On Wed, Sep 19, 2012 at 7:22 PM, Won Chun <wonchun@google.com> wrote:
I stated this caveat pretty explicitly in the quoted text. I have not gotten all the fancy bits in place, but supposedly, yes it should work:

http://www.nvidia.com/docs/IO/40049/TB-04701-001_v02_new.pdf

I'm not sure if there is specific black magic going on here. For example, sometimes hardware video decoders bypass compositors. It could be there is a similar punch-through for 30-bit color.
That's a pretty powerful "punch-through" right there.

1) Accept 30-bit color into a 30-bit color buffer
2) Deliver 24-bit color to GDI/the compositor
3) Analyze the GDI output for the presence of your downsampled color buffer
4) Paste the 24-bit GDI output back together, cutting out the regions where you have better bits to sample from your 30-bit buffer
5) Deliver it over a special cable to the monitor

Uh-hum, that's gonna be fun trying to support in WebGL :)