Re: [Public WebGL] Rendering to HDR displays (10bits per color component)
I'm not 100% sure how this is implemented or on which platforms. The folks on email@example.com
will know. My understanding is that it works on Windows (using D3D under the hood) and Android, and possibly macOS. On each of these platforms, I believe a platform-specific GPU memory buffer is allocated at the higher bit depth, bound to an OpenGL texture through platform-specific APIs, and ultimately handed to the window system's compositor, again through platform-specific APIs.
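For the Android case, here is a minimal sketch of what that buffer-sharing path might look like. This is my own illustration rather than what any browser actually does, and it assumes the EGL_ANDROID_get_native_client_buffer, EGL_KHR_image_base and OES_EGL_image extensions are available, with their entry points resolved at runtime:

// Hypothetical sketch: allocate a float16 AHardwareBuffer and expose it to GL.
// Not taken from any browser source; most error handling omitted.
#include <cstdint>
#include <android/hardware_buffer.h>
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>

GLuint CreateFP16BackedTexture(EGLDisplay display, uint32_t width, uint32_t height) {
    // 1. Allocate a platform GPU buffer with 16 float bits per channel.
    AHardwareBuffer_Desc desc = {};
    desc.width  = width;
    desc.height = height;
    desc.layers = 1;
    desc.format = AHARDWAREBUFFER_FORMAT_R16G16B16A16_FLOAT;
    desc.usage  = AHARDWAREBUFFER_USAGE_GPU_SAMPLED_IMAGE |
                  AHARDWAREBUFFER_USAGE_GPU_COLOR_OUTPUT;
    AHardwareBuffer* buffer = nullptr;
    if (AHardwareBuffer_allocate(&desc, &buffer) != 0) return 0;

    // 2. Wrap it in an EGLImage (extension entry points resolved at runtime).
    auto pEglGetNativeClientBufferANDROID =
        reinterpret_cast<PFNEGLGETNATIVECLIENTBUFFERANDROIDPROC>(
            eglGetProcAddress("eglGetNativeClientBufferANDROID"));
    auto pEglCreateImageKHR = reinterpret_cast<PFNEGLCREATEIMAGEKHRPROC>(
        eglGetProcAddress("eglCreateImageKHR"));
    auto pGlEGLImageTargetTexture2DOES =
        reinterpret_cast<PFNGLEGLIMAGETARGETTEXTURE2DOESPROC>(
            eglGetProcAddress("glEGLImageTargetTexture2DOES"));

    EGLClientBuffer clientBuffer = pEglGetNativeClientBufferANDROID(buffer);
    EGLImageKHR image = pEglCreateImageKHR(display, EGL_NO_CONTEXT,
                                           EGL_NATIVE_BUFFER_ANDROID,
                                           clientBuffer, nullptr);

    // 3. Bind the EGLImage to a GL texture; rendering into it writes the
    //    shared buffer, which the compositor can later consume directly.
    GLuint texture = 0;
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    pGlEGLImageTargetTexture2DOES(GL_TEXTURE_2D, image);
    return texture;
}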
Afaik you cannot create a float16 frontbuffer with OpenGL on Windows, because the SetPixelFormat / PIXELFORMATDESCRIPTOR path only takes bit counts for the planes, not a component type, so from that the driver cannot infer what kind of buffer you were meant to have (other than an integer one).
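To illustrate, here is roughly what the classic pixel format setup looks like (just for illustration, not from any spec text):

// The descriptor carries only bit counts plus PFD_TYPE_RGBA /
// PFD_TYPE_COLORINDEX as the pixel "type", so there is no way to ask
// for half-float components here.
#include <windows.h>

bool SetupPixelFormat(HDC hdc) {
    PIXELFORMATDESCRIPTOR pfd = {};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;  // integer RGBA is the only real choice
    pfd.cColorBits = 48;             // R+G+B; alpha is counted separately
    pfd.cRedBits   = 16;             // even asking for 16 bits per channel...
    pfd.cGreenBits = 16;
    pfd.cBlueBits  = 16;
    pfd.cAlphaBits = 16;
    pfd.cDepthBits = 24;
    pfd.iLayerType = PFD_MAIN_PLANE;

    // ...still only describes an integer buffer; nothing above says
    // "I wanted float16".
    int format = ChoosePixelFormat(hdc, &pfd);
    if (format == 0) return false;
    return SetPixelFormat(hdc, format, &pfd) != FALSE;
}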
It looks like you could in theory create a float16 frontbuffer with Direct3D's DXGI_SWAP_CHAIN_DESC, whose format field accepts DXGI_FORMAT_R16G16B16A16_FLOAT. I have no idea what the hardware support for that is, or whether it even works as intended.
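For reference, a bare-bones sketch of that Direct3D path (untested, and with no claim about what hardware or compositors actually honor it):

// Hypothetical sketch: request a half-float swap chain through DXGI.
#include <d3d11.h>
#include <dxgi.h>

bool CreateFP16SwapChain(HWND hwnd, UINT width, UINT height,
                         IDXGISwapChain** swapChain, ID3D11Device** device,
                         ID3D11DeviceContext** context) {
    DXGI_SWAP_CHAIN_DESC desc = {};
    desc.BufferDesc.Width  = width;
    desc.BufferDesc.Height = height;
    desc.BufferDesc.Format = DXGI_FORMAT_R16G16B16A16_FLOAT;  // float16 backbuffer
    desc.SampleDesc.Count  = 1;
    desc.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount       = 2;
    desc.OutputWindow      = hwnd;
    desc.Windowed          = TRUE;
    desc.SwapEffect        = DXGI_SWAP_EFFECT_FLIP_DISCARD;

    HRESULT hr = D3D11CreateDeviceAndSwapChain(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        nullptr, 0, D3D11_SDK_VERSION,
        &desc, swapChain, device, nullptr, context);
    return SUCCEEDED(hr);
}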