Re: [Public WebGL] Rendering to HDR displays (10bits per color component)
ANGLE is capable of creating swap chains with D3D11 using DXGI_FORMAT_R16G16B16A16_FLOAT and DXGI_FORMAT_R10G10B10A2_UNORM for HDR displays; we had to support this for HDR video output in Chrome. There is also ARB_color_buffer_float for desktop GL, which adds floating-point backbuffer formats to WGL and GLX, but I'm not sure what guarantees there are about how those are composited.
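For what it's worth, on the WGL side the floating-point formats are requested through wglChoosePixelFormatARB rather than the base pixel-format path. A minimal sketch, with attribute tokens from WGL_ARB_pixel_format and WGL_ARB_pixel_format_float, assuming `hdc` exists and the extension function pointer has already been loaded from a dummy context:

```cpp
// Request a 16-bit-per-channel floating-point pixel format.
// Error handling and extension loading omitted for brevity.
const int attribs[] = {
    WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
    WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
    WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
    WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_FLOAT_ARB,  // float components
    WGL_RED_BITS_ARB,   16,
    WGL_GREEN_BITS_ARB, 16,
    WGL_BLUE_BITS_ARB,  16,
    WGL_ALPHA_BITS_ARB, 16,
    0,  // attribute list terminator
};
int format = 0;
UINT numFormats = 0;
wglChoosePixelFormatARB(hdc, attribs, nullptr, 1, &format, &numFormats);
// If numFormats > 0, `format` can then be passed to SetPixelFormat;
// whether the compositor treats it as HDR is a separate question.
```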
I'm not 100% sure how this is implemented, nor on what platforms. The folks on firstname.lastname@example.org will know. My understanding is that it works on Windows (using D3D under the hood) and Android, and possibly macOS. On each of these platforms, I believe a platform-specific GPU memory buffer is allocated with a higher bit depth, bound to an OpenGL texture using platform-specific APIs, and ultimately presented to the window system's compositor, again via platform-specific APIs.
AFAIK you cannot create a float16 frontbuffer through WGL's SetPixelFormat alone, because its PIXELFORMATDESCRIPTOR describes only the bit planes, not a component type, so from it WGL cannot infer what kind of buffer you were meant to have (other than an integer one).
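To illustrate the limitation: PIXELFORMATDESCRIPTOR, the struct consumed by ChoosePixelFormat/SetPixelFormat, carries bit counts and an iPixelType that can only be PFD_TYPE_RGBA or PFD_TYPE_COLORINDEX, so there is no way to say "these 16 bits per channel are half-floats". A sketch, assuming an existing `hdc`:

```cpp
PIXELFORMATDESCRIPTOR pfd = {};
pfd.nSize      = sizeof(pfd);
pfd.nVersion   = 1;
pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType = PFD_TYPE_RGBA;  // integer RGBA is the only color option
pfd.cColorBits = 64;             // interpreted as integer bit planes, not fp16
int format = ChoosePixelFormat(hdc, &pfd);
// No field exists here to request floating-point components; that is why
// the WGL_ARB_pixel_format_float attribute path is needed instead.
```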
It looks like you could in theory create a float16 frontbuffer with Direct3D's DXGI_SWAP_CHAIN_DESC, whose Format field accepts DXGI_FORMAT_R16G16B16A16_FLOAT. I have no idea what hardware support for that looks like, or whether it even works as intended.
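A sketch of what that might look like using the DXGI 1.2 variant, DXGI_SWAP_CHAIN_DESC1 with CreateSwapChainForHwnd; `device`, `hwnd`, and the IDXGIFactory2 `factory` are assumed to exist, and error handling is omitted:

```cpp
DXGI_SWAP_CHAIN_DESC1 desc = {};
desc.Format      = DXGI_FORMAT_R16G16B16A16_FLOAT;  // fp16 back buffer
desc.SampleDesc  = {1, 0};
desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
desc.BufferCount = 2;
desc.SwapEffect  = DXGI_SWAP_EFFECT_FLIP_DISCARD;   // flip model

Microsoft::WRL::ComPtr<IDXGISwapChain1> swapChain;
factory->CreateSwapChainForHwnd(device, hwnd, &desc,
                                nullptr, nullptr, &swapChain);
// On Windows 10 the swap chain can additionally be tagged with a color
// space via IDXGISwapChain3::SetColorSpace1, e.g.
// DXGI_COLOR_SPACE_RGB_FULL_G10_NONE_P709 for linear scRGB, so the
// compositor knows how to interpret the float values.
```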