
Re: [Public WebGL] Rendering to HDR displays (10bits per color component)



Something to add is that HDMI 2.0a and later supports HDR formats, so it should be possible for GPU manufacturers to add support for 10-bit front/back buffers just by updating their drivers. Current professional HDR solutions rely on rendering to an FBO and downloading every frame back to RAM so it can be sent through a special video card to the HDR display.
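For reference, a minimal sketch of that render-to-FBO-and-read-back pattern in WebGL 2 follows. The drawScene callback, canvas lookup, and resolution are hypothetical placeholders, and EXT_color_buffer_float is assumed to be available so the attachment can be floating point:

// Sketch of the workflow described above: render HDR content into a
// half-float FBO, then read the frame back to CPU memory so external
// video hardware can present it.
const canvas = document.querySelector('canvas') as HTMLCanvasElement;
const gl = canvas.getContext('webgl2') as WebGL2RenderingContext;
gl.getExtension('EXT_color_buffer_float'); // needed for RGBA16F rendering

const width = 1920, height = 1080;       // placeholder resolution
const tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
gl.texStorage2D(gl.TEXTURE_2D, 1, gl.RGBA16F, width, height);

const fbo = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                        gl.TEXTURE_2D, tex, 0);

function grabHdrFrame(drawScene: () => void): Float32Array {
  gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
  gl.viewport(0, 0, width, height);
  drawScene(); // hypothetical callback that renders the HDR scene

  // Read the frame back to RAM; this per-frame readback is the
  // expensive step the special video card then consumes.
  const pixels = new Float32Array(width * height * 4);
  gl.readPixels(0, 0, width, height, gl.RGBA, gl.FLOAT, pixels);
  return pixels;
}

The readback stalls the pipeline every frame, which is exactly why a native 10-bit (or float) default framebuffer exposed by the driver would be preferable.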

Anyway, I guess this request will have to escalate to the OpenGL group.

Thanks for your time Florian.



On Thu, Jul 12, 2018 at 1:16 PM, Florian Bösch <pyalot@gmail.com> wrote:
On Thu, Jul 12, 2018 at 12:33 PM, Javi Agenjo <javi.agenjo@gmail.com> wrote:
but if Chrome supports HDR video rendering (as far as they say), there has to be some sort of pipeline outputting 10 bits, unless it all happens beyond the pipeline through some sort of decoding chip inside the GPU.

My guess is it's a feature of the hardware-accelerated video decoder.
 
I'm asking because I'm working on a European project related to HDR (HDR4EU), and there are companies pushing HDR displays for consumers, so there are reasons to expect changes in the near future, with better quality and wider gamuts. It would be interesting to see some suggestions about how browsers can adapt to that change in the coming years.

I would absolutely love HDR capability through the pipeline. The 8-bit-per-channel convention is ridiculous nowadays because the actual display hardware (especially OLED displays) is capable of many more gradations (even if the decoder chips in the monitors aren't). Linear color space floating-point rendering is becoming the norm, only for the result to be squashed into a gamma-encoded 8-bit channel. It's nuts.
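To make the complaint concrete, here is roughly what that final squash looks like as the last pass of a typical WebGL 2 pipeline. The shader and uniform names (uHdrScene, uExposure) are illustrative, and the pow(1.0/2.2) is a stand-in for the exact sRGB transfer function:

// Final pass: the linear float scene render is clipped, gamma-encoded,
// and written to the 8-bit default framebuffer for presentation.
const squashFragmentShader = `#version 300 es
precision highp float;
uniform sampler2D uHdrScene;  // linear, floating-point scene render
uniform float uExposure;      // hypothetical exposure control
in vec2 vUv;
out vec4 fragColor;

void main() {
  vec3 linear = texture(uHdrScene, vUv).rgb * uExposure;
  // Everything above 1.0 is clipped, and the gamma encode quantizes
  // what remains into 256 steps per channel on presentation.
  vec3 srgb = pow(clamp(linear, 0.0, 1.0), vec3(1.0 / 2.2));
  fragColor = vec4(srgb, 1.0);
}`;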