
Re: [Public WebGL] Rendering to HDR displays (10bits per color component)



On Thu, Jul 12, 2018 at 12:33 PM, Javi Agenjo <javi.agenjo@gmail.com> wrote:
but if Chrome supports HDR video rendering (as far as they say), there has to be some sort of pipeline going on outputting to 10 bits, unless it is all happening beyond the pipeline through some sort of decoding chip inside the GPU.

My guess is it's a feature of the hardware-accelerated video decoder.
 
I'm asking because I'm working in a European project related to HDR (HDR4EU), and there are companies pushing HDR displays for consumers, so there are reasons to expect changes in the near future, with better quality and gamuts. So it would be interesting to see some suggestions about how browsers can adapt to that change in the coming years.

I would absolutely love HDR capability through the pipeline. The 8-bit-per-channel convention is ridiculous nowadays because the actual display hardware (especially in OLED displays) is capable of many more gradations (even if the decoder chips in the monitors aren't). Linear color space floating point rendering is becoming the norm, only for the result to be squashed together into a gamma-encoded 8-bit channel. It's nuts.
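
To make that last point concrete, here is a rough sketch of the typical setup today, assuming a WebGL2 context (the scene-drawing and fullscreen-pass plumbing are left as placeholders, not any real API): the scene is rendered in linear light into a half-float framebuffer, and a final pass gamma-encodes it down into the default 8-bit backbuffer, which is where the extra precision is thrown away.

const canvas = document.querySelector('canvas') as HTMLCanvasElement;
const gl = canvas.getContext('webgl2') as WebGL2RenderingContext;
gl.getExtension('EXT_color_buffer_float');   // needed to render to RGBA16F

// Linear-light render target: the precision we would like to keep.
const colorTex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, colorTex);
gl.texStorage2D(gl.TEXTURE_2D, 1, gl.RGBA16F, canvas.width, canvas.height);

const fbo = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                        gl.TEXTURE_2D, colorTex, 0);

// ... render the scene in linear space into `fbo` here ...

// Final pass: sample the linear texture and squash it into the 8-bit,
// gamma-encoded default framebuffer. This is where precision is lost.
const encodeFrag = `#version 300 es
precision highp float;
uniform sampler2D uLinear;
in vec2 vUv;
out vec4 outColor;
void main() {
  vec3 linear = texture(uLinear, vUv).rgb;
  // Simple gamma 2.2 encode; the backbuffer quantizes this to 8 bits.
  outColor = vec4(pow(linear, vec3(1.0 / 2.2)), 1.0);
}`;

gl.bindFramebuffer(gl.FRAMEBUFFER, null);   // default (8-bit) backbuffer
// ... draw a fullscreen triangle with `encodeFrag` here ...

A 10-bit or float-capable backbuffer would let that last pass output into something wider than 8 bits instead of quantizing everything down.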