
Re: [Public WebGL] WebGL support for high bit depth rendering & display



Why is it difficult for the browser to composite its 24-bit DOM elements, along with the 30-bit WebGL context, into a 30-bit backbuffer?

On Tue, Sep 18, 2012 at 8:14 PM, Jeff Gilbert <jgilbert@mozilla.com> wrote:

The main issue is that WebGL does not draw directly to the screen. We draw to an offscreen buffer, which is then passed to the compositing engine to process and (eventually, probably) get to the screen. Remember that DOM elements can sit on top of WebGL contexts, and even (in theory) apply css (even css3d) transforms to WebGL canvases.

I understand that this would be useful, and indeed should be relatively easy to add to the spec, but it would appear to be extremely nontrivial to implement.

-Jeff

----- Original Message -----
From: "Matt McLin" <mr.mclin@gmail.com>
To: "Florian Bösch" <pyalot@gmail.com>, "public webgl" <public_webgl@khronos.org>
Sent: Tuesday, September 18, 2012 3:47:40 PM
Subject: Re: [Public WebGL] WebGL support for high bit depth rendering & display



Hi guys,


I agree with Florian that 8-bit sRGB is not a substitute for linear 10-bit, for a variety of reasons. We could spend a very long time discussing & debating the need for > 8-bits per channel, but let's just assume for a moment it is necessary, and the question is how to achieve this via a browser?


Regarding the challenge of getting a 10-10-10 backbuffer to the screen, the challenge is really only in copying it into the desktop framebuffer -- which I don't think is much of a challenge at all. If the browser were to allocate the backbuffer as 30-bit (a trivial change), and if it were to use a HW-accelerated path to directly present that backbuffer (which is already the case), then the graphics drivers and hardware should already handle the appropriate format conversions. As I mentioned, AMD, nVidia & others already provide solutions today for sending 30-bit content to the screen. In my opinion, the only missing link is in the WebGL spec, and the browser.
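For what it's worth, on an EGL-based stack the "allocate the backbuffer as 30-bit" step is just a different config request. A sketch of the attribute list (assuming EGL 1.4; the attribute names are standard EGL, but whether the driver actually exposes a matching 10-bit config is platform-dependent):

```c
/* Hypothetical sketch: the EGLConfig attributes a browser could pass to
 * eglChooseConfig() to request a 30-bit (10-10-10-2) window surface
 * instead of the usual 24-bit one. Config fragment only, not a full
 * context-creation example. */
#include <EGL/egl.h>

static const EGLint rgb10_config_attribs[] = {
    EGL_RED_SIZE,        10,
    EGL_GREEN_SIZE,      10,
    EGL_BLUE_SIZE,       10,
    EGL_ALPHA_SIZE,       2,
    EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
    EGL_SURFACE_TYPE,    EGL_WINDOW_BIT,
    EGL_NONE
};

/* If eglChooseConfig() returns a match, the swap/scan-out path can then
 * do any remaining conversion in hardware, as Matt describes. */
```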


Florian, I'm not sure I understood your question about existing extensions. As far as I know, I don't need any particular extension to use 10bpc & higher texture formats in OpenGL 3.0. For OpenGL ES 2.0, I believe the following extensions provide the formats I'm looking for: GL_UNSIGNED_INT_2_10_10_10_REV_EXT and GL_EXT_texture_storage. Perhaps inventing a WebGL version of these extensions would be a step in the right direction?


Regards, Matt







On Tue, Sep 18, 2012 at 1:20 AM, Florian Bösch <pyalot@gmail.com> wrote:



On Tue, Sep 18, 2012 at 7:34 AM, Mark Callow <callow_mark@hicorp.co.jp> wrote:







Converting a RGB 10_10_10 image to 8-bit sRGB before displaying on the screen is a reasonable solution. The wider formats are used to permit computations in physically linear space without introducing as many artifacts as 8-bit. All the information discernable by the human visual system that is contained in 10-bit physically-linear images can be displayed by an 8-bit sRGB perceptually-linear encoding.
Medical imaging unfortunately imposes somewhat more stringent requirements than "perceptibility" or "looks pretty enough", and not only out of a heightened sense of correctness. If a doctor is using a pipeline that tampers with the produced colors (and I'm sure there is some arcane medical standard for this), they become open to liability through malpractice.






--
Jeff Russell
Engineer, Marmoset
www.marmoset.co