
Re: [Public WebGL] WebGL support for high bit depth rendering & display

Yep, what Jeff said. There isn't really any technical reason -- Windows supports this just fine, provided you have the drivers + monitor + cable to actually deliver 10-bit output. But you have to create everything in the path with the right OpenGL pixel format(s), and Firefox at least uses 8-bpc RGB/RGBA throughout. You also wouldn't want to use a 10-bpc (or a 64-bit 10/10/10/10) path in the general case, so you'd need to be able to swap the entire pipeline when 10-bpc WebGL content shows up.

It's doable, but not trivial. It would certainly be very nice to support these use cases -- especially since I can see how, in the context of the original request (medical imaging), it would be useful to have all of this integrated directly into a web app -- but it's a good amount of work.

    - Vlad

On 9/18/2012 10:12 PM, Jeff Gilbert wrote:

We don't have 30-bit RGB color support throughout our rendering path, so while it may be possible to hack something in, there would need to be a very limited, specific set of situations where we could guarantee 30-bit color gets to the screen; we likely could not guarantee it in the general case. I do not know how much work, if any, it would be to retrofit our layers code to deal properly with 30-bit formats. And all of this would only work on systems with hardware-accelerated compositing enabled, which is not nearly universal. We would also clearly need to switch to a drawing buffer that supports 30-bit color on the fly, which can be much easier said than done.
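[Editor's sketch] The "retrofit our layers code" point above largely boils down to widening every 8-bit channel to 10 bits at some stage. A minimal sketch of that widening in plain JavaScript (function names are mine, not from any browser's code):

```javascript
// Widen one 8-bit channel value to 10 bits by bit replication: the
// top two bits are repeated into the new low bits, so 0 maps to 0
// and 255 maps to 1023 with no scaling bias.
function promote8to10(v) {
  return (v << 2) | (v >> 6);
}

// Widen a whole RGBA8 pixel buffer. All four channels are widened
// uniformly here; a 10/10/10/2 target would additionally need an
// 8-to-2-bit alpha policy.
function promoteBuffer(rgba8) {
  const out = new Uint16Array(rgba8.length);
  for (let i = 0; i < rgba8.length; i++) {
    out[i] = promote8to10(rgba8[i]);
  }
  return out;
}
```

Bit replication is the usual choice here because it maps the endpoints exactly, unlike a plain left shift (which would top out at 1020).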

This is an interesting idea, but it is not nearly as simple as it
looks on paper. I'm not saying it can't be done, but rather that it
is not something easily (or quickly) done. However, patches and
proposals are always welcome.


----- Original Message -----
From: "Jeff Russell" <jeffdr@gmail.com>
To: "Jeff Gilbert" <jgilbert@mozilla.com>
Cc: "Matt McLin" <mr.mclin@gmail.com>, "Florian Bösch" <pyalot@gmail.com>, "public webgl" <public_webgl@khronos.org>
Sent: Tuesday, September 18, 2012 5:45:48 PM
Subject: Re: [Public WebGL] WebGL support for high bit depth rendering & display

Why is it difficult for the browser to composite its 24-bit DOM
elements, along with the 30-bit WebGL context, into a 30-bit
framebuffer?

On Tue, Sep 18, 2012 at 8:14 PM, Jeff Gilbert <jgilbert@mozilla.com> wrote:

The main issue is that WebGL does not draw directly to the screen. We
draw to an offscreen buffer, which is then passed to the compositing
engine to process and (eventually, probably) get to the screen.
Remember that DOM elements can sit on top of WebGL contexts, and even
(in theory) apply css (even css3d) transforms to WebGL canvases.
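[Editor's sketch] Concretely, compositing an 8-bit DOM pixel over a 30-bit WebGL pixel means widening the DOM value to the deeper depth before blending. A toy source-over blend at 10-bit working precision (illustrative only, not any browser's actual code):

```javascript
// Source-over blend of one 8-bit DOM channel value (with 8-bit
// alpha) onto a 10-bit WebGL channel value, working at 10 bits.
function blendOver(src8, srcA8, dst10) {
  const src10 = (src8 << 2) | (src8 >> 6); // widen 0..255 to 0..1023
  const a = srcA8 / 255;                   // normalized source alpha
  return Math.round(src10 * a + dst10 * (1 - a));
}
```

Every layer in the compositor would need this treatment, which is why a mixed 8-bit/30-bit pipeline is more than a one-line format change.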

I understand that this would be useful, and indeed should be
relatively easy to add to the spec, but it would appear to be
extremely nontrivial to implement.


----- Original Message -----
From: "Matt McLin" <mr.mclin@gmail.com>
To: "Florian Bösch" <pyalot@gmail.com>, "public webgl" <public_webgl@khronos.org>
Sent: Tuesday, September 18, 2012 3:47:40 PM
Subject: Re: [Public WebGL] WebGL support for high bit depth rendering & display

Hi guys,

I agree with Florian that 8-bit sRGB is not a substitute for linear 10-bit, for a variety of reasons. We could spend a very long time discussing & debating the need for > 8-bits per channel, but let's just assume for a moment it is necessary, and the question is how to achieve this via a browser?

Regarding the challenge of getting a 10-10-10 backbuffer to the screen: the only real difficulty is copying it into the desktop framebuffer -- and I don't think that is much of a challenge at all. If the browser were to allocate the backbuffer as 30-bit (a trivial change), and if it were to use a HW-accelerated path to directly present that backbuffer (this is already the case), then the graphics drivers and hardware should already be handling the appropriate format conversions. As I mentioned, AMD, NVIDIA & others already provide solutions today for sending 30-bit content to the screen. In my opinion, the only missing link is in the WebGL spec, and the browser.

Florian, I'm not sure I understood your question about existing extensions. As far as I know, I don't need any particular extension to use 10-bpc & higher texture formats in OpenGL 3.0. For OpenGL ES 2.0, I believe the following extensions provide the formats I'm looking for: GL_EXT_texture_type_2_10_10_10_REV (which defines the UNSIGNED_INT_2_10_10_10_REV_EXT type) and GL_EXT_texture_storage. Perhaps inventing a WebGL version of these extensions would be a step in the right direction?
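[Editor's sketch] For reference, the 2_10_10_10_REV layout packs a pixel into a single 32-bit word with red in the low bits and 2-bit alpha on top. A sketch of the packing and its inverse:

```javascript
// Pack r, g, b in 0..1023 and a in 0..3 into one 32-bit word using
// the UNSIGNED_INT_2_10_10_10_REV layout: red occupies the low 10
// bits, then green, then blue, with 2-bit alpha in the top bits.
function pack1010102(r, g, b, a) {
  return ((a << 30) | (b << 20) | (g << 10) | r) >>> 0; // force unsigned
}

// Recover the four channel fields from a packed word.
function unpack1010102(word) {
  return {
    r: word & 0x3ff,
    g: (word >>> 10) & 0x3ff,
    b: (word >>> 20) & 0x3ff,
    a: (word >>> 30) & 0x3,
  };
}
```

The `>>> 0` matters in JavaScript because setting bit 31 would otherwise yield a negative signed value.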

Regards, Matt

On Tue, Sep 18, 2012 at 1:20 AM, Florian Bösch <pyalot@gmail.com> wrote:

On Tue, Sep 18, 2012 at 7:34 AM, Mark Callow <callow_mark@hicorp.co.jp> wrote:

Converting an RGB 10_10_10 image to 8-bit sRGB before displaying on
the screen is a reasonable solution. The wider formats are used to
permit computations in physically linear space without introducing as
many artifacts as 8-bit. All the information discernible by the human
visual system that is contained in 10-bit physically-linear images
can be displayed by an 8-bit sRGB perceptually-linear encoding.
Medical imaging unfortunately imposes somewhat more stringent
requirements than "perceptibility" or "looks pretty enough", and not
only out of a heightened sense of correctness. If a doctor is using
a pipeline that tampers with the produced colors -- and I'm sure
there's some arcane medical standard for this -- they become open to
liability through malpractice.
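[Editor's sketch] The collapse being debated here is easy to quantify: push every 10-bit linear level through the sRGB transfer curve and count how many distinct 8-bit codes survive. A toy measurement (whether the merged levels are clinically discernible is exactly the point in dispute):

```javascript
// Standard sRGB transfer curve: encode a linear value in [0,1].
function srgbEncode(l) {
  return l <= 0.0031308 ? 12.92 * l : 1.055 * Math.pow(l, 1 / 2.4) - 0.055;
}

// Map each of the 1024 linear 10-bit levels to its nearest 8-bit
// sRGB code and count how many distinct codes are actually used.
function distinctSrgbCodes() {
  const codes = new Set();
  for (let i = 0; i < 1024; i++) {
    codes.add(Math.round(255 * srgbEncode(i / 1023)));
  }
  return codes.size;
}
```

The count is necessarily at most 256, so many 10-bit linear levels share one 8-bit sRGB code (mostly in the highlights, where the curve is shallow), while near black some 8-bit codes are skipped entirely.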

-----------------------------------------------------------
You are currently subscribed to public_webgl@khronos.org.
To unsubscribe, send an email to majordomo@khronos.org with
the following command in the body of your email:
unsubscribe public_webgl
-----------------------------------------------------------
