
Re: [Public WebGL] plugging floating point holes

We largely fixed this:
If OES_texture_float is enabled, you can make 32F textures, and sample from them without filtering.
If OES_texture_float_linear is enabled, 32F textures may additionally be filtered (sampled from with LINEAR).
If WEBGL_color_buffer_float is enabled, RGBA32F becomes a color-renderable effective internal format for textures and renderbuffers. Also, RGBA/FLOAT supplants RGBA/UNSIGNED_BYTE as the default supported format/type for ReadPixels.
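The three-extension pattern above can be queried at runtime. A minimal sketch (the extension names are the ones from the WebGL extension registry; the helper name `queryFloatSupport` is mine, not part of any spec):

```javascript
// Report which parts of the float-texture pattern a WebGL 1 context supports.
function queryFloatSupport(gl) {
  return {
    // can create 32F textures and sample them with NEAREST
    sample: !!gl.getExtension('OES_texture_float'),
    // can sample 32F textures with LINEAR filtering
    filter: !!gl.getExtension('OES_texture_float_linear'),
    // RGBA32F is color-renderable, RGBA/FLOAT readable via ReadPixels
    render: !!gl.getExtension('WEBGL_color_buffer_float'),
  };
}
```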

This same pattern applies to half-floats as well, via OES_texture_half_float, OES_texture_half_float_linear, and EXT_color_buffer_half_float.
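One half-float wrinkle worth illustrating: in WebGL 1 the half-float type constant lives on the extension object rather than on the context. A sketch (the helper name is mine):

```javascript
// Allocate a 16F RGBA texture under OES_texture_half_float.
// Note that HALF_FLOAT_OES comes from the extension object, not gl.
function createHalfFloatTexture(gl, width, height) {
  const ext = gl.getExtension('OES_texture_half_float');
  if (!ext) return null; // half-float textures unsupported
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
                gl.RGBA, ext.HALF_FLOAT_OES, null);
  // NEAREST only, unless OES_texture_half_float_linear is also present
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
  return tex;
}
```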

Historically, it was pretty poorly specified, but we're in a better state today. Implementations which do not match what I state above are not (I believe) in compliance with the relevant extension specs.

On Fri, Aug 7, 2015 at 1:11 AM, Florian Bösch <pyalot@gmail.com> wrote:
A question I often receive from clients is:

How many users have support for rendering to, filtering, and blending of any kind of float texture?

Unfortunately I can't answer that question, not with webglstats.com anyway.

I can't answer it because the extensions needed to answer these questions aren't universally implemented, and some vendors have shown no clear path or intention to implement them, in particular the color_buffer_float/half-float variants (which some vendors enable implicitly).

It's been suggested to me multiple times to just test whether floating-point textures can be rendered into. Unfortunately that isn't feasible for webglstats.com.

If I do that, I'd have to test RGB, RGBA, Float32, Float16, etc. I can skip the less popular formats, but it still involves creating a buffer, a shader, a framebuffer object, and a texture; attaching the texture to the framebuffer; setting up attribute pointers; drawing a quad to fill the floating-point texture; creating a second framebuffer and a byte texture; rendering the result of the floating-point texture into the byte texture (because floating-point readback isn't necessarily supported either); reading back the byte texture; and finally checking whether the rendering worked.

This process takes somewhere between 200 and 500 ms, depending on the machine. During that time the user agent is either unresponsive or stuttery, and it may or may not delay loading of the page.

As you can imagine, it isn't acceptable for webglstats to interfere with the loading of other people's pages any more than necessary, and a multi-hundred-millisecond impact is simply something I cannot afford.
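For reference, the cheap first step of such a probe (attach the texture and ask for framebuffer completeness) can be sketched as below. This is only a negative filter: as described above, a definitive answer still requires actually drawing into the texture and reading the result back through a byte texture, because some drivers report a complete framebuffer that nevertheless renders garbage. The helper name is mine:

```javascript
// Attach a 1x1 RGBA texture of the given pixel type (gl.FLOAT, or
// HALF_FLOAT_OES from the extension object) to a framebuffer and
// check for completeness. Returns false if the format is definitely
// not renderable; true only means "not ruled out yet".
function isColorAttachable(gl, type) {
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 1, 1, 0, gl.RGBA, type, null);
  const fbo = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                          gl.TEXTURE_2D, tex, 0);
  const ok = gl.checkFramebufferStatus(gl.FRAMEBUFFER) === gl.FRAMEBUFFER_COMPLETE;
  gl.bindFramebuffer(gl.FRAMEBUFFER, null);
  gl.deleteFramebuffer(fbo);
  gl.deleteTexture(tex);
  return ok;
}
```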

Additionally, reading back floating-point data is altogether unspecified (in any case, there is no extension for it).
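For completeness: WebGL 1 does inherit from OpenGL ES 2.0 a query for the one extra ReadPixels format/type pair an implementation accepts for the currently bound framebuffer (RGBA/UNSIGNED_BYTE is always mandated; anything else is implementation-defined). Assuming the implementation reports this honestly, it can at least detect whether a float readback path is advertised. A sketch, with a helper name of my own invention:

```javascript
// Query the implementation-defined ReadPixels pair for the framebuffer
// currently bound, and report whether it is RGBA/FLOAT.
function supportsFloatReadback(gl) {
  const fmt = gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_FORMAT);
  const type = gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_TYPE);
  return fmt === gl.RGBA && type === gl.FLOAT;
}
```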

So floating-point textures remain an incompatible, unknown, dysfunctional mess, more than four years after the introduction of WebGL and the universal acknowledgement that floating-point textures are an extraordinarily useful feature.

This situation is a serious obstacle for me personally and for WebGL as a whole. It isn't hardware support that's missing: some variant of floating-point support is available on nearly every semi-modern GPU in existence. What's broken is the software integration of floating-point textures on the UAs' side, and it remains botched to this day.

This cannot go on, and WebGL 2 isn't the answer, because it will be years and years before one can feasibly drop WebGL 1 support.

Could representatives of every UA please share their suggestions on how they plan to fix this?