
Re: [Public WebGL] WebGL 1.0 ratified and released



Speaking of 16-bit depth buffers... any chance of getting OES_depth24
ported to WebGL?
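
(For the record, no such extension string is exposed in WebGL today; a
quick probe, with the GLES name 'OES_depth24' carried over purely as an
assumption about what a WebGL port would be called:)

    var canvas = document.createElement('canvas');
    var gl = canvas.getContext('experimental-webgl');

    // 'OES_depth24' reuses the OpenGL ES extension string -- an assumed
    // name for a WebGL port. Current implementations return null here.
    console.log(gl.getExtension('OES_depth24'));
    console.log(gl.getSupportedExtensions());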

WebKit nightly appears to always use 16-bit depth buffers on a test
machine, even when 24-bit depth buffers are available. On the same
machine (a late-2010 iMac), Chrome renders the default framebuffer with
24-bit depth but refuses to create a 24-bit depth attachment for a user
framebuffer. Firefox 4 beta 12 always upgrades the precision.
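
For anyone who wants to reproduce this, the allocated precision can be
checked at runtime with core WebGL 1.0 calls alone; a minimal sketch:

    var gl = document.createElement('canvas').getContext('experimental-webgl');

    // Depth bits actually backing the default (on-screen) framebuffer.
    console.log('canvas depth bits:', gl.getParameter(gl.DEPTH_BITS));

    // DEPTH_COMPONENT16 is the only sized depth format core WebGL 1.0
    // offers; the driver may silently allocate more, which is what the
    // renderbuffer query reports.
    var rb = gl.createRenderbuffer();
    gl.bindRenderbuffer(gl.RENDERBUFFER, rb);
    gl.renderbufferStorage(gl.RENDERBUFFER, gl.DEPTH_COMPONENT16, 256, 256);
    console.log('renderbuffer depth bits:',
        gl.getRenderbufferParameter(gl.RENDERBUFFER,
                                    gl.RENDERBUFFER_DEPTH_SIZE));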

iOS appears to always provide OES_depth24.

This behavior (especially Chrome upgrading in some places but not
others) leads to nasty, confusing picking bugs: the visible render
looks fine, but the pixel sampled from the pick buffer is completely
wrong.
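
Concretely, the failure mode is the standard color-ID picking setup; a
minimal sketch (gl as above; w, h, x, y and drawSceneWithIdColors() are
assumed names, the latter drawing each object in a unique flat color):

    // Off-screen pick target. The depth attachment is requested as
    // DEPTH_COMPONENT16; if the visible canvas was given 24 bits, the
    // two passes can resolve their depth tests differently.
    var fbo = gl.createFramebuffer();
    gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);

    var tex = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, tex);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, w, h, 0,
                  gl.RGBA, gl.UNSIGNED_BYTE, null);
    gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                            gl.TEXTURE_2D, tex, 0);

    var depth = gl.createRenderbuffer();
    gl.bindRenderbuffer(gl.RENDERBUFFER, depth);
    gl.renderbufferStorage(gl.RENDERBUFFER, gl.DEPTH_COMPONENT16, w, h);
    gl.framebufferRenderbuffer(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT,
                               gl.RENDERBUFFER, depth);

    drawSceneWithIdColors();

    // With only 16 depth bits, nearly coplanar surfaces can z-fight
    // here, so this pixel can name the wrong object even though the
    // visible (possibly 24-bit) render looks fine. h - y - 1 flips
    // from top-left mouse coordinates to GL's bottom-left origin.
    var pixel = new Uint8Array(4);
    gl.readPixels(x, h - y - 1, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, pixel);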

Have I missed something? Is this the intended behavior? Are the 16-bit
buffers fixed-point or half-floats? If I recall the spec correctly,
fixed-point cannot be upgraded to floating point but floating point
precision is always upgradable.

Thanks,

David Sheets

On Tue, Mar 8, 2011 at 5:45 PM, Chris Marrin <cmarrin@apple.com> wrote:
>
>
> On Mar 8, 2011, at 1:30 PM, L. Van Warren wrote:
>
>> I’m an old graphics guy who trained in the Utah Graphics Lab in the mid-1980s.
>>
>> Looking over the new WebGL standard offers a moment to reflect on the great strides that have been taken.
>>
>> Public standards like this are really important.
>>
>> My only concern is that double precision arithmetic is not being supported.
>>
>> Perhaps on modern GPUs 64-bit arithmetic is a bottleneck or a hassle; I don’t know.
>>
>> I just recall a family of problems which went away when we computed in double precision.
>>
>> The images were much crisper and easier to filter properly.
>>
>> These issues came up in numerically brittle intersection calculations, such as the intersection of nearly parallel lines under perspective.
>>
>> Other than this it looks great.
>
> The Typed Array spec does support double-precision floats, and all numbers in JS are doubles. It's only WebGL that can't accept that data when communicating with the GPU, and that is a limitation of OpenGL ES 2.0. The OpenGL API "syntax" handles double-precision data just fine, and I have no doubt that a future version of GLES (and therefore WebGL) will support it. Heck, there's still embedded hardware out there that is limited to 16-bit depth buffers!
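
For anyone hitting this today: the workable recipe is to keep the math
in doubles on the JS side and narrow explicitly at the upload boundary.
A minimal sketch (gl as above):

    // JS numbers (and Float64Array) are doubles; WebGL vertex data is
    // not, so the narrowing has to happen explicitly before upload.
    var positions = new Float64Array([0.0, 0.5, -0.5, -0.5, 0.5, -0.5]);

    var buf = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, buf);
    // new Float32Array(positions) copies and rounds every element to
    // single precision.
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(positions),
                  gl.STATIC_DRAW);
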
>
> -----
> ~Chris
> cmarrin@apple.com
>

-----------------------------------------------------------
You are currently subscribed to public_webgl@khronos.org.
To unsubscribe, send an email to majordomo@khronos.org with
the following command in the body of your email:
unsubscribe public_webgl
-----------------------------------------------------------