
Re: [Public WebGL] Depth Texture Extension

Actually, the OES extension just takes 16- and 32-bit integers as source data. It doesn't *require* the precision to be preserved.

The spec says:

    As per the OpenGL ES spec, there is no guarantee that the OpenGL ES
    implementation will use the <type> to determine how to store the depth
    texture internally. It may choose to downsample the 32-bit depth values
    to 16-bit or even 24-bit. There is currently no way for the application
    to know or find out how the depth texture (or any texture) will be
    stored internally by the OpenGL ES implementation.

So INTZ is sufficient to implement *2D* depth textures.
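To make the "type is only a hint" point concrete, here is a minimal sketch of allocating a depth texture the way an OES_depth_texture / WEBGL_depth_texture client would, passing <type> = UNSIGNED_SHORT. Since no real GL context exists outside a browser, `gl` below is a hypothetical stub that merely records calls; the enum values are the standard GL ones.

```javascript
// Standard GL enum values (from the GL headers).
const GL = {
  TEXTURE_2D: 0x0DE1,
  DEPTH_COMPONENT: 0x1902,
  UNSIGNED_SHORT: 0x1403,
  UNSIGNED_INT: 0x1405,
};

// Hypothetical stub standing in for a real WebGL context.
const calls = [];
const gl = {
  createTexture: () => ({}),
  bindTexture: (...a) => calls.push(['bindTexture', ...a]),
  texImage2D: (...a) => calls.push(['texImage2D', ...a]),
};

// As the spec text above notes, <type> (UNSIGNED_SHORT vs UNSIGNED_INT)
// does not pin down the internal storage: the implementation may keep
// 16, 24, or 32 bits regardless of what is requested here.
function createDepthTexture(gl, width, height) {
  const tex = gl.createTexture();
  gl.bindTexture(GL.TEXTURE_2D, tex);
  gl.texImage2D(GL.TEXTURE_2D, 0, GL.DEPTH_COMPONENT, width, height, 0,
                GL.DEPTH_COMPONENT, GL.UNSIGNED_SHORT, null);
  return tex;
}

createDepthTexture(gl, 256, 256);
```

An INTZ-style 24-bit backing store would be a perfectly legal way for an implementation to satisfy this request.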


On 2012-01-23, at 2:33 PM, Kornmann, Ralf wrote:

Unity uses either the API hack INTZ format or renders to a depth buffer and a single-channel float color texture at the same time. The INTZ format uses a 24-bit integer depth value plus an 8-bit stencil. Neither option matches the OpenGL ES extension, which requires 16- or 32-bit integer depth values.


From: owner-public_webgl@khronos.org [owner-public_webgl@khronos.org] on behalf of Florian Bösch [pyalot@gmail.com]
Sent: Monday, 23 January 2012 20:09
To: Gregg Tavares (勤)
Cc: public webgl
Subject: Re: [Public WebGL] Depth Texture Extension

On Mon, Jan 23, 2012 at 7:40 PM, Gregg Tavares (勤) <gman@google.com> wrote:

I believe there are issues trying to emulate this in D3D.
According to this page, in Direct3D 9 a depth texture has the format R32F: http://unity3d.com/support/documentation/Components/SL-DepthTextures.html

This page mentions that the formats D16 and D24 would also work: http://aras-p.info/texts/D3D9GPUHacks.html

In OpenGL, support for them is defined by this extension, which provides the formats DEPTH_COMPONENT[16|24|32] (presumably floating-point textures): http://www.opengl.org/registry/specs/ARB/depth_texture.txt

I'm also not sure if desktop hardware supports depth cubemaps, which OES_depth_texture requires.
(not even sure what a depth cubemap is)
If you attach a cubemap side to an FBO and render something into it, the cubemap depth texture would be the recipient of the depth values for that side.
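To illustrate what that attachment would look like in practice, here is a sketch of binding one face of a cube depth texture as an FBO's depth attachment; a subsequent draw would then write that side's depth values into the cubemap. As before, `gl` is a hypothetical call-recording stub (there is no real context outside a browser), and the enum values are the standard GL ones.

```javascript
// Standard GL enum values (from the GL headers).
const GL = {
  TEXTURE_CUBE_MAP: 0x8513,
  TEXTURE_CUBE_MAP_POSITIVE_X: 0x8515, // faces run +X..-Z = 0x8515..0x851A
  DEPTH_COMPONENT: 0x1902,
  UNSIGNED_SHORT: 0x1403,
  FRAMEBUFFER: 0x8D40,
  DEPTH_ATTACHMENT: 0x8D00,
};

// Hypothetical stub standing in for a real WebGL context.
const calls = [];
const gl = {
  createTexture: () => ({}),
  createFramebuffer: () => ({}),
  bindTexture: (...a) => calls.push(['bindTexture', ...a]),
  texImage2D: (...a) => calls.push(['texImage2D', ...a]),
  bindFramebuffer: (...a) => calls.push(['bindFramebuffer', ...a]),
  framebufferTexture2D: (...a) => calls.push(['framebufferTexture2D', ...a]),
};

// Allocate depth storage for one cube face...
const tex = gl.createTexture();
gl.bindTexture(GL.TEXTURE_CUBE_MAP, tex);
gl.texImage2D(GL.TEXTURE_CUBE_MAP_POSITIVE_X, 0, GL.DEPTH_COMPONENT,
              256, 256, 0, GL.DEPTH_COMPONENT, GL.UNSIGNED_SHORT, null);

// ...then attach that face as the FBO's depth attachment. Whether real
// drivers accept this combination is exactly the open question here.
const fbo = gl.createFramebuffer();
gl.bindFramebuffer(GL.FRAMEBUFFER, fbo);
gl.framebufferTexture2D(GL.FRAMEBUFFER, GL.DEPTH_ATTACHMENT,
                        GL.TEXTURE_CUBE_MAP_POSITIVE_X, tex, 0);
```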

I'm not sure that OpenGL/Direct3D actually lacks the capacity to use a cube side as a depth texture; other than the comment in the ARB extension, there doesn't seem to be anything explicitly making it impossible. Likewise, I'm not sure that any mobile hardware actually has the capacity to use a cube side as a depth texture. These capabilities should be tested.

There seem to be these two issues:
#1 OpenGL ES specifies depth textures as unsigned int and unsigned short; OpenGL/Direct3D specify them as floats. Correct me if this is a wrong impression.
#2 Support for attaching cube side targets as depth textures is doubtful.

I don't think that issue #2 should make this extension impossible. How to handle issue #1 (int vs. float) would be of some concern in situations where a user wishes to provide depth values to the texImage2D entry points; otherwise it shouldn't matter that much.

                        Daniel Koch -+- daniel@transgaming.com
Senior Graphics Architect -+- TransGaming Inc.  -+- www.transgaming.com