Re: [Public WebGL] Render to texture
On Mon, Aug 30, 2010 at 12:20 PM, <email@example.com> wrote:
> So is going via the DOM/canvas mechanism likely to hurt performance with
> present implementations? Would the image be copied from place to place?
> (I confess my knowledge of what's going on here is a little vague - please
> educate me!) Is there no way to render directly into a texture without
> involving a canvas?
Yes, there will currently be a significant performance penalty
involved with rendering WebGL to one canvas and then uploading it as a
texture to another one (or even within the same context -- although
you can get the same effect with copyTexSubImage2D). This is true at
least in the WebKit WebGL implementation.
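For reference, the within-context copy mentioned above boils down to a single call. A minimal sketch, assuming a texture is already bound and was allocated at least `size` x `size` via texImage2D (the function name and parameters here are illustrative, not from any library):

```javascript
// Sketch: copy the lower-left `size` x `size` region of the currently
// bound framebuffer into the currently bound texture at offset (0, 0).
// Assumes the destination texture is already allocated large enough.
function copyFramebufferToTexture(gl, size) {
  gl.copyTexSubImage2D(gl.TEXTURE_2D, 0, 0, 0, 0, 0, size, size);
}
```

This avoids the canvas-to-canvas upload, but still performs a copy; the FBO approach below renders into the texture directly.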
The recommended way to perform render-to-texture of the color buffers
in WebGL is to allocate an FBO and attach a texture as the color
attachment. This functionality is supposed to be guaranteed on any
platform claiming WebGL support. Note, though, that the OpenGL ES 2.0
NPOT restrictions still apply: a non-power-of-two texture can't use
mipmaps or REPEAT wrap modes, so render targets are typically allocated
at power-of-two sizes (or used with CLAMP_TO_EDGE and no mipmapping).
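A minimal sketch of that FBO setup, assuming an existing WebGLRenderingContext `gl` and power-of-two dimensions (the function and variable names are illustrative, not from any particular library):

```javascript
// Sketch: create a render target by attaching a texture as the FBO's
// color attachment, plus a depth renderbuffer so depth testing works
// while rendering into the FBO.
function createRenderTarget(gl, width, height) {
  // Color attachment: an RGBA texture. CLAMP_TO_EDGE and no mipmaps
  // keep it legal even at NPOT sizes (see the NPOT note above).
  var texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
                gl.RGBA, gl.UNSIGNED_BYTE, null);

  // Depth renderbuffer (not readable as a texture, but enables the
  // depth test during the render-to-texture pass).
  var depth = gl.createRenderbuffer();
  gl.bindRenderbuffer(gl.RENDERBUFFER, depth);
  gl.renderbufferStorage(gl.RENDERBUFFER, gl.DEPTH_COMPONENT16,
                         width, height);

  var fbo = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                          gl.TEXTURE_2D, texture, 0);
  gl.framebufferRenderbuffer(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT,
                             gl.RENDERBUFFER, depth);

  if (gl.checkFramebufferStatus(gl.FRAMEBUFFER) !==
      gl.FRAMEBUFFER_COMPLETE) {
    throw new Error('Framebuffer incomplete');
  }
  gl.bindFramebuffer(gl.FRAMEBUFFER, null);
  return { fbo: fbo, texture: texture };
}
```

To use it, bind `result.fbo`, set the viewport to the target's size, draw the scene, rebind the null framebuffer, and then sample `result.texture` in a later pass.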
> The render-depth-into-RGBA approach is perfectly OK for shadow mapping
> (because you already need a separate render pass in order to render from
> the point of view of the light source). But for the other uses of depth
> textures, you need an entire extra render pass from the perspective of the
> camera to capture the Z data.
> That's not a bad idea on a C++-based OpenGL system, since the savings
> from occlusion testing pay for the CPU & vertex cost of the extra
> pass, and the benefit of a cheaper shader on that pass - plus fewer
> pixels hit on subsequent passes - pays for the extra fill-rate cost.
> But here that extra pass is a fairly large expense, and without the
> benefit of occlusion testing to win it back, it's a painful thing.
> -- Steve
>> Of course we're using WebGL, so you have the DOM available. You could
>> _technically_ render to one WebGL canvas and then load that canvas as
>> a texture into another one (and presumably, if it were deemed
>> important enough, implementations would optimize that use case to
>> avoid too much copying).
>> On Aug 30, 2010, at 10:31 AM, Kenneth Russell wrote:
>>> On Mon, Aug 30, 2010 at 8:29 AM, <firstname.lastname@example.org> wrote:
>>>> Are there any examples out there of the correct way to set up for
>>>> rendering to texture - and (especially) setting up rendering to and
>>>> reading back from depth buffer textures? Getting the latter to work
>>>> portably is always a bitch... I'd like to get the 'official' way to do it.
>>> Rendering to a depth texture isn't supported in core OpenGL ES 2.0,
>>> and is therefore not available in WebGL without an extension
>>> (GL_OES_depth_texture).
>>> However, you can approximate it by writing your normalized depth value
>>> into the color channels of an RGBA texture. One example of this
>>> technique is in the Shadow Mapping sample of the O3D/WebGL library;
>>> see http://code.google.com/p/o3d/wiki/Samples . There's another in
>>> SpiderGL; see http://www.spidergl.org/example.php?id=6 .
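The encoding those samples use can be illustrated in plain JavaScript: treat the normalized depth as a base-256 fraction and store one "digit" per 8-bit channel. (In GLSL the pack is usually done with fract() and the unpack with a dot() against powers of 1/256; the function names below are mine, not from either sample.)

```javascript
// Sketch of the depth-in-RGBA trick: pack a normalized depth value in
// [0, 1) into four 8-bit channels, then recover it.
function packDepth(depth) {
  var bytes = [];
  var v = depth;
  for (var i = 0; i < 4; i++) {
    v *= 256;
    var digit = Math.floor(v); // one base-256 "digit" per channel
    v -= digit;
    bytes.push(digit);
  }
  return bytes; // [R, G, B, A], each 0..255
}

function unpackDepth(bytes) {
  return bytes[0] / 256 +
         bytes[1] / 65536 +
         bytes[2] / 16777216 +
         bytes[3] / 4294967296;
}
```

The round-trip error is below 2^-32, which is far more depth precision than a single 8-bit channel would give; the comparison in the shadow-map lookup then happens on the unpacked value.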
You are currently subscribed to email@example.com.
To unsubscribe, send an email to firstname.lastname@example.org with
the following command in the body of your email: