
Re: [Public WebGL] Adding internalformat param to all texImage2d variants



On 2010-05-20 12:35, Cedric Vivier wrote:
> On Thu, May 20, 2010 at 17:30, Tim Johansson <timj@opera.com> wrote:
>> On 2010-05-20 11:14, Cedric Vivier wrote:
>>> ES section 4.4.7 second paragraph defines that conversion takes place.
>>> The conversion rules used are once again the same as in the proposed
>>> conversion table, i.e. ES table 3.12, GL tables 3.11 and 3.20 (or 3.20/23
>>> depending on the spec version).
>> So if you create a texture from an Image object, then bind it to an FBO and
>> render full RGBA data to it, the result will be different depending on the
>> internal format?
>> If the original image was RGB, the alpha channel could be lost when I render
>> to it, but it might not be, depending on the implementation.
> Good point :)
> But is this really an issue?
> I mean, this could happen only if the developer lets the
> implementation decide the internalformat of a texture
> (0/NONE/DONT_CARE) that he will be using for render-to-texture usage.
> It implies the developer does not mind potential loss of information
> on the render-to-texture result (meaning here, that the developer does
> not need/want alpha).

Unspecified or incompatible behavior usually comes back to haunt us, even if we don't think it should matter much. I can think of a few cases where you might be tempted to do something like that - for example, if you have a grayscale (luminance) image and want to render a red X on top of it - so I think this will give us problems at some point if we don't specify it properly.
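To make the grayscale case concrete, here is a rough simulation (invented function, not a real WebGL API) of how the same RGBA render result reads back differently depending on which internal format the implementation happened to choose for the source image, following the conversions in ES tables 3.11/3.12:

```javascript
// Store an RGBA value into a texture of the given internal format, then read
// it back - mimicking the channel conversions applied on upload and sampling.
function roundTrip(internalFormat, rgba) {
  const [r, g, b, a] = rgba;
  switch (internalFormat) {
    case "RGBA":
      return [r, g, b, a];   // all channels survive
    case "RGB":
      return [r, g, b, 1];   // alpha is dropped
    case "LUMINANCE":
      return [r, r, r, 1];   // a single channel is kept, sampled as (L, L, L, 1)
    default:
      throw new Error("unhandled format " + internalFormat);
  }
}

const red = [1, 0, 0, 0.5];
roundTrip("RGBA", red);      // [1, 0, 0, 0.5] - the red X reads back intact
roundTrip("LUMINANCE", red); // [1, 1, 1, 1]   - the red X comes back gray
```

So two conforming implementations could return different pixels for the same page, purely based on their internalformat choice.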

> We could either consider that it is the responsibility of the
> developer who performs render-to-texture to make sure the target
> texture has been loaded with the intended format, or - perhaps - we could
> generate an error when attempting to attach to an FBO a WebGLTexture that
> has been loaded with an internalformat of the implementation's choosing,
> to force this to be explicit. What do you think?

Like I said above, I don't think we should leave it up to the developers.

The only options left I can think of are either making sure the internal format is always the same for a given image (which could be done by removing the fallback to RGBA, or by always creating RGB(A) textures) or your suggestion to not allow textures with "automatic" format to be used as render targets.

It doesn't seem like a big limitation to have to specify the internal format in order to use a texture as a render target. It would mean the WebGL implementation has to remember whether the texture format was automatically chosen (which is just a flag in the texture object). It might also be a bit confusing that rendering to some textures doesn't work; maybe we can solve that with better documentation.
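As a sketch of that flag (the class and function names are invented for illustration, not the real WebGL API), the bookkeeping and the check at attach time could look like this:

```javascript
// Hypothetical implementation-side texture object tracking whether the
// internal format was chosen by the implementation rather than the page.
class TrackedTexture {
  constructor() {
    this.internalFormat = null;
    this.formatWasAutomatic = false; // the single flag discussed above
  }
  // internalFormat === 0 stands in for "implementation's choice".
  texImage2D(internalFormat) {
    this.formatWasAutomatic = internalFormat === 0;
    this.internalFormat = internalFormat === 0 ? "RGBA" : internalFormat;
  }
}

// Sketch of the validation framebufferTexture2D could perform:
function attachToFramebuffer(texture) {
  if (texture.formatWasAutomatic) {
    return "INVALID_OPERATION"; // refuse textures with automatic format
  }
  return "NO_ERROR";
}
```

A texture uploaded with an explicit format attaches normally; one uploaded with format 0 is rejected, making the developer's intent explicit.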


If the GLES man pages are correct, there is also an issue with copyTexSubImage2D: it requires the internal format of the texture you are copying to to be a subset of the internal format of the framebuffer. If you have an RGB framebuffer, it would be possible to copy the data from it to another RGB texture, but not to an RGBA texture. So if the internal format varies between implementations, copyTexSubImage2D could fail in browsers that always store the data as RGBA, but succeed in browsers that use RGB when there is no alpha channel.
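The subset rule can be expressed as a simple channel check. This is an illustrative sketch of the constraint, not spec text; the function name is invented:

```javascript
// Channels carried by each GLES internal format.
const CHANNELS = {
  ALPHA:           ["a"],
  LUMINANCE:       ["r"],
  LUMINANCE_ALPHA: ["r", "a"],
  RGB:             ["r", "g", "b"],
  RGBA:            ["r", "g", "b", "a"],
};

// copyTexSubImage2D is only valid when every channel of the destination
// texture is present in the source framebuffer.
function canCopyTexSubImage(framebufferFormat, textureFormat) {
  const available = new Set(CHANNELS[framebufferFormat]);
  return CHANNELS[textureFormat].every((c) => available.has(c));
}

canCopyTexSubImage("RGB", "RGB");   // true  - same channels
canCopyTexSubImage("RGB", "RGBA");  // false - framebuffer has no alpha
canCopyTexSubImage("RGBA", "RGB");  // true  - RGB is a subset of RGBA
```

With an automatic internal format, whether the copy hits the false case would depend on which format the implementation picked, which is exactly the portability problem.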


The same limitation applies to copyTexImage2D, but since that call changes the internal format, it should only matter when a texture with an automatic internal format is bound as a framebuffer attachment. I think that case would be solved by fixing the framebuffer-binding issue in general.

I'm still a bit worried that there are more potential incompatibilities we have not thought about yet, so not specifying strictly what format will be used seems scary to me.

//Tim

-----------------------------------------------------------
You are currently subscribed to public_webgl@khronos.org.
To unsubscribe, send an email to majordomo@khronos.org with
the following command in the body of your email: