
Re: [Public WebGL] Adding internalformat param to all texImage2d variants



On May 14, 2010, at 12:06 PM, Kenneth Russell wrote:

> On Fri, May 14, 2010 at 8:08 AM, Chris Marrin <cmarrin@apple.com> wrote:
>> 
>> On May 14, 2010, at 6:19 AM, Johannes Behr wrote:
>> 
>>> Hi Chris,
>>> 
>>> thank you and everyone involved in this effort!
>>> 
>>> As far as the X3D/X3DOM relevance goes:
>>> 
>>> This proposal would certainly help to control the final format, but
>>> we would still need more information about the original format.
>>> 
>>> How are we supposed to know whether there were RGB channels in the original format or not?
>>> 
>>> Remember: we do not have a controlled data-preparation step or process
>>> in X3DOM-based applications. All we get inside the X3D stream is
>>> a reference to the external image data; no additional information on the
>>> format is provided inside the XML stream.
>>> 
>>> But, according to the X3D spec, we must shade the geometry differently
>>> depending on the channel setup of the image (wise or not):
>>> 
>>> One example:
>>> 
>>> If we have an A, L or LA image, we use the diffuseColor from the Material node.
>>> If we have an RGB or RGBA image, the diffuseColor is not modulated.
>>> 
>>> http://x3dom.org/x3dom/example/x3dom_imageChannels.xhtml
>>> 
>>> This can easily be done at runtime as part of the standard shader,
>>> but we have to know the channel setup up front.
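>>> 
>>> For example, roughly (the uniform names are made up; uTexChannels
>>> would be set by the application once it knows the channel setup,
>>> and the alpha-only case is glossed over):
>>> 
>>>   precision mediump float;
>>>   uniform sampler2D uTexture;
>>>   uniform vec4 uDiffuseColor;  // from the X3D Material node
>>>   uniform int uTexChannels;    // 1/2 = A, L or LA; 3/4 = RGB(A)
>>>   varying vec2 vTexCoord;
>>> 
>>>   void main() {
>>>     vec4 texel = texture2D(uTexture, vTexCoord);
>>>     if (uTexChannels <= 2) {
>>>       // L or LA: modulate the material's diffuseColor
>>>       gl_FragColor = vec4(uDiffuseColor.rgb * texel.rgb, texel.a);
>>>     } else {
>>>       // RGB or RGBA: the texture color is used as-is
>>>       gl_FragColor = texel;
>>>     }
>>>   }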
>> 
>> Yes, this is the original discussion. But you only need to know the original format because that's how X3D defines it today. In fact X3D is deficient here: if you have an RGB image and want to use it as an alpha channel, you can't. X3D has no property for indicating the format in which the image should be stored in texture memory.
>> 
>> My proposal would change the rules of the X3D ImageTexture node: you'd add a format property that specifies what format you want the texture to be. There might be some incompatibilities with a small amount of existing X3D content, but I think you'd end up with a better ImageTexture spec.
>> 
>> The browser vendors (including me) should look at their implementations to see whether it's even practical to obtain and keep the original source image format. If it is, we could certainly allow a "SOURCE_FORMAT" internal-format value that keeps the format intact (although this would likely require conversion in some cases, since most browsers keep images around as RGB(A)). In that case we could also add a get() method that returns the current format of a given texture object (maybe such a method already exists?).
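>> 
>> As a strawman (hypothetical names -- neither a SOURCE_FORMAT enum nor
>> such a getter exists today), using the texImage2D signature proposed
>> in this thread:
>> 
>>   // ask for the source image's own format to be preserved
>>   gl.texImage2D(gl.TEXTURE_2D, 0, gl.SOURCE_FORMAT, image);
>> 
>>   // later, ask what format the texture actually ended up with
>>   var fmt = gl.getTexLevelParameter(gl.TEXTURE_2D, 0,
>>                                     gl.TEXTURE_INTERNAL_FORMAT);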
> 
> In desktop GL the internal format of a texture can be fetched with
> glGetTexLevelParameteriv(GL_TEXTURE_2D, /level/,
> GL_TEXTURE_INTERNAL_FORMAT, ...). Unfortunately
> glGetTexLevelParameter[if]v were removed in OpenGL ES 2.0, as was the
> GL_TEXTURE_INTERNAL_FORMAT enum. We could reintroduce them in WebGL
> but we'd have to emulate all of the functionality when running on
> native OpenGL ES 2.0 (and probably on desktop GL as well for
> consistency).
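> 
> The emulation itself would be cheap: just remember the internalformat
> passed in for each texture level. A JavaScript-level sketch (made-up
> wrapper names, again using the texImage2D signature proposed in this
> thread):
> 
>   function trackedTexImage2D(gl, tex, level, internalformat, image) {
>     gl.bindTexture(gl.TEXTURE_2D, tex);
>     gl.texImage2D(gl.TEXTURE_2D, level, internalformat, image);
>     // shadow the internal format on the texture object itself
>     if (!tex._internalFormats) tex._internalFormats = {};
>     tex._internalFormats[level] = internalformat;
>   }
> 
>   // emulated GL_TEXTURE_INTERNAL_FORMAT query
>   function getTexLevelInternalFormat(tex, level) {
>     return tex._internalFormats && tex._internalFormats[level];
>   }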


I would feel comfortable doing that if the browser implementors agree that this is information we can get from the HTML image element.

-----
~Chris
cmarrin@apple.com




