
Re: [Public WebGL] texImage2D changed...

On Mon, Aug 9, 2010 at 11:57 AM, <steve@sjbaker.org> wrote:
> On Mon, Aug 9, 2010 at 11:13 AM, <steve@sjbaker.org> wrote:

>> I looked at the WebGL spec and some of the working example code to see
>> how we load images from URL's nowadays...and as a quick test, I tried
>> this:
>>    gl.texImage2D ( gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA,
>>                                      gl.UNSIGNED_BYTE, image);
>> That gets my program running again...but now my textures are all
>> screwed up.  I'm using PNG and my textures are a mix of RGB's and
>> RGBA's - with the occasional monochrome PNG tossed in for good measure.
>> So presumably I need to set the two format parameters according to
>> what's actually in the PNG file.
> No, you don't have to set those to match the png file. The WebGL spec
> details that WebGL will convert the image to the format you specify.
> The new API is far more flexible than the old one.

Ah!  Thank goodness for that!  But the documentation is far from clear -
and what I think it says to do doesn't work.

The closest I could find to an explanation is:

  "The source image data is conceptually first converted
   to the data type and format specified by the format
   and type arguments, and then transferred to the OpenGL
   implementation."
...so I suppose it's saying:

  Actual File Format ==> externalFormat ==> internalFormat

Which would suggest that if I unconditionally set both format parameters
to gl.RGBA - then it should convert everything to 4 bytes per texel no
matter whether my PNG has 1,2,3 or 4 bytes - presumably setting A=1 if the
source image is an RGB-only PNG and spreading greyscale PNG's out into
full RGBA's.

If so, that's kinda wasteful if the file is really a 1-byte
luminance-only thing.

Well, if you know it's a luminance texture then you can choose GL_LUMINANCE as your format. WebGL couldn't do this automagically because the way channels are presented to the shader differs depending on the format. So, regardless, you have to tell it the format you want, since only you know how you intend to use the texture. This also means you can load RGBA images as GL_LUMINANCE, which gives you more flexibility, because many browsers don't support 1-channel textures internally - they always expand to RGBA. So now you can take a JPG, a GIF, or a VIDEO, and convert it to 1 channel, or convert it to RGBA and UNSIGNED_SHORT_5_6_5, or whatever you want.  That's far more flexible than it was.
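To make that concrete, here is a minimal sketch of the conversions described above. `uploadAs` and `makeRecordingGL` are made-up names for illustration; in a page you would pass the real WebGL context and an HTMLImageElement (or video element) instead of the recording stub.

```javascript
// Sketch only: 'uploadAs' is a hypothetical helper, not part of WebGL.
function uploadAs(gl, img, format, type) {
  // Same source image, different texture layouts: per the spec wording,
  // WebGL converts the image to the format/type you pass.
  gl.texImage2D(gl.TEXTURE_2D, 0, format, format, type, img);
}

// Records calls instead of talking to the GPU, so the sketch can run
// outside a browser. Enum values are the standard OpenGL ES 2.0 ones.
function makeRecordingGL() {
  const calls = [];
  return {
    TEXTURE_2D: 0x0DE1, LUMINANCE: 0x1909, RGB: 0x1907,
    UNSIGNED_BYTE: 0x1401, UNSIGNED_SHORT_5_6_5: 0x8363,
    texImage2D: (...args) => calls.push(args),
    calls: calls,
  };
}

const gl = makeRecordingGL();
const img = {};  // stands in for an HTMLImageElement / video frame
uploadAs(gl, img, gl.LUMINANCE, gl.UNSIGNED_BYTE);   // any image -> 1 channel
uploadAs(gl, img, gl.RGB, gl.UNSIGNED_SHORT_5_6_5);  // any image -> 16 bpp
```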

But perhaps the word "conceptually" in the spec means that it's not REALLY
going to allocate 4 bytes per texel - but merely arrange that when the
shader reads it, it'll appear as if there were 4 bytes present.  That
would make sense...but it really ought to be clearer on what it'll
actually do.  If we're planning on making this work on itty-bitty
cellphones, we can't afford to waste texture memory - so the spec needs to
be really clear on what will happen.

The wording should make it clear whether I can be lazy and always say
gl.RGBA and rely on the underlying implementation not to waste texture
memory - or whether I still have to parse the image file in order to ask
for the internalFormat that's efficient for whatever file format I happen
to have been handed by my art tools.
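If it does turn out that matching the file matters, the "parsing" would only need to recover the channel count - something like this hypothetical helper (not part of WebGL; the enum values are written out from the OpenGL ES 2.0 headers):

```javascript
// GL format enums, copied from the OpenGL ES 2.0 headers.
const GL_LUMINANCE       = 0x1909;
const GL_LUMINANCE_ALPHA = 0x190A;
const GL_RGB             = 0x1907;
const GL_RGBA            = 0x1908;

// Hypothetical mapping from a PNG's channel count to the tightest
// matching texture format: 1 = grey, 2 = grey+alpha, 3 = RGB, 4 = RGBA.
function formatForChannels(channels) {
  switch (channels) {
    case 1: return GL_LUMINANCE;
    case 2: return GL_LUMINANCE_ALPHA;
    case 3: return GL_RGB;
    case 4: return GL_RGBA;
    default: throw new Error("unsupported channel count: " + channels);
  }
}
```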

Also, if the conversion is automatic, then why do I have to provide both
an internalFormat and an externalFormat parameter?  Seems like it should
ignore the externalFormat and infer it from the file header.

Because we are following the OpenGL ES 2.0 spec. In a future version of OpenGL ES those parameters will be allowed to be different and OpenGL ES itself will be doing conversions between format and internal_format.

Anyway - in practice, this isn't working.  Setting them both to gl.RGBA
produces textures that are squashed up (like it's trying to read 4 bytes
per texel when there are only 3)...setting them both to gl.RGB produces
some other screwed up mess.  Is there some magic value I need to use to
tell it "Do this automatically"?

That sounds like a bug.

I've converted all the demos on the wiki and all the conformance tests. Most of them required changing
gl.texImage2D(target, level, img) to gl.texImage2D(target, level, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, img)
and that was it.

A few others required adding
gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);

But that was it. They all worked.  Hope we can figure out what issue is causing trouble for you.
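Put together, the converted sequence amounts to something like this sketch. `uploadLoadedImage` is a made-up name; in a page you would call it from image.onload with a real WebGL context - the recording stub below exists only so the sketch runs anywhere.

```javascript
// Hypothetical helper showing the converted call sequence: flip Y, then
// the new six-argument texImage2D signature.
function uploadLoadedImage(gl, image) {
  gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
}

// Call-recording stand-in for a WebGL context (enum values from the
// WebGL / OpenGL ES 2.0 headers).
const glCalls = [];
const stubGL = {
  TEXTURE_2D: 0x0DE1, RGBA: 0x1908, UNSIGNED_BYTE: 0x1401,
  UNPACK_FLIP_Y_WEBGL: 0x9240,
  pixelStorei: (...a) => glCalls.push(["pixelStorei"].concat(a)),
  texImage2D: (...a) => glCalls.push(["texImage2D"].concat(a)),
};
uploadLoadedImage(stubGL, {});  // {} stands in for a loaded image
```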

Bottom line: HELP!! What exactly do I type to get back the behavior I had?

Thanks in Advance...

 -- Steve