
Re: [Public WebGL] texImage2D changed...

----- Original Message -----
> I loaded up last night's Minefield-for-Windows 32 bit build (20100809
> Win 4.0b4pre) - and it crashed on loading my WebGL app (which had been
> working just fine 30 seconds earlier on a nightly build from a few
> days ago!) :-(
> In desperation I tried turning *off* the webgl.shader_validator

Ah, OK: a crash in the shader validator. You wouldn't be the only one hitting that:

> - and doing that got me a decent error message - which was that this
> line of JavaScript code:
> gl.texImage2D( gl.TEXTURE_2D, 0, image, true ) ;

So this is a completely separate, unrelated issue. The crash you mentioned above was Firefox's fault; this, on the other hand, is because your code is still using the old texImage2D API.
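For reference, here is a minimal sketch of the change (the wrapper function name is my own; only the gl calls come from the WebGL API):

```javascript
// Old form, no longer accepted:
//   gl.texImage2D(gl.TEXTURE_2D, 0, image, true);
//
// New form: internalformat, format, and type are explicit, and the old
// flipY boolean has moved to a pixel-store parameter.
function uploadImageTexture(gl, image) {
  gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
}
```

The image argument can be an Image, canvas, or video element; the implementation converts its pixels to the requested format for you.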

> ...has too few parameters. I can't imagine that the validator would
> know or care about that - so probably the validator is crashing for
> some unrelated reason.


> Anyway - I guess that form of texImage2D got obsoleted somehow.

Yes. Did you read this e-mail that Vlad recently sent to this list?

> I looked at the WebGL spec and some of the working example code to
> see how we load images from URL's nowadays...and as a quick test, I
> tried this:
> gl.texImage2D ( gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA,
> gl.UNSIGNED_BYTE, image);
> That gets my program running again...but now my textures are all
> screwed up.
> I'm using PNG and my textures are a mix of RGB's and RGBA's - with
> the occasional monochrome PNG tossed in for good measure.

OK, so my guess would be that it's because we in Firefox currently only support RGB and RGBA with 8 bits per channel. Can you try this in Chromium? If it works there, that would confirm it. We are looking at reusing the nice reusable code that Ken wrote there for Chromium.

> So presumably I need to set the two format parameters according to
> what's actually in the PNG file.

No no, you don't need to. First of all, as in OpenGL ES, the format and internalformat parameters are required to be equal, so yes, they are redundant.

If your image is an actual Image object or an HTML element, then you really don't have to specify the format it's in. The format parameter passed to texImage2D only specifies the format in which you want the data passed to the GL; the WebGL implementation does the conversion for you.

The only reason you'd have to care about the actual format of your image data is if you're passing that data as a plain buffer/array.

(Disclaimer: I could be wrong; anyone here who knows better, please correct me.)
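To illustrate the raw-buffer case (the helper below is hypothetical; only the texImage2D signature comes from WebGL): when uploading from a plain array there is no image metadata, so you must describe the buffer's actual layout yourself.

```javascript
// Hypothetical helper: build a solid-color texture from a raw RGBA buffer.
// With a raw buffer WebGL cannot auto-detect anything, so the format/type
// arguments must match how the bytes are actually packed.
function uploadSolidColor(gl, width, height, rgba) {
  const pixels = new Uint8Array(width * height * 4);
  for (let i = 0; i < width * height; i++) {
    pixels.set(rgba, i * 4); // tightly packed RGBA, one byte per channel
  }
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
                gl.RGBA, gl.UNSIGNED_BYTE, pixels);
  return pixels;
}
```

If the buffer held, say, 3-byte RGB pixels instead, you would have to pass gl.RGB for both format parameters, since nothing else tells the implementation how to read the bytes.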

> Am I missing something here? Is there still a way to get WebGL to
> auto-detect the format?

As explained above, it does (except of course if you just pass a plain buffer).
