
Re: [Public WebGL] texImage2D changed...

----- Original Message -----
> > On Mon, Aug 9, 2010 at 11:13 AM, <steve@sjbaker.org> wrote:
> <snip>
> >> I looked at the WebGL spec and some of the working example code to
> >> see how we load images from URLs nowadays... and as a quick test,
> >> I tried this:
> >>
> >>    gl.texImage2D ( gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA,
> >>                                      gl.UNSIGNED_BYTE, image);
> >>
> >> That gets my program running again... but now my textures are all
> >> screwed up. I'm using PNG and my textures are a mix of RGBs and
> >> RGBAs, with the occasional monochrome PNG tossed in for good
> >> measure. So presumably I need to set the two format parameters
> >> according to what's actually in the PNG file.
> >
> > No, you don't have to set those to match the PNG file. The WebGL
> > spec states that WebGL will convert the image to the format you
> > specify. The new API is far more flexible than the old one.
> Ah! Thank goodness for that! But the documentation is far from clear,
> and what I think it says to do doesn't work.
> The closest I could find to an explanation is:
> "The source image data is conceptually first converted
> to the data type and format specified by the format
> and type arguments, and then transferred to the OpenGL
> implementation."
> ...so I suppose it's saying:
> Actual File Format ==> externalFormat ==> internalFormat
> Which would suggest that if I unconditionally set both format
> parameters to gl.RGBA, then it should convert everything to 4 bytes
> per texel no matter whether my PNG has 1, 2, 3 or 4 bytes, presumably
> setting A=1 if the source image is an RGB-only PNG and spreading
> greyscale PNGs out into full RGBA. If so, that's kinda wasteful if
> the file is really a 1-byte luminance-only thing.
> But perhaps the word "conceptually" in the spec means that it's not
> going to allocate 4 bytes per texel, but merely arrange that when the
> shader reads it, it'll appear as if there were 4 bytes present. That
> would make sense... but it really ought to be clearer about what it'll
> actually do. If we're planning on making this work on itty-bitty
> cellphones, we can't afford to waste texture memory, so the spec
> needs to be really clear on what will happen.
> The wording should make it clear whether I can be lazy and always say
> gl.RGBA and rely on the underlying implementation not to waste texture
> memory, or whether I still have to parse the image file in order to
> ask for the internalFormat that's efficient for whatever file format
> I happen to have been handed by my art tools.
> Also, if the conversion is automatic, then why do I have to provide
> both an internalFormat and an externalFormat parameter? It seems like
> it should ignore the externalFormat and infer it from the file header.
> Anyway, in practice this isn't working. Setting them both to gl.RGBA
> produces textures that are squashed up (as if it's trying to read 4
> bytes per texel when there are only 3)... setting them both to gl.RGB
> produces some other screwed-up mess. Is there some magic value I need
> to use to tell it "do this automatically"?
> Bottom line: HELP!! What exactly do I type to get back the behavior
> I had yesterday?

As I said in my previous e-mail: could you first try this in Chromium, to make sure that your issues aren't simply caused by Firefox's current lack of support for certain texture formats? We should be there too within a couple of weeks.
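For what it's worth, here is a minimal sketch in plain JavaScript of what the spec's "conceptually first converted" wording means for an RGB source image uploaded with format gl.RGBA and type gl.UNSIGNED_BYTE: each 3-byte texel is expanded to 4 bytes, with alpha filled in as fully opaque. This is illustrative only, not part of the WebGL API, and it says nothing about how the driver actually stores the texture internally; the function name is my own invention.

```javascript
// Illustrative sketch (not WebGL API code): expand tightly-packed RGB
// texel data to RGBA, the way an implementation conceptually converts
// an RGB source image when the requested format is gl.RGBA.
function rgbToRgba(rgb) {
  const texels = rgb.length / 3;
  const rgba = new Uint8Array(texels * 4);
  for (let i = 0; i < texels; i++) {
    rgba[i * 4 + 0] = rgb[i * 3 + 0]; // R
    rgba[i * 4 + 1] = rgb[i * 3 + 1]; // G
    rgba[i * 4 + 2] = rgb[i * 3 + 2]; // B
    rgba[i * 4 + 3] = 255;            // A = 1.0 (fully opaque)
  }
  return rgba;
}

// Two RGB texels: pure red and pure green.
const rgb = new Uint8Array([255, 0, 0, 0, 255, 0]);
console.log(Array.from(rgbToRgba(rgb)));
// → [255, 0, 0, 255, 0, 255, 0, 255]
```

Whether the implementation then keeps 4 bytes per texel in memory, or picks a tighter internal representation, is exactly the question the spec wording leaves open.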


> Thanks in Advance...
> -- Steve
You are currently subscribed to public_webgl@khronos.org.
To unsubscribe, send an email to majordomo@khronos.org with
the following command in the body of your email: