Re: [Public WebGL] Weird texture problems.
I haven't tried painting it onto the screen directly - but I've mapped it
onto a simple cube with the simplest imaginable shader - and it still
doesn't come out right.
I'm away from my computer right now - but I'll put a simple demo of the
thing with some screen shots from the GeForce 6400 GO machine someplace.
I tried all sorts of things last night to try to trick it out of this
behavior - I resized the array to a more 'normal' 128x128 - I tried the
PNG with and without an alpha plane. I tried sampling the texture with
".rgb" rather than just ".rg" - but none of those things helped.
If implementations are allowed to do things to my texture behind my back
(like changing the color space, doing compression or something else) -
then this needs to be highlighted in the spec.
Modern shaders VERY frequently use textures for all sorts of things other
than Red/Green/Blue images glued onto polygons. In this case, I'm using
the texture to tell the shader how a second, atlased, texture is packed
(I do this so that random sub-map selection can make 100 identical
single-mesh models look like 100 totally different models with different
colors and textures on each). I'm deliberately only using the
three or four high-order bits from each band of the map just in case it's
loaded in 4/4/4/4 or 5/6/5 mode under the hood...but the errors I'm seeing
are vastly bigger than could be accounted for like that.
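To make the "high-order bits only" precaution concrete, here's a minimal sketch of how the lookup-table texels could be quantized before upload so they would survive a hypothetical 4/4/4/4 internal format. The function names (`quantizeToHighBits`, `buildLookupTexels`) are illustrative, not from my actual code:

```javascript
// Keep only the `bits` high-order bits of an 8-bit channel value,
// so a driver-side truncation to (say) 4 bits per channel is lossless.
function quantizeToHighBits(value, bits) {
  const mask = (0xFF << (8 - bits)) & 0xFF;
  return value & mask;
}

// texels: flat Uint8Array of RGBA data destined for texImage2D.
function buildLookupTexels(texels, bits) {
  return texels.map(function (v) { return quantizeToHighBits(v, bits); });
}
```

Even with that margin, the errors I'm seeing (e.g. 48 becoming 22) are far larger than any 4- or 5-bit truncation could produce.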
DXT compression would certainly explain it - but are there REALLY
situations where the underlying graphics system is allowed to mangle my
texture like that without me telling it to? Is there some kind of option
to force it not to do that? Is there a way for me to query the system to
see whether it's going to do that?
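For what it's worth, one way to rule out filtering and mipmapping (though not any driver-side recompression) is to upload the table with the most conservative settings possible. A sketch, assuming a plain uncompressed RGBA8 lookup texture - the function name is illustrative:

```javascript
// Upload a lookup-table texture with no mipmaps, NEAREST filtering and
// clamped wrap, so the sampler must return texels verbatim.
function uploadLookupTexture(gl, texture, width, height, rgbaData) {
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1);             // rows tightly packed
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
                gl.RGBA, gl.UNSIGNED_BYTE, rgbaData); // uncompressed RGBA8
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
}
```

If the colors still come out wrong with these settings, filtering and MIPmapping are off the suspect list.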
> DXT sounds plausible. Does the texture look like itself if you just paint
> it to the screen? -T
> On Jun 29, 2010 7:53 PM, "Steve Baker" <firstname.lastname@example.org> wrote:
>> I'm testing my WebGL app on some ancient hardware - probing to see where
>> the compatibility envelope is - and I'm getting some really strange
>> texture mapping errors on simple RGB textures on an nVidia GeForce GO
>> 6400 with WinXP and Minefield.
>> The texture is a 16x1024 texels .PNG - (ultimately, it's a lookup table
>> for another calculation - but it goes wrong even when I don't use it
>> that way). Because this is really a lookup table, I'm only reading the
>> ".rg" components of the map and only using one axis of the texture - so
>> in the minimal test case, my shader is something like:
>> gl_FragColor = vec4( texture2D( myMap, vec2( 0.5, texCoord.y ) ).rg, 0.0, 1.0 );
>> If the '.b' component of the map is all zeroes, the colors come out
>> perfectly...but if I put non-zero data in blue, the red and green go
>> nuts...nothing like the data I put in (48,51,128)==>(22,65,128) and
>> (16,64,0)==>(0,79,0) ! It doesn't seem to be an addressing issue
>> because the colors I'm getting don't have the value of any of the texels
>> in the original map...and it doesn't look like a MIPmapping issue either
>> because the colors that I see are nowhere near the average of my texels.
>> The program/shader/texture works great on all manner of other hardware,
>> OSes, etc.
>> Any ideas? Is it possible that some kind of lossy compression might be
>> happening under the hood? I've seen this kind of thing with DXT
>> compression before...but this is uncompressed .PNG.
>> -- Steve.
>> You are currently subscribed to email@example.com.
>> To unsubscribe, send an email to firstname.lastname@example.org with
>> the following command in the body of your email: