
[Public WebGL] Weird texture problems.



I'm testing my WebGL app on some ancient hardware - probing to see where
the compatibility envelope is - and I'm getting some really strange
texture mapping errors on simple RGB textures on an nVidia GeForce Go
6400 with WinXP and Minefield.

The texture is a 16x1024-texel .PNG (ultimately, it's a lookup table
for another calculation - but it goes wrong even when I don't use it
that way).  Because it's really a lookup table, I'm only reading the
".rg" components of the map and only using one axis of the texture - so
in the minimal test case, my shader does something like:

   gl_FragColor = vec4(texture2D(myMap, vec2(0.5, texCoord.y)).rg,
                       0.0, 1.0);
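
The JavaScript side is just the standard uncompressed upload path -
roughly this, where 'gl', 'tex' and 'img' are the obvious placeholders
for the context, the texture object and the decoded image:

   var tex = gl.createTexture();
   gl.bindTexture(gl.TEXTURE_2D, tex);
   // Upload the decoded 16x1024 .PNG as plain 8-bit RGB:
   gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, gl.RGB, gl.UNSIGNED_BYTE, img);
   // Lookup-table usage - sample exact texels, no wrapping:
   gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
   gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
   gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
   gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);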

If the '.b' component of the map is all zeroes, the colors come out
perfectly...but if I put non-zero data in blue, the red and green go
nuts - nothing like the data I put in: (48,51,128)==>(22,65,128) and
(16,64,0)==>(0,79,0)!  It doesn't seem to be an addressing issue,
because the colors I'm getting don't match the value of any texel in
the original map...and it doesn't look like a MIPmapping issue either,
because the colors I see are nowhere near the average of my texels.
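
(The numbers above are easy to check: draw a quad textured with the
map and read single pixels straight back - a sketch, assuming 'gl' is
the usual WebGL context and the quad has just been drawn:

   // Read back one pixel of the rendered quad and compare it against
   // the table entry it should have sampled (pick x,y to land on the
   // texel of interest):
   var px = new Uint8Array(4);
   gl.readPixels(10, 10, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, px);
   console.log(px[0], px[1], px[2]);  // e.g. hoping for 48,51,128

)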

The program/shader/texture works great on all manner of other hardware,
OSes, etc.

Any ideas?  Is it possible that some kind of lossy compression is
happening under the hood?  I've seen this kind of thing with DXT
compression before...but this is an uncompressed .PNG.
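
One experiment that should separate the driver from the browser's PNG
decoding: upload the same table from a raw Uint8Array, so that no image
decoder sits between the data and texImage2D - a sketch:

   // The 16x1024 RGB table uploaded as raw bytes - no PNG involved:
   var data = new Uint8Array(16 * 1024 * 3);
   // ...fill data[i*3 + 0..2] with each table entry's r,g,b...
   gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1);  // safe for 3-byte texels
   gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, 16, 1024, 0,
                 gl.RGB, gl.UNSIGNED_BYTE, data);

If the raw-bytes version shows the same corruption, the browser's image
pipeline is off the hook and it's the driver's texture storage.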

  -- Steve.
