Possible error in paletted texture spec



mike260
07-23-2004, 07:14 PM
Hello,

Apologies if I'm missing something, but I don't get table 3.17.1 in this spec. Shouldn't the PALETTE8_xxx diagram show 4 x 8-bit texels rather than 8 x 4-bit?

Also, I humbly submit that showing the packing of 8-bit texels into a 32-bit word is misleading in any case: if I packed a word as shown and stored it to memory, wouldn't the result depend on CPU endianness? Why does 32-bit packing come into it at all?
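
To illustrate what I mean, here's a little C snippet (assuming the diagram's layout, with texel 0 packed into the most-significant byte of the word):

#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    /* Four 8-bit palette indices packed into one 32-bit word,
       texel 0 in the most-significant byte as per the diagram. */
    uint32_t word = (0x11u << 24) | (0x22u << 16) | (0x33u << 8) | 0x44u;

    /* Look at the bytes actually written to memory. */
    uint8_t mem[4];
    memcpy(mem, &word, sizeof word);
    printf("%02x %02x %02x %02x\n", mem[0], mem[1], mem[2], mem[3]);

    /* Prints "44 33 22 11" on a little-endian CPU and "11 22 33 44"
       on a big-endian one, so the byte order the GL sees depends on
       the host unless the spec pins it down. */
    return 0;
}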

hmwill
07-25-2004, 03:58 PM
This extension has a couple of problems. Besides the issues you raise, the reference implementation aligns each row of index data to a byte boundary when 4-bit indices are used, and I have not been able to derive this alignment from the specification. Furthermore, the requirement that the palette formats be accepted as internal formats contradicts the main specification, which states that only base internal formats with implementation-defined internal representation need to be supported. I have asked the review board for clarification on the latter point.
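
To make the alignment issue concrete, here is a rough sketch of the two possible size calculations for 4-bit indices; the 5x4 image at the end is just a made-up example:

#include <stddef.h>

/* Index data size for a PALETTE4_xxx image, 4 bits per texel. */

/* Reading the spec literally: indices are packed with no padding,
   so the total is simply ceil(w*h / 2) bytes. */
size_t packed_size(size_t w, size_t h)
{
    return (w * h + 1) / 2;
}

/* What the reference implementation appears to do: round each row
   up to a whole byte before the next row starts. */
size_t row_aligned_size(size_t w, size_t h)
{
    return ((w + 1) / 2) * h;
}

/* For a 5x4 image, packed_size gives 10 bytes while
   row_aligned_size gives 12 bytes, hence the disagreement. */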

- HM

mike260
07-27-2004, 08:48 AM
Is the reference implementation GLESonGL?

hmwill
07-27-2004, 02:12 PM
It's here: Downloads (http://www.khronos.org/developers/code.html)

Not sure if it's exactly the same package, since it goes by a different name.

- HM

mike260
07-28-2004, 05:11 AM
Yep, that's the one I was looking at.

As long as we're sticking the knife in, I should point out that GLESonGL's glCompressedTexImage2D handles both level>0 and level<0 incorrectly: level>0 is accepted when it should generate an error, whereas level<0 (which the extension uses to request a full set of mipmap levels) just generates an error and fails.
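
If I'm reading the extension right, the calls ought to behave roughly like this; the 64x64 size, the function name and total_size are just made-up examples, not anything taken from the spec:

#include <GLES/gl.h>
#include <GLES/glext.h>

/* Upload a 64x64 PALETTE8_RGBA8_OES texture. The data blob starts with
   the 256-entry RGBA8 palette (1024 bytes), followed by the 8-bit
   indices for each supplied level. */
void upload_paletted_example(const GLvoid *data, GLsizei total_size)
{
    /* Variant 1: a single mip level, so level is 0 and imageSize covers
       the 1024-byte palette plus 64*64 one-byte indices. */
    glCompressedTexImage2D(GL_TEXTURE_2D, 0, GL_PALETTE8_RGBA8_OES,
                           64, 64, 0, 1024 + 64 * 64, data);

    /* Variant 2: a full mipmap chain, 64x64 down to 1x1 (levels 0..6).
       The level argument is -6 and the blob holds the single palette
       followed by the indices for all seven levels. */
    glCompressedTexImage2D(GL_TEXTURE_2D, -6, GL_PALETTE8_RGBA8_OES,
                           64, 64, 0, total_size, data);

    /* Any level greater than zero should simply raise GL_INVALID_VALUE,
       rather than being accepted as GLESonGL does now. */
}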

Anyway, I just submitted all this stuff to the GLESonGL bug-reporting address.

mike260
03-16-2005, 02:34 PM
Bump.
The reference implementation is still broken.

And I still think that defining texture layout in memory as endian-dependent is a bit weird.