Is it correct that the best precision readPixels offers in WebGL is GL_UNSIGNED_BYTE, i.e. 8 bits per color channel?

I use readPixels to find the surface point of an object: I render the object with its object-space coordinates encoded as colors (the leftmost X is dark red, the rightmost X is bright red, Y is encoded in green and Z in blue). Then I read back the RGB color and compute the surface point from it.

But with 8 bits there are only 256 distinct positions per axis, even though gl_FragColor uses a float per channel.
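To illustrate the precision loss: the sketch below simulates what happens to a coordinate that is written as a float to gl_FragColor but read back through an 8-bit channel (the function name and the [0, 1] axis range are assumptions for illustration):

```javascript
// One 8-bit channel can only represent 256 levels, so every coordinate
// snaps to the nearest 1/255 step on its way through the framebuffer.
function quantize8(coord) {
  const byte = Math.round(coord * 255); // the value stored in the pixel
  return byte / 255;                    // the value recovered by readPixels
}

// A coordinate like 0.1234 comes back as 31/255 ≈ 0.12157,
// so the round-trip error can be as large as 1/510 ≈ 0.002.
console.log(quantize8(0.1234));
```

That error of up to ~0.2% of the axis range is what makes a single 8-bit channel too coarse for precise surface points.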

As far as I can tell, the only alternatives are the packed short formats, which squeeze all channels into 16 bits total, so they are even worse per channel. Are higher-precision read formats simply not implemented yet?

So do I have to combine the three 8-bit channels into a 24-bit value (about 16 million distinct positions per axis) and render the object once per axis, i.e. three times in total?
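The combining step described above can be sketched as follows. This is a minimal illustration of splitting a coordinate into three bytes (high, mid, low) that a shader could write to the R, G and B channels of one pass, and of reassembling them after readPixels; the function names are made up for this example:

```javascript
// Encode a coordinate in [0, 1) as a 24-bit integer split into three bytes,
// e.g. to be written to gl_FragColor.rgb in one render pass per axis.
function encode24(coord) {
  const v = Math.min(Math.floor(coord * 0x1000000), 0xFFFFFF);
  return [(v >> 16) & 0xFF, (v >> 8) & 0xFF, v & 0xFF]; // high, mid, low byte
}

// Reassemble the three bytes read back by readPixels into the coordinate.
function decode24(bytes) {
  const v = (bytes[0] << 16) | (bytes[1] << 8) | bytes[2];
  return v / 0x1000000; // ~16.7 million steps per axis instead of 256
}

const bytes = encode24(0.123456);
console.log(decode24(bytes)); // recovers the coordinate to within 1/16777216
```

With this scheme one render pass delivers 24 bits for a single axis, so three passes (one per axis) cover X, Y and Z.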

Sorry about my English.