
[Public WebGL] Why have readpixels support 5_6_5?

OpenGL ES 2.0 specifically only allows readPixels to work with 2 formats:

1: RGBA / UNSIGNED_BYTE

2: An implementation-defined RGB or RGBA format

I'm just guessing, but the only reason for the second format is that it's the "no conversion" path. In other words, if the hardware is rendering to 4_4_4_4, the second format on that implementation will be RGBA / UNSIGNED_SHORT_4_4_4_4, since no conversion means no extra memory and a fast path.
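For what it's worth, that implementation-defined pair is queryable at runtime through IMPLEMENTATION_COLOR_READ_FORMAT / IMPLEMENTATION_COLOR_READ_TYPE. A minimal sketch (the `getFastReadFormat` helper name is mine; `gl` is assumed to be a rendering context, or anything exposing `getParameter` and those two enums):

```javascript
// Query the implementation-defined "no conversion" readPixels format/type pair.
// `gl` is assumed to expose getParameter and the IMPLEMENTATION_COLOR_READ_* enums.
function getFastReadFormat(gl) {
  return {
    format: gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_FORMAT),
    type: gl.getParameter(gl.IMPLEMENTATION_COLOR_READ_TYPE),
  };
}
```

An app that wants the fast path would compare the result against the format it intends to pass to readPixels and fall back to RGBA / UNSIGNED_BYTE otherwise.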

WebGL, on the other hand, is currently specified as requiring the second format to be RGB / UNSIGNED_SHORT_5_6_5, so that the second format is consistent across browsers.

The question I have is: why even support a second format in WebGL? If the only reason for the existence of a second format is to avoid conversion and gain speed, then requiring a specific format in WebGL defeats that goal. A WebGL implementation will have to query whether 5_6_5 is supported and, if not, do the conversion, removing all of the speed and memory benefits.
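To make the cost concrete, here is what that per-pixel conversion looks like when the hardware reads back something else (say 8-bit RGBA) and the implementation has to repack it as 5_6_5; the function name and the bit-replication expansion shown for the reverse direction are my own sketch, not anything from the spec:

```javascript
// Expand one RGB565 pixel into 8-bit RGBA components, replicating the
// high-order bits into the low-order bits so 0x1f maps to 255, not 248.
function rgb565ToRgba8888(pixel) {
  const r5 = (pixel >> 11) & 0x1f;
  const g6 = (pixel >> 5) & 0x3f;
  const b5 = pixel & 0x1f;
  return [
    (r5 << 3) | (r5 >> 2),
    (g6 << 2) | (g6 >> 4),
    (b5 << 3) | (b5 >> 2),
    255, // 5_6_5 has no alpha channel
  ];
}
```

Whichever direction the conversion runs, it touches every pixel and needs a scratch buffer, which is exactly the work the implementation-defined format exists to avoid.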

With that benefit removed, why not just change the WebGL spec to only support RGBA / UNSIGNED_BYTE as a readPixels format?