About supporting the shader binary format on PC emulation



danielkim
02-24-2010, 12:14 AM
Hi.
When I call glGetBooleanv(GL_NUM_SHADER_BINARY_FORMATS, &numBinaryFormats), the implementation returns 0 in numBinaryFormats on my PC using the PowerVR library. Does it not support a shader binary format on the PC?

Also, do you know of a bulletin board or website with lively discussion and information sharing on OpenGL ES 2.0 topics?

danielkim
02-24-2010, 02:40 AM
glGetBooleanv(GL_NUM_SHADER_BINARY_FORMATS, &numBinaryFormats)
I misspelled the function in my post above; glGetIntegerv() is the correct call.
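For reference, a minimal sketch of the corrected query (assuming a current OpenGL ES 2.0 context has already been created, e.g. via EGL; the helper name is just illustrative):

#include &lt;GLES2/gl2.h&gt;

/* Returns how many shader binary formats the implementation exposes;
   0 means glShaderBinary cannot be used and source shaders are needed. */
GLint get_num_shader_binary_formats(void)
{
    GLint numBinaryFormats = 0;
    glGetIntegerv(GL_NUM_SHADER_BINARY_FORMATS, &numBinaryFormats);
    return numBinaryFormats;
}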

Xmas
03-01-2010, 06:10 AM
Hi.
When I call glGetBooleanv(GL_NUM_SHADER_BINARY_FORMATS, &numBinaryFormats), the implementation returns 0 in numBinaryFormats on my PC using the PowerVR library. Does it not support a shader binary format on the PC?
No, it doesn't. You'll have to use source shaders on PC emulation.
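
For anyone hitting the same limitation, a minimal sketch of the source-shader path (assuming a current OpenGL ES 2.0 context; the helper name and log buffer size are just illustrative):

#include &lt;GLES2/gl2.h&gt;
#include &lt;stdio.h&gt;

/* Compiles a shader from GLSL source, the portable path when
   GL_NUM_SHADER_BINARY_FORMATS reports 0 (as on PC emulation).
   Returns the shader object, or 0 on failure. */
GLuint compile_source_shader(GLenum type, const char *source)
{
    GLuint shader = glCreateShader(type);
    glShaderSource(shader, 1, &source, NULL);
    glCompileShader(shader);

    GLint compiled = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &compiled);
    if (!compiled) {
        char log[1024];
        glGetShaderInfoLog(shader, sizeof(log), NULL, log);
        fprintf(stderr, "Shader compile failed: %s\n", log);
        glDeleteShader(shader);
        return 0;
    }
    return shader;
}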