
Re: [Public WebGL] Why does the set of pname arguments for which the behavior of getShaderParameter is defined not match GL ES?





On Tue, Apr 17, 2012 at 9:20 PM, Boris Zbarsky <bzbarsky@mit.edu> wrote:

On 4/18/12 12:16 AM, Glenn Maynard wrote:
It could be specified in each function that takes an enum argument, e.g.
replace the getShaderParameter definition with something like:

"If /pname/ is not present in the following table, generate
GL_INVALID_ENUM.  Otherwise, return the value for /pname/ given
/shader/, using the specified type."

I think this would be most user-friendly, yes.
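
For concreteness, a minimal sketch (not from the thread) of what the proposed
wording would mean for a caller, assuming an existing WebGLRenderingContext
named gl; the specific invalid pname value below is hypothetical:

    var shader = gl.createShader(gl.VERTEX_SHADER);

    // DELETE_STATUS is in the table, so this returns a boolean.
    var deleted = gl.getShaderParameter(shader, gl.DELETE_STATUS);

    // 0x1234 is a hypothetical pname that is not in the table, so under the
    // proposed wording the call generates INVALID_ENUM and (per the usual
    // WebGL error convention) returns null.
    var bogus = gl.getShaderParameter(shader, 0x1234);
    console.assert(gl.getError() === gl.INVALID_ENUM);
    console.assert(bogus === null);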


Not exactly: to test this fully, you'd need to test every value which
isn't explicitly supported.

Hmm... Yeah, fair.  Good catch on extensions.


It's probably reasonable to test every unsupported [0, 0xFFFF] value,
though, for every function that takes an enum.
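
For scale, a sketch of the exhaustive check being suggested, assuming a WebGL
context gl and assuming only the three core shader pnames are valid
(extensions could legitimately add more):

    var validPnames = [gl.SHADER_TYPE, gl.DELETE_STATUS, gl.COMPILE_STATUS];
    var shader = gl.createShader(gl.VERTEX_SHADER);
    for (var pname = 0; pname <= 0xFFFF; ++pname) {
      if (validPnames.indexOf(pname) !== -1)
        continue;
      gl.getShaderParameter(shader, pname);
      if (gl.getError() !== gl.INVALID_ENUM)
        console.error("no INVALID_ENUM for pname 0x" + pname.toString(16));
    }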

Um, no, this would not be reasonable IMO. I think you'll find that some browsers have a lot of per-error overhead, from logging errors to help developers find them to propagating them through multiple subsystems, and that checking 65536 values will be far too slow, especially for every function that takes an enum. Some functions take 2-5 enums, and testing every combination would be slower still.
 

Yeah, probably.  65000 is just not that big a number nowadays.  ;)


-Boris
