
Re: [Public WebGL] Why does the set of pname arguments for which the behavior of getShaderParameter is defined not match GL ES?



On Tue, Apr 17, 2012 at 10:56 PM, Boris Zbarsky <bzbarsky@mit.edu> wrote:
Constant values not mentioned in the spec should always be invalid

Saying that in a normative statement somewhere would cover this case, yes.  Makes for slightly confusing reading, of course, given the spec's "defer to ES" general structure: you have to know that you're supposed to defer to ES except in these various cases which are defined somewhere far far away from the definition of the actual method you're implementing....

It could be specified in each function that takes an enum argument, e.g. by replacing the getShaderParameter definition with something like:

"If pname is not present in the following table, generate GL_INVALID_ENUM.  Otherwise, return the value for pname given shader, using the specified type."

This makes it explicit that only the listed enums are supported, puts that requirement where it'll be seen (in the function definition), and is brief enough that this language can be used in all functions like this.

(That language is a little awkward, and can probably be improved a bit.)
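
To make the intent concrete, here's a rough sketch (not proposed spec text) of how that rule would look from script.  Assume gl is a WebGLRenderingContext and shader is a valid, compiled WebGLShader:

  // A pname in the table behaves as it does today.
  var compiled = gl.getShaderParameter(shader, gl.COMPILE_STATUS);  // boolean
  console.log(gl.getError() === gl.NO_ERROR);       // true

  // A pname ES defines but the table omits, e.g. SHADER_SOURCE_LENGTH (0x8B88),
  // would generate INVALID_ENUM under the proposed wording.  (Returning null
  // here is my assumption; the language above only specifies the error.)
  var SHADER_SOURCE_LENGTH = 0x8B88;  // raw ES constant; not on the WebGL context
  var result = gl.getShaderParameter(shader, SHADER_SOURCE_LENGTH);
  console.log(gl.getError() === gl.INVALID_ENUM);   // true, if we adopt this language
  console.log(result === null);                     // assumed, not specified above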

It's probably worth having tests to make sure the typical underlying
constants for those intentionally-unsupported enums are actually
disallowed, at least.

Yes, like every normative requirement.

Not exactly: to test this fully, you'd need to test every value which isn't explicitly supported.  (For example, ES extensions might add enums to these tables beyond those in the core spec.  With WebGL, that is only permitted if the extension is explicitly enabled with getExtension.)  That would make for unreasonably expensive tests.
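
(As a concrete case of that last point, hedged since it's my reading of the intent rather than quoted spec text: FRAGMENT_SHADER_DERIVATIVE_HINT_OES from OES_standard_derivatives should be rejected until the extension is enabled.  Assuming gl is a WebGLRenderingContext:

  // 0x8B8B is the FRAGMENT_SHADER_DERIVATIVE_HINT_OES constant from
  // OES_standard_derivatives; it isn't on the core WebGL context.
  var FRAGMENT_SHADER_DERIVATIVE_HINT_OES = 0x8B8B;

  gl.getParameter(FRAGMENT_SHADER_DERIVATIVE_HINT_OES);
  console.log(gl.getError() === gl.INVALID_ENUM);   // expected before getExtension

  var ext = gl.getExtension("OES_standard_derivatives");
  if (ext) {
    gl.getParameter(FRAGMENT_SHADER_DERIVATIVE_HINT_OES);
    console.log(gl.getError() === gl.NO_ERROR);     // expected once enabled
  }
)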

It's probably reasonable, though, to test every unsupported value in [0, 0xFFFF] for every function that takes an enum.
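
A sketch of what such a test might look like for getShaderParameter (the names and structure are illustrative, not taken from the conformance suite; it assumes gl, a valid shader, and that no extensions have been enabled):

  // pnames listed in the WebGL spec's getShaderParameter table.
  var validPnames = {};
  validPnames[gl.SHADER_TYPE] = true;
  validPnames[gl.DELETE_STATUS] = true;
  validPnames[gl.COMPILE_STATUS] = true;

  for (var pname = 0; pname <= 0xFFFF; ++pname) {
    if (validPnames[pname])
      continue;  // supported pnames are covered by other tests
    gl.getShaderParameter(shader, pname);
    var err = gl.getError();
    if (err !== gl.INVALID_ENUM)
      console.log("pname 0x" + pname.toString(16) +
                  " did not generate INVALID_ENUM (got " + err + ")");
  }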

--
Glenn Maynard