
Re: [Public WebGL] specifying the allowed character range for extension strings



On Fri, Jun 1, 2012 at 7:05 AM, Benoit Jacob <bjacob@mozilla.com> wrote:
I can't see this mentioned anywhere in the spec, but it seems clear that extension strings must match

[a-zA-Z0-9_]+

right? In particular they must be in the 7-bit ASCII range.

Not really. You should probably recommend that they be, either non-normatively or as a SHOULD, but I don't think there's any need to mandate it, since extensions are only added by implementations anyway.
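For example (just an illustrative sketch, not proposed spec text, and the helper name is made up), a SHOULD-level check in TypeScript could look like:

  // Hypothetical helper (name made up): checks that an extension name
  // stays within the recommended [a-zA-Z0-9_]+ subset of 7-bit ASCII.
  function isRecommendedExtensionName(name: string): boolean {
    return /^[A-Za-z0-9_]+$/.test(name);
  }

  console.log(isRecommendedExtensionName("OES_texture_float")); // true
  console.log(isRecommendedExtensionName("bad\u00A0name"));     // false
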
Do you agree that this should be mentioned in the spec in 5.14.14 "Detecting and enabling extensions"?

What should be the behavior of getExtension in the presence of illegal characters: generate INVALID_VALUE in addition to returning null?

Just return under the same conditions as any other extension string with no match. There's no point in adding another error path.
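As an illustration only (the extension table and function name here are made up, and the matching is simplified to exact equality; the case-insensitive comparison itself is sketched below):

  // Hypothetical registry of the extensions an implementation supports.
  const supportedExtensions = new Map<string, object>([
    ["OES_texture_float", {}],
  ]);

  // Any name with no match -- whatever characters it contains -- just
  // yields null; no INVALID_VALUE or other GL error is generated.
  function getExtensionSketch(name: string): object | null {
    return supportedExtensions.get(name) ?? null;
  }

  console.log(getExtensionSketch("OES_texture_float") !== null); // true
  console.log(getExtensionSketch("bogus\u00A0name"));            // null
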

Notice that this isn't just an abstract issue: getExtension has to do a case-insensitive comparison, which is a nontrivial operation on this input, so it is important to limit the allowed range of that input.

That's not needed. The usual approach is to not restrict the characters allowed, and define the comparison as being done in an "ASCII case-insensitive manner" (http://www.whatwg.org/specs/web-apps/current-work/#ascii-case-insensitive). This allows specs to avoid having to worry about restricting the valid range, while still giving basic case-insensitivity.
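To illustrate, a rough TypeScript sketch of such a comparison (not spec text): only the 26 ASCII letters are folded, and every other code point must match exactly.

  // Sketch of an "ASCII case-insensitive" comparison in the WHATWG sense:
  // only A-Z/a-z are folded; everything else, including anything outside
  // 7-bit ASCII, must match exactly.
  function asciiCaseInsensitiveEquals(a: string, b: string): boolean {
    if (a.length !== b.length) return false;
    for (let i = 0; i < a.length; i++) {
      let x = a.charCodeAt(i);
      let y = b.charCodeAt(i);
      if (x >= 0x41 && x <= 0x5a) x += 0x20; // fold A-Z to a-z
      if (y >= 0x41 && y <= 0x5a) y += 0x20;
      if (x !== y) return false;
    }
    return true;
  }

  console.log(asciiCaseInsensitiveEquals("OES_texture_float",
                                         "oes_TEXTURE_float")); // true
  console.log(asciiCaseInsensitiveEquals("K", "\u212A"));       // false: U+212A is not folded

A name containing code points outside ASCII simply never compares equal to a supported extension name, so it falls into the normal "no match" path with no extra handling.
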

--
Glenn Maynard