[Public WebGL] Invalid string parameters in WebGL APIs accepting DOMStrings
- To: firstname.lastname@example.org
- Subject: [Public WebGL] Invalid string parameters in WebGL APIs accepting DOMStrings
- From: James Robinson <email@example.com>
- Date: Fri, 10 Dec 2010 17:44:27 -0800
- List-id: Public WebGL Mailing List <public_webgl.khronos.org>
- Sender: firstname.lastname@example.org
A number of WebGL APIs accept string parameters as DOMString: getExtension(), bindAttribLocation(), getAttribLocation(), getUniformLocation(), and shaderSource(). Several of these functions have corresponding getters, either directly (such as shaderSource() / getShaderSource()) or indirectly (a uniform name is specified via shaderSource() and later queried by getUniformLocation()). Unfortunately, WebGL does not specify any encoding or character set requirements for these functions, which leads to inconsistencies in current implementations and to some more serious hypothetical concerns.
The OpenGL ES Shading Language specification, version 1.0.17, defines the source character set for shaders as a subset of ASCII and defines a source string as a sequence of characters from this set (http://www.khronos.org/registry/gles/specs/2.0/GLSL_ES_Specification_1.0.17.pdf, sections 3.1 and 3.2). Current WebGL implementations do not appear to strictly enforce this character set. In Minefield 4.0b8pre and Chrome 9.0.597.10 on Linux I can successfully compile a shader whose comments contain characters that are in ASCII but outside the allowed set, in the extended ASCII range, in Unicode but not in ASCII, and unpaired surrogates. The shader source round-trips inconsistently, though: in Minefield the source string does not round-trip losslessly through shaderSource()/getShaderSource() if it contains characters outside ASCII, although ASCII characters outside the set allowed by OpenGL ES do round-trip. Additionally, at least some of this validation appears to be performed by the underlying GL implementation rather than by the WebGL bindings layer, so the behavior may vary depending on the exact driver implementation.
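For illustration, each of the problematic categories can be constructed as a perfectly legal JavaScript DOMString. The specific characters below are examples I picked, not the ones from my tests:

```javascript
// ASCII, but outside the GLSL ES allowed set ('$' and '@' are ASCII
// characters that GLSL ES section 3.1 does not permit in source):
const asciiOutsideSet = '// $ and @ are ASCII but not in the GLSL ES set';

// Extended "ASCII" (Latin-1, code point above 0x7F):
const extendedAscii = '// caf\u00e9'; // U+00E9

// Unicode well outside ASCII:
const nonAsciiUnicode = '// \u2603'; // U+2603 SNOWMAN

// An unpaired surrogate: a valid DOMString (a sequence of UTF-16 code
// units) that is not a well-formed Unicode string at all:
const loneSurrogate = '// \uD800';

// Highest code point in a string -- shows which range each example hits.
const maxCodePoint = s => Math.max(...Array.from(s, c => c.codePointAt(0)));
```

All four strings can be passed to shaderSource() today; nothing in the WebIDL layer rejects them.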
I propose that we adopt OpenGL ES's allowed character set definition and generate an INVALID_VALUE error if a specified DOMString contains any character outside that set, both to ensure consistent behavior and to be extra sure that we never pass data to lower-level systems that they may not expect. Proper Unicode handling is tricky business and can often lead to subtle bugs. For example, if code confuses a string's character count with its byte count, memory access errors can be introduced that testing with ASCII-only input will never expose. Some operations on Unicode strings are lossy, mapping multiple inputs to the same output, which could confuse validation logic. And some drivers may simply not expect to receive non-ASCII data at all.
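A minimal sketch of the proposed check in JavaScript. The character class is my reading of GLSL ES 1.0.17 section 3.1 (letters, digits, underscore, the listed punctuation, '#', and whitespace); the function name is made up, and a real implementation would live in the bindings layer and generate INVALID_VALUE itself:

```javascript
// Characters permitted in GLSL ES 1.0 shader source per spec section 3.1.
// Anything outside this class (including all non-ASCII characters and
// unpaired surrogates) would cause the call to fail with INVALID_VALUE.
const GLSL_ES_SOURCE_CHARS =
  /^[A-Za-z0-9_.+\-\/*%<>[\](){}^|&~=!:;,?# \t\v\f\r\n]*$/;

// Returns true if the DOMString may be passed through to the GL layer.
function isValidGLSLSource(source) {
  return GLSL_ES_SOURCE_CHARS.test(source);
}
```

A WebGL implementation following the proposal would then do something like `if (!isValidGLSLSource(src)) { /* generate INVALID_VALUE, ignore call */ }` before forwarding the string, rather than relying on the driver to reject it.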