Re: [Public WebGL] Invalid string parameters in WebGL APIs accepting DOMStrings
- To: Oliver Hunt <oliver@apple.com>
- Subject: Re: [Public WebGL] Invalid string parameters in WebGL APIs accepting DOMStrings
- From: Kenneth Russell <kbr@google.com>
- Date: Mon, 13 Dec 2010 12:55:17 -0800
- Cc: James Robinson <jamesr@google.com>, public_webgl@khronos.org
- In-reply-to: <F692BF7D-91CC-4C5A-BC17-DCD1C725E55F@apple.com>
- List-id: Public WebGL Mailing List <public_webgl.khronos.org>
- References: <AANLkTi=9TpFcX0S5K2M6mMPeEH_x49MXSGf8tfrB=5YU@mail.gmail.com> <F692BF7D-91CC-4C5A-BC17-DCD1C725E55F@apple.com>
- Sender: owner-public_webgl@khronos.org
It does seem that this is more of an implementation bug than something
that needs to be added to the spec. See in particular
https://cvs.khronos.org/svn/repos/registry/trunk/public/webgl/doc/spec/WebGL-spec.html#SUPPORTED_GLSL_CONSTRUCTS
. I've filed https://bugs.webkit.org/show_bug.cgi?id=50929 on the
WebKit side to track this. We could add an explicit note to the spec
if desired, for example in Section 2.
-Ken
On Fri, Dec 10, 2010 at 5:59 PM, Oliver Hunt <oliver@apple.com> wrote:
> This seems sensible, although it's probably actually disallowed by the spec
> already. The WebGL spec does mandate strict adherence to the GLSL ES 1.0
> spec, which disallows those characters, so it seems that this is actually
> just a bug in the shader validation.
> --Oliver
> On Dec 10, 2010, at 5:44 PM, James Robinson wrote:
>
> Greetings,
> A number of WebGL APIs accept string parameters as DOMString:
> getExtension(), bindAttribLocation(), getAttribLocation(), getUniformLocation(),
> and shaderSource(). Several of these functions have corresponding getters
> either directly (such as shaderSource() / getShaderSource()) or indirectly
> (a uniform name is specified by shaderSource() and then queried later by
> getUniformLocation()). Unfortunately, WebGL does not specify any encoding
> or charset requirements for these functions which leads to some
> inconsistencies in current implementations and some more serious
> hypothetical concerns.
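> For concreteness, a minimal sketch of the indirect round trip (here "gl",
> "vertexShader" and "program" are assumed to be an existing context, shader
> object and linked program):
>
>     var vsSource =
>         "attribute vec4 pos; uniform vec4 tint;\n" +
>         "void main() { gl_Position = pos * tint; }";
>     // The uniform name "tint" enters WebGL inside the shaderSource() DOMString...
>     gl.shaderSource(vertexShader, vsSource);
>     // ...and is later looked up through another DOMString parameter, so the
>     // two strings have to agree on how their characters are interpreted.
>     var tintLocation = gl.getUniformLocation(program, "tint");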
> The GLSL ES 1.0 specification, revision 17, defines the source character set
> for shaders as a subset of ASCII and defines the source string as a sequence
> of characters from this set
> (http://www.khronos.org/registry/gles/specs/2.0/GLSL_ES_Specification_1.0.17.pdf,
> sections 3.1 and 3.2). Current WebGL implementations do not appear to
> strictly enforce this character set. In Minefield 4.0b8pre and
> Chrome 9.0.597.10 on Linux I can successfully compile a shader whose comments
> contain characters from the ASCII set but outside the allowed range, from the
> extended ASCII set, from Unicode but not ASCII, and unmatched surrogates.
> The shader source round-trips inconsistently, though: in Minefield the
> shader source string does not round-trip through
> shaderSource()/getShaderSource() losslessly if the source string contains
> characters outside of ASCII, although it does round-trip ASCII characters
> that are not in the set allowed by GLSL ES. Additionally, it appears that
> at least some of this validation is being performed by the underlying GL
> implementation, not by the WebGL bindings layer, so the behavior might vary
> depending on the exact driver implementation.
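> As a quick illustration of the behavior described above (a sketch, assuming
> "gl" is a WebGLRenderingContext; the comment contains U+00E9, which is
> outside the GLSL ES source character set):
>
>     var source =
>         "// comment with a disallowed character: \u00e9\n" +
>         "precision mediump float;\n" +
>         "void main() { gl_FragColor = vec4(1.0); }\n";
>     var shader = gl.createShader(gl.FRAGMENT_SHADER);
>     gl.shaderSource(shader, source);
>     gl.compileShader(shader);
>     // Reports true today in the implementations mentioned above.
>     console.log(gl.getShaderParameter(shader, gl.COMPILE_STATUS));
>     // May be false, depending on how the implementation stores the string.
>     console.log(gl.getShaderSource(shader) === source);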
> I propose that we adopt GLSL ES's allowed character set definition and
> generate an INVALID_VALUE error if a specified DOMString contains any
> characters outside the set to ensure consistent behavior and to be extra
> sure that we don't pass data to lower-level systems that they may not
> expect. Proper unicode handling is tricky business and can often lead to
> subtle bugs. For example, if code confuses the character count of a string
> with the number of bytes in the string memory access errors can be
> introduced that will not be exposed by testing that uses only ASCII
> characters. Some operations on unicode strings are lossy and map multiple
> inputs to the same output which could confuse validation logic. Some
> drivers may simply not expect to receive non-ASCII data at all.
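> A rough sketch of the check this proposal implies is below. The character
> set is my reading of GLSL ES section 3.1 (printable ASCII minus $ " ' ` @ \
> plus the listed whitespace) and should be verified against the spec text
> before anything like it lands in an implementation:
>
>     function containsOnlyGLSLCharacters(str) {
>       for (var i = 0; i < str.length; ++i) {
>         var c = str.charCodeAt(i);
>         // Horizontal tab, line feed, vertical tab, form feed, carriage return.
>         if (c >= 9 && c <= 13)
>           continue;
>         // Printable ASCII, excluding the characters GLSL ES leaves out.
>         if (c >= 32 && c <= 126 &&
>             "$\"'`@\\".indexOf(String.fromCharCode(c)) === -1)
>           continue;
>         return false;
>       }
>       return true;
>     }
>     // shaderSource(), bindAttribLocation(), getUniformLocation(), etc. would
>     // then generate INVALID_VALUE when this returns false for an argument.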
> For reference, DOMString is defined by WebIDL to map closely to the
> JavaScript string type: http://www.w3.org/TR/WebIDL/#idl-DOMString. A
> DOMString is composed of a sequence of unsigned 16-bit numbers that
> typically represent a UTF-16 encoded string. However, DOMString does not
> require that the contents be a valid UTF-16 string and it's quite possible
> to construct strings in JavaScript that are not valid UTF-16. For example,
> the snippet "a𐅃c".substring(0,2) will produce a string containing an
> unmatched surrogate. This is significant in that it makes some
> safe-seeming transformations (like converting a DOMString to UTF-8)
> non-invertible.
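> To make that concrete, a small sketch of the lone-surrogate case (the escape
> \uD800\uDD43 is simply the UTF-16 form of the character in the snippet above;
> encodeURIComponent is used only because it performs a UTF-8 conversion
> internally, not because WebGL involves it):
>
>     var s = "a\uD800\uDD43c".substring(0, 2);  // "a" plus a lone high surrogate
>     console.log(s.length);                     // 2
>     console.log(s.charCodeAt(1).toString(16)); // "d800" -- no trailing surrogate
>     // A UTF-8 converter has no valid encoding for a lone surrogate: it must
>     // either reject the string (encodeURIComponent throws URIError here) or
>     // substitute U+FFFD, in which case the conversion is not invertible.
>     try { encodeURIComponent(s); } catch (e) { console.log(e.name); } // "URIError"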
> - James
>