
Re: [Public WebGL] WebGL vs Khronos GLSL compiler: illegal chars in inactive #if / #endif blocks



Thanks John for the clarification.

Floh, if it isn't too much trouble for you to eliminate the invalid
characters, then it would be easiest to leave the current, tested
behavior as is. It isn't trivial for WebGL implementations to run the
preprocessor separately from shader compilation and thereby eliminate
invalid characters sitting in #if blocks that are discarded during
preprocessing.
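
For illustration, shader source like the sketch below is rejected by
shaderSource() even though the '$' sits in a block the preprocessor would
discard (the shader body is made up for the example, and gl is assumed to
be an existing WebGLRenderingContext):

    // The '$' only appears inside an inactive #if block, but WebGL
    // validates the raw source string, so shaderSource() still fails
    // (Chrome/Safari report INVALID_VALUE: "string not ASCII").
    const src = [
      "#if 0",
      "$placeholder$", // never compiled, but still checked
      "#endif",
      "precision mediump float;",
      "void main() { gl_FragColor = vec4(1.0); }",
    ].join("\n");

    const shader = gl.createShader(gl.FRAGMENT_SHADER);
    gl.shaderSource(shader, src); // generates INVALID_VALUE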

-Ken


On Tue, Jun 16, 2015 at 8:26 AM, Floh <floooh@gmail.com> wrote:
>
> Thanks for clearing that up! Since I'm doing my own preprocessing on
> the shader sources during the build process, I think I will simply add
> a check for invalid characters there (in addition to invoking the
> Khronos reference compiler, which is really doing a wonderful job of
> catching errors for the different GLSL versions).
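>
> A minimal sketch of such a check (the allowed set below is one reading
> of the GLSL ES 1.0 section 3.1 character table, and the function name
> is arbitrary):
>
>   // Characters permitted by GLSL ES 1.00, section 3.1: identifier
>   // characters, digits, the operator/punctuation symbols, '#' for the
>   // preprocessor, and white space.
>   const GLSL_ES_CHAR = /[A-Za-z0-9_.+\-\/*%<>\[\](){}^|&~=!:;,?# \t\v\f\r\n]/;
>
>   function checkShaderChars(name: string, src: string): void {
>     for (let i = 0; i < src.length; i++) {
>       const ch = src[i];
>       if (!GLSL_ES_CHAR.test(ch)) {
>         // Report the exact offset, which the browser messages don't.
>         throw new Error(name + ": illegal character '" + ch + "' (code " +
>                         src.charCodeAt(i) + ") at offset " + i);
>       }
>     }
>   }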
>
> Cheers,
> -Floh.
>
> On Tue, Jun 16, 2015 at 1:42 AM, John Kessenich <cepheus@frii.com> wrote:
>> The early specs say it in terms of "...character set used for the OpenGL ES
>> shading languages...".
>>
>> The ambiguity here is that the "shading language" is arguably what's left
>> after preprocessing, not before. It doesn't say "$" is disallowed in
>> shaders, only that it is not a valid character in the shading language,
>> which is arguably a different language from the C++ preprocessor language.
>>
>> ES 3.1 resolved this ambiguity (as did desktop) by saying
>>
>> "The source character set used for the OpenGL ES shading languages is
>> Unicode in the UTF-8 encoding scheme. Invalid UTF-8 characters are ignored.
>> After preprocessing, only the following characters are allowed in the
>> resulting stream of GLSL tokens:..."
>>
>> That still leaves the practical question of how existing drivers
>> interpreted the early specs, but it's clear what the long-term view is.
>>
>> JohnK
>>
>>
>> On 6/15/2015 2:56 PM, Kenneth Russell wrote:
>>
>> First, sorry about the poor error message -- for Chrome at least,
>> please feel free to file a bug with a test case about it. It should at
>> least state that the string doesn't conform to the ASCII subset
>> allowed by GLSL ES.
>>
>> The restriction is correct, and comes from section 3.1 "Character Set"
>> in the GLSL ES 1.0.17 spec. "$" isn't a legal character in any GLSL ES
>> shader, regardless of whether it's in an #if block that's compiled
>> out. WebGL inherits this behavior from GLSL ES. I agree that
>> https://www.khronos.org/registry/webgl/specs/1.0/#6.19 could be
>> clearer. We don't want to unnecessarily replicate the table from the
>> GLSL ES spec, but perhaps there could be a more explicit mention of
>> the GLSL ES rules.
>>
>> -Ken
>>
>>
>>
>> On Sat, Jun 13, 2015 at 3:47 AM, Floh <floooh@gmail.com> wrote:
>>
>> Hi,
>>
>> I just noticed that the WebGL compilers in at least Firefox, Chrome
>> and Safari on OSX reject shader source code when #if/#endif blocks
>> that are not taken contain illegal characters (in my case a '$'),
>> while the Khronos reference compiler doesn't complain about this. Now
>> I know that WebGL might do additional shader validation that the
>> Khronos compiler doesn't know or care about, but when looking for
>> similar bugs in the FF and Chrome bug trackers I found that a similar
>> problem existed a few years ago with illegal characters in comments.
>>
>> My question is whether shader source in inactive #if/#endif blocks
>> should be treated like a comment and excluded from the validation
>> (for instance, the code in inactive #if/#endif blocks could be meant
>> for other platforms if the shader source was generated and contains
>> conditionally compiled code for multiple platforms). Basically: is the
>> current behaviour an implementation bug (which I'm happy to write
>> tickets for), or does it 'work as intended'?
>>
>> The WebGL spec is not very clear here; it only mentions comments and
>> non-ASCII characters (but '$' is clearly a valid ASCII character):
>>
>> https://www.khronos.org/registry/webgl/specs/1.0/#6.19
>>
>> Also, the error reporting could be better. Chrome and Safari just say
>> "WebGL: INVALID_VALUE: shaderSource: string not ASCII" in the JS
>> console; Firefox at least tells me the ASCII code of the character I
>> should look for: "Error: WebGL: shaderSource: String contains the
>> illegal character '36'". But neither tells me the location in the
>> shader source code where the problem occurs.
>>
>> Cheers,
>> -Floh.
>>

-----------------------------------------------------------
You are currently subscribed to public_webgl@khronos.org.
To unsubscribe, send an email to majordomo@khronos.org with
the following command in the body of your email:
unsubscribe public_webgl
-----------------------------------------------------------