
Re: [Public WebGL] GL_RENDERER string needed for performant apps



On Tue, Jan 14, 2014 at 2:19 PM, Boris Zbarsky <bzbarsky@mit.edu> wrote:

Wait.  The whole point of the discussion I've seen so far is that people want to use GL_RENDERER to decide whether to run particular code or somewhat different code or not even try WebGL at all.  Not because the API is different but because the implementation in the graphics hardware is different in a way they care about (performance mostly, sounds like).  Am I just completely misunderstanding the proposal?  If not, then I don't understand your argument here...

I agree with Boris on this point, and I'm not entirely sure I understand the argument you're making, James.

The whole point of exposing this string is so that developers can make informed decisions about which rendering paths to take. Any given feature (including WebGL as a whole) may fall into one of several tiers of support: not supported at all, barely supported, or supported well. Browsers already manage the difference between supported and not supported; this feature allows developers to choose how to handle the difference between well supported and badly supported.
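To illustrate the kind of tiering decision being described (a hypothetical sketch, not part of any proposal): assuming the renderer string were available to the page, a developer might bucket it into tiers with a small classifier like the one below. The substrings and tier names here are invented examples, not a real blocklist; the commented lines show how the string could be obtained via the WEBGL_debug_renderer_info extension.

```javascript
// Classify a GL_RENDERER string into a coarse support tier.
// The substrings below are hypothetical examples only.
function rendererTier(renderer) {
  const r = renderer.toLowerCase();
  if (r.includes('swiftshader') || r.includes('software')) {
    return 'barely-supported'; // software rasterizer: use a simpler path
  }
  if (r.includes('intel gma')) {
    return 'barely-supported'; // example of known-slow hardware
  }
  return 'well-supported';     // default: take the full rendering path
}

// In a browser, the string could come from the debug extension:
//   const ext = gl.getExtension('WEBGL_debug_renderer_info');
//   const renderer = gl.getParameter(ext.UNMASKED_RENDERER_WEBGL);

console.log(rendererTier('Google SwiftShader'));     // barely-supported
console.log(rendererTier('NVIDIA GeForce GTX 680')); // well-supported
```

The point of the tiering is that the app still runs everywhere WebGL is supported; the string only changes which path it takes.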