Re: [Public WebGL] about the VENDOR, RENDERER, and VERSION strings



On Thu, Dec 2, 2010 at 12:53 AM, Vladimir Vukicevic
<vladimir@mozilla.com> wrote:
>
> ----- Original Message -----
>> The caps could be affected by driver and OS as well. Also Steve makes
>> a good point about extensions.
>>
>> Coming back to the RENDERER string question for Firefox -- for the
>> purposes of invading privacy, bits obtained via gl.get and gl
>> extensions are just as good as bits obtained from getString(RENDERER),
>> and in practice will be highly redundant with the bits from RENDERER.
>> On the other hand, the bits from RENDERER have extra value to WebGL
>> authors and users, because they will help characterize performance,
>> and allow workarounds for bugs in certain implementations.
>>
>> Unless FF plans on crippling gl.get and gl extensions, hiding the
>> RENDERER info is just punishing the WebGL community, for no privacy
>> benefit.
>
> That's pretty harsh -- there's certainly no desire to punish the webgl community.  Instead, the goal is to try to remove roadblocks towards webgl adoption and future compatibility.  Privacy is a big consideration; these are definitely more fingerprint inputs.  But, as you point out, unless we block/sanitize/whatever gl.get results, blocking the renderer string doesn't help much.  I don't know that this is necessarily true -- for example, there is a huge range of video cards whose max texture size is 2048 (or 4096 or 8192).  The same holds for many of the other get params.  That's not an argument that these aren't exposing something that can be used as input for fingerprinting, but potentially less than the aggregate get params.  The security exploit targeting argument also exists, but it's nothing more than a speedbump.  As has been said, the exploit can be attempted against all systems -- unless, perhaps, on systems without the flaw the exploit results in a visible effect (hang, crash, etc.), but is silent on systems where it does succeed.  In that case, it would be worth considerably more if it were able to remain undetected for longer.
>
> However, Mozilla's current decision is mainly based around painful experience with the browser user agent string, which we are now only very slowly able to claw back under control.  If every app developer on the web were a good actor and implemented things correctly, there'd be no problem; we could expose lots of details that would let people fine tune their apps.  Unfortunately, that's not the case.  For example -- given the string "Firefox/3.5" in the UA, we've had web sites break when we bumped the version, because they checked for Firefox by matching on "Firefox/3.5" explicitly.  We've had sites break because they were sniffing for "Firefox" (and not the Gecko renderer) when accessed via nightly/beta builds, purely due to the UA string.
>
> Given the extremely wide range of renderers out there, it seems to me that this issue will be horribly compounded.  You'd have webgl apps that sniff for nvidia, and ignore a qualcomm or powervr gpu that might benefit from the same optimization.  Someone might sniff for a specific AMD card to work around a bug, even if that bug has been fixed in a newer version of the drivers -- and the fix happens to break the app's workaround.  And on and on.  That, IMO, is a much worse situation.
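
To make the earlier gl.get point concrete, here is roughly what "the
aggregate get params" looks like from a page's point of view.  This is
only a rough sketch -- the exact set of queries is up to the page, and
"experimental-webgl" is just the context name current implementations
answer to:

  var canvas = document.createElement("canvas");
  var gl = canvas.getContext("experimental-webgl");
  var bits = gl && [
    gl.getParameter(gl.MAX_TEXTURE_SIZE),
    gl.getParameter(gl.MAX_CUBE_MAP_TEXTURE_SIZE),
    gl.getParameter(gl.MAX_VERTEX_ATTRIBS),
    gl.getParameter(gl.MAX_VERTEX_TEXTURE_IMAGE_UNITS),
    gl.getParameter(gl.MAX_VARYING_VECTORS),
    gl.getParameter(gl.MAX_FRAGMENT_UNIFORM_VECTORS),
    gl.getSupportedExtensions().join(",")
  ].join("|");  // one string of fairly distinguishing bits

Any one of those values narrows things down only a little; the
combination narrows them a lot, which is the point about the aggregate.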

I agree that userAgent has been and still is often misused.  It's a
real problem on the Internet.  But, all the moaning about the real
problems can obscure the fact that there are benefits to userAgent.
For example, I believe one spectacular win that userAgent has
contributed to is the success of Firefox and standards-based HTML.  In
hindsight it looks inevitable, but if mainstream sites didn't have an
easy way to implement fallbacks for users on IE6 etc, would things
have played out the same way?

> For performance considerations (e.g. the vertex shader texture fetch issue), apps can do some perf testing on first run.  That data can be cached in the user's browser, so it should have minimal impact.  If these issues become extremely common, we can look at adding a webgl extension that potentially tries to expose some of this information (in conjunction with a GL extension that would be needed to get the info in the first place).
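
Concretely, I read that suggestion as something like the following
(a hypothetical sketch -- the workload inside runBenchmark() would
really be a timing loop over representative draw calls, and the
localStorage key name is made up):

  // Stand-in for a real timing loop over representative draw calls.
  function runBenchmark() {
    var start = Date.now(), iterations = 0;
    while (Date.now() - start < 500)  // the part the user has to sit through
      iterations++;
    return iterations;
  }

  function getPerfScore() {
    var cached = window.localStorage.getItem("perfScore");
    if (cached !== null)
      return Number(cached);          // later visits: no benchmark
    var score = runBenchmark();       // first visit: the user waits here
    window.localStorage.setItem("perfScore", String(score));
    return score;
  }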

The problem with perf testing is that it is expensive (in user time)
to do it accurately.  I.e. the user has to sit there, not enjoying
your app, while you do it.  In my opinion one of the great promises of
WebGL is that 3D experiences can be launched very quickly.  Speed is
the name of the game on the web.  I personally would never make a new
user sit through even 500ms of benchmarking if I could reasonably
avoid it.  With no hints from RENDERER or similar, I would instead
probably start my app with minimal features, and adaptively ramp them
up until frame rate starts to suffer.  But that's not great for users
either.  It would be much better to do one table lookup to pick good
defaults, and then give the user some way to change the defaults if
they don't like them.
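
The kind of lookup I mean is nothing fancy.  A sketch, with the
substring matches and quality settings invented purely for
illustration (in WebGL the string comes back via
gl.getParameter(gl.RENDERER)):

  // Map RENDERER substrings to starting quality settings.  Entries and
  // setting names here are made up for illustration.
  var DEFAULTS = [
    { match: /GeForce/i,   settings: { shadows: true,  particles: 1000 } },
    { match: /Radeon HD/i, settings: { shadows: true,  particles: 1000 } },
    { match: /PowerVR/i,   settings: { shadows: false, particles: 200  } }
  ];

  function pickDefaults(gl) {
    var renderer = gl.getParameter(gl.RENDERER);
    for (var i = 0; i < DEFAULTS.length; i++) {
      if (DEFAULTS[i].match.test(renderer))
        return DEFAULTS[i].settings;
    }
    return { shadows: false, particles: 200 };  // conservative fallback
  }

A settings panel can then let the user bump things up or down from
whatever pickDefaults() chose.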

My other hot button is that I'm excited about the ability to
characterize the universe of WebGL implementations using webgl-bench
(or something like it).  What a huge boon for devs to be able to
figure out what hardware capabilities and speeds are loose in the
wild.  Great for users too, to be able to compare browsers and cards
and OS's.

> For bug workaround issues, there isn't necessarily an easy workaround, unless you can check for the bug at runtime.  If you can, great; problem solved.  If you can't for whatever reason, then perhaps that driver can't really support WebGL, and it should be blacklisted until the underlying bugs are fixed.

On the one hand I know what you mean -- for security purposes alone,
WebGL needs to be locked down enough that bugs are rare, and security
vulnerabilities fixed ASAP.  On the other hand, I know that in the
real world, bugs will happen.  If a bug isn't a true security
vulnerability, it won't (and shouldn't) warrant a blacklisting of
someone's hardware.  So we devs will be stuck needing to identify the
problematic hardware to do a workaround.  If you make it hard to
identify the hardware, we're going to have to spend effort on
alternatives (whether it's "please use a different browser", or a bag
of voodoo code-based diagnostics, or labor intensive troubleshooting).
 In many situations it will be the users who suffer, threatening the
reputation of WebGL.
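
For what it's worth, the "voodoo code-based diagnostics" usually take
the same shape: render something that exercises the suspect driver
path, read a pixel back, and compare it with what a correct
implementation would produce.  A sketch -- here the "rendering" is just
a clear, standing in for whatever draw call actually triggers the bug
being probed:

  function rendersCorrectly(gl) {
    gl.clearColor(1.0, 0.0, 0.0, 1.0);
    gl.clear(gl.COLOR_BUFFER_BIT);
    var pixel = new Uint8Array(4);
    gl.readPixels(0, 0, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, pixel);
    // A correct implementation gives back (255, 0, 0, 255) here;
    // anything else means we fall back to the workaround path.
    return pixel[0] === 255 && pixel[1] === 0 &&
           pixel[2] === 0 && pixel[3] === 255;
  }

That works when the bug has a visible, checkable symptom; when it
doesn't, we're back to needing to know what hardware we're on.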

> For support issues, Firefox has a wealth of data in about:support that could be used to figure out hardware details.  This can be accessed by the user via the menu -> Help -> Troubleshooting Information, from where they can click a button to copy all the info to the clipboard for pasting into a support request.

That's good, though automated access to that info would be way, way better.

> Having more detailed RENDERER etc. info to work around these problems starts going down a long train where the short-term fix might feel good, but it will be extremely painful to undo in the long run.  I'd rather not recreate another UA string, especially given the emphasis that has been placed on cross-browser WebGL compat.

Hopefully I'm making it clear that there are real costs to not having
it.  I understand if you weigh the costs and benefits differently, but
I want to be clear about what they are.

-T
