
Re: [Public WebGL] about the VENDOR, RENDERER, and VERSION strings



On Tue, Nov 30, 2010 at 5:33 AM, Benoit Jacob <bjacob@mozilla.com> wrote:
> *** Note: there's a concrete proposal below, search for ACCELERATION_STATUS in this email ***
>
> ----- Original Message -----
>> > ----- Original Message -----
>> >> Is there really any significant benefit in hiding the true
>> >> information?
>> >
>> > It's a matter of taste or a political question, but some people do
>> > care about anonymity and/or privacy and will frown if WebGL does
>> > poorly in this respect.
>>
>> If users or browser vendors feel that this divulges too much
>> information, they can quite simply put WebGL content under a per-site
>> whitelist as popups and cookies are now.
>
> Most users don't know what WebGL is so they can't make an informed decision. Even knowledgeable users can't tell beforehand if it's legitimate for a given page to try to use WebGL. Moreover, if WebGL becomes popular, then that will become very tedious for the user.

WebGL is "3-D rendered content". I will not use a production web
browser that doesn't let me turn WebGL on only for the pages I choose.
If I'm trying to get text content, why should a page be allowed to max
my GPU to 3-d render an ad? Especially on my phone, I do not want to
drain battery unless I am using the 3-d content interactively. WebGL
should never be popular enough that every page I go to will want to
use it. The use cases just aren't there. If an entire domain or app
wants to use WebGL, that's handled easily by the whitelist. Of course,
there's nothing stopping vendors from deciding they want to allow
wildcard whitelists as well if they think their users don't understand
"This web page would like access to your 3D renderer. Allow?"

Yes, WebGL presents a large attack surface. No, it doesn't have to
present this to every page.
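
A whitelist doesn't even need new API surface: when a page isn't
allowed to use WebGL, context creation simply fails, and pages can
already detect that. A minimal sketch, assuming the context names
current implementations use (the fallback function is hypothetical):

  // getContext() returns null when WebGL is unavailable or blocked,
  // so a page can fall back gracefully.
  var canvas = document.createElement("canvas");
  var gl = canvas.getContext("webgl") ||
           canvas.getContext("experimental-webgl");
  if (!gl) {
    showStaticFallbackContent(); // hypothetical fallback path
  }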

>>
>> >>
>> >> For application authors, there is immense value to be had from
>> >> being able to determine which card and drivers the user has - both
>> >> at run time (so the application can work around bugs)
>> >
>> > It seems to me that we can realistically aim for good enough WebGL
>> > implementations that this shouldn't be needed. I would be very
>> > interested in the list of graphics-card-specific things that you
>> > need to do on Minefield; I would try to handle these as bugs in our
>> > implementation.
>>
>> With at least 4 parties (webdev, browser vendor, card driver vendor,
>> operating system) involved in providing a seamless 3D web experience,
>> there will always be bugs and incompatibilities. Yes, the WebGL
>> implementation is the critical multi-platform abstraction that should
>> hide these issues, but, for pragmatic reasons, limiting the
>> information available to application developers is the wrong way to
>> enforce the abstraction.
>
> The reason I personally prefer not to expose graphics card information is not to enforce abstraction on web developers; it is to protect user privacy/anonymity.
>
> I realize that web developers can make use of the graphics card info to improve the user experience, but perhaps there are ways that we can give app developers what they need without giving the full graphics card info.
>
> Indeed, if app developers really need (that remains to be seen) a way to identify slow graphics cards, we could add a getParameter() pname that exposes this information. We shouldn't try to detail, for each feature, exactly whether it's fast or slow, because 1) that would be unmanageable for us and 2) that would end up giving away lots of bits of user information. But we could have a single integer getParameter() pname taking maybe 2, 3 or 4 possible values, giving a hint about slowness issues: maybe that would be good enough! E.g.
>
>   var slowness_info = webgl.getParameter(webgl.ACCELERATION_STATUS);
>   if (slowness_info === webgl.COMPLETE_ACCELERATION) {
>      // this is the case with recent cards
>      runGameWithFullGraphicsExperience();
>   } else if (slowness_info === webgl.INCOMPLETE_ACCELERATION) {
>      // this is the case with certain Intel chips
>      runGameWithAFewAdvancedFeaturesDisabled();
>   } else if (slowness_info === webgl.LIMITED_OR_NO_ACCELERATION) {
>      // software rendering
>      displayBigWarning();
>      runGameWithFullGraphicsExperience(); // we have nothing to lose
>   }
>
> Wouldn't that be good enough? Sure, that would not allow you to tweak the details of what you enable/disable as finely, but does that really matter? Also, letting browser implementers make the decision will probably result in better decisions overall (it is a lot of work to maintain the code that decides which cards are slow).
>
> This would only give away 1 or 2 bits of user-identifying info (a parameter with at most 4 possible values carries at most log2(4) = 2 bits), which is far less than the strings would give, so that could be a compromise if it's needed (which remains to be seen).
>
>> > There are better ways in which users can easily get such
>> > information. In Firefox's case, tell them to go to about:support,
>> > that will actually give you much more detailed info. The browser is
>> > aware of the system's details, it only doesn't expose this info to
>> > web content.
>>
>> A very small number of affected users will ever get that far.
>
> Here I was replying to a comment about people reporting issues. These people have already taken the step to report the issue, so they can be asked to go to about:support.
>
>> Will Adobe Molehill provide this diagnostic facility? Quite possibly.
>> Make no mistake: WebGL is in competition with Flash, Silverlight,
>> Unity3D and the rest when application developers decide what platform
>> to build for. I don't want to use these proprietary platforms, but if
>> they let my company actually diagnose problems in the field, they
>> will have a huge leg up.
>
> We'll never be as deeply integrated into the system as platforms that rely on the user to install binaries (such as Flash Player), even if we expose full information about the graphics card; but I don't necessarily see this as a weakness. It can be considered a strength, both in terms of putting the user in control of his web experience (privacy/anonymity is part of that) and in terms of having more perennial (less platform-dependent) content.

You may not see this as a weakness, but commercial game developers do.
Commercial game devs want as much information and control as possible.
If you make it unnecessarily difficult to diagnose and patch arbitrary
hardware/software combinations, Flash will continue to win. No one
wants to see that.
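
For the record, the diagnostic we're arguing about is three
getParameter() calls; VENDOR, RENDERER, and VERSION are standard
pnames, and the question is only whether they return the real driver
strings or masked ones. A sketch, assuming gl is a WebGL context:

  // Today these may return browser-masked values rather than the
  // actual vendor/card/driver strings.
  var vendor   = gl.getParameter(gl.VENDOR);
  var renderer = gl.getParameter(gl.RENDERER);
  var version  = gl.getParameter(gl.VERSION);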

If you are letting me run shaders on your GPU, you are letting me
exercise your graphics drivers. The trust implied here is many times
greater than the trust implied in reading card and driver information.
Allowing GPU access without providing a way to read the graphics
system information makes commercial implementors' maintenance jobs
much harder. Without access to these strings, implementors will resort
to timing, benchmarking, and other tricks to figure out just what
hardware is giving them trouble. This is a tax that only WebGL content
will pay. Adobe will do the easy thing and provide this information,
and Flash devs will not pay this tax.
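
To make that concrete, here is the kind of crude probe developers end
up writing. This is a sketch only: the workload and the use of
wall-clock time are illustrative, not a recommendation.

  // Time a batch of clears, then force a pipeline flush with
  // readPixels(), which blocks until the GPU has actually finished.
  function roughGpuProbe(gl) {
    var t0 = Date.now();
    for (var i = 0; i < 100; i++) {
      gl.clearColor(i / 100, 0, 0, 1);
      gl.clear(gl.COLOR_BUFFER_BIT);
    }
    var pixel = new Uint8Array(4);
    gl.readPixels(0, 0, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, pixel);
    return Date.now() - t0; // milliseconds; higher suggests slower
  }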

User de-anonymization will not be what makes or breaks WebGL. Not
providing access to the information necessary for commercial
deployment may very well see WebGL languish, and in 2015, when Flash
12 is standard and WebGL is dead, it will not matter that WebGL may
have prevented possible attackers from getting 6 more bits of
identifying information from users.

Access to my graphics drivers is a privilege. If I give you the
privilege to run custom shaders on my GPU, why wouldn't I tell you
what software and hardware is processing your shaders?

Best,

David Sheets
