Re: [Public WebGL] GL_RENDERER string needed for performant apps



On Tue, Jan 14, 2014 at 10:07 AM, Mark Callow <callow.mark@artspark.co.jp> wrote:
> Privacy was not the only concern.

> What is going to prevent this becoming another user-agent string fiasco where implementers have to start spoofing the string in order to get applications to run?


That's a good question, and the answer is probably a combination of community pressure, audience reach, and what we can learn from existing behavior.

First off, it's been the case for a while now that pages which artificially block access based on user-agent strings draw the ire of the web community pretty quickly. I would expect many WebGL apps (especially games) to draw a more technically aware audience, and as such the backlash would be even worse. I don't think it's a negative thing for the community to shame bad actors in this regard. It's also worth pointing out that it would be pretty trivial to write a browser extension that spoofs RENDERER/VENDOR with the page being none the wiser, which would make cases like this easy to spot.
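
As a rough sketch of how simple that spoofing would be (assuming the strings are exposed through the WEBGL_debug_renderer_info extension; the replacement values below are made up for illustration), a page-injected script only needs to wrap one function:

    var SPOOFED_VENDOR = 'NVIDIA Corporation';          // illustrative value
    var SPOOFED_RENDERER = 'GeForce GTX 680/PCIe/SSE2'; // illustrative value

    var realGetParameter = WebGLRenderingContext.prototype.getParameter;
    WebGLRenderingContext.prototype.getParameter = function(pname) {
      // Only intercept the unmasked strings; pass everything else through.
      var ext = this.getExtension('WEBGL_debug_renderer_info');
      if (ext && pname === ext.UNMASKED_VENDOR_WEBGL) { return SPOOFED_VENDOR; }
      if (ext && pname === ext.UNMASKED_RENDERER_WEBGL) { return SPOOFED_RENDERER; }
      return realGetParameter.call(this, pname);
    };

The page has no reliable way to tell it's being lied to, so anyone curious whether a block is artificial can test it in seconds.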

Imagine what the internet would do if the latest Starcraft ran only on Nvidia cards, but some quick driver hacks proved it ran just as well on AMD. Blizzard would be skewered by the online community, and rightly so. The web doesn't yet have its own Blizzard or Valve, but it's not hard to project this situation onto a hypothetical "Farmville 3", which would surely draw just as many (if not more) complaints.

That brings up the second point: audience reach. 3D applications are hard to build, and there's very little financial motivation for excluding part of your audience. Until just recently, in fact, one of the biggest concerns about WebGL I consistently heard was that audience reach was limited because IE and mobile Safari weren't on board. (Fingers still crossed on iOS...) I have a hard time imagining those same developers willingly denying any large portion of their audience the ability to use their product unless there are known issues that simply prevent the app from working correctly, in which case it's no worse than the blacklisting that browsers already do internally.

There is an argument to be made that hardware vendors may use this as a way to lock showcase apps to their own hardware, or to pay other developers to target their hardware. I could easily see Samsung artificially restricting the recently released Hello Racer demo to run only on Galaxy devices, for instance, or Nvidia putting out demos that block anything but a Tegra device. This happens on the desktop all the time, of course, and nobody complains. The big difference is that on the web it would be much easier to work around that kind of block and call out the developer for artificially restricting the audience. (See point 1.)

Finally, it's worth noting that you can already shut out a large chunk of your audience without these strings simply by requiring an extension (MRT, for example) that doesn't yet have widespread browser or device support. This leads to demo applications that can't run on a given device, but anyone who's trying to build something more than a simple effect demo and reach a large audience will either avoid the extensions or code fallback paths. Additionally, the renderer string is currently only available in Chrome and still requires the standard extension query. This means that authors must either resign themselves to building something that only works for a percentage of a percentage of the browser market (see point 2) or include a fallback. Given that, it's far more likely that authors will use the information to improve the experience where they can rather than to blacklist. That may change if access to the extension becomes ubiquitous, but I still feel the other motivations are strong enough to counteract most bad behavior in that case.
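
For what it's worth, the query-with-fallback pattern in question looks roughly like this (a minimal sketch; the helper name and returned structure are just illustrative):

    function describeGPU(gl) {
      var ext = gl.getExtension('WEBGL_debug_renderer_info');
      if (ext) {
        return {
          vendor: gl.getParameter(ext.UNMASKED_VENDOR_WEBGL),
          renderer: gl.getParameter(ext.UNMASKED_RENDERER_WEBGL)
        };
      }
      // Extension unavailable: fall back to the masked strings, which are
      // intentionally generic (e.g. "WebKit" / "WebKit WebGL").
      return {
        vendor: gl.getParameter(gl.VENDOR),
        renderer: gl.getParameter(gl.RENDERER)
      };
    }

An app that already has to handle the "no extension" branch has little reason to hard-fail on the strings it gets back from the other branch.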

--Brandon