On 15 Jan 2014, at 9:33 am, Florian Bösch <email@example.com> wrote:
WebGL is so low-level that a developer can definitely write their content to perform well given the information above. At the same time, that knowledge can be useful to someone writing any complex Web application, given some understanding of the implementation (most of which are open source). Yet we typically don’t expose that information.
And Google’s proposal really only described evaluating the final frame rate based on the card name. It didn’t mention doing anything in Maps code about the items listed above, other than disabling AA for some hardware. Maybe Jennifer could give more details?
If the state of the art in Google Maps is to query the GPU id and turn off either AA or WebGL entirely, then it seems the more important problem is that the user experience was degraded *before* they were able to measure performance. Since people don’t upgrade their computers very often, one way to improve this would be to use local storage to remember what the measured performance was. Then you’d only have to do a real test every so often. [NOTE: I’m completely aware how ridiculous it is for me to make suggestions like this - Google probably spent hundreds of engineer hours trying everything they could]
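To make the local-storage idea concrete, here’s a minimal sketch. Everything in it is hypothetical - the key name, the one-month freshness window, and the `runBenchmark` callback are my own placeholders, not anything Google Maps actually does:

```javascript
// Hypothetical sketch: cache a measured performance score so the
// expensive benchmark only runs occasionally. PERF_KEY, MAX_AGE_MS,
// and runBenchmark are illustrative names, not a real Maps API.
const PERF_KEY = 'gpuPerfScore';
const MAX_AGE_MS = 30 * 24 * 60 * 60 * 1000; // re-test roughly monthly

// Tiny in-memory fallback so the sketch also runs outside a browser.
const store = (typeof localStorage !== 'undefined')
  ? localStorage
  : {
      data: {},
      getItem(k) { return this.data[k] ?? null; },
      setItem(k, v) { this.data[k] = String(v); },
    };

function getPerfScore(runBenchmark) {
  const cached = store.getItem(PERF_KEY);
  if (cached) {
    const { score, when } = JSON.parse(cached);
    if (Date.now() - when < MAX_AGE_MS) {
      return score; // cached result is still fresh; skip the real test
    }
  }
  // Cache miss or stale: run the real measurement once, e.g. count
  // frames rendered in a fixed time budget, then remember the result.
  const score = runBenchmark();
  store.setItem(PERF_KEY, JSON.stringify({ score, when: Date.now() }));
  return score;
}
```

The point is just that the degraded first impression happens at most once per machine per freshness window, instead of on every page load.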
I’m not trying to make an argument one way or the other here (yet). I’m just asking the questions that will be asked of me if I attempt the very difficult task of exposing user information in Safari. I expect the first comment will be along the lines of “Hi, I noticed you just spent $10000 on a brand new Mac Pro!!”