
Re: [Public WebGL] GL_RENDERER string needed for performant apps

Let me know if there are other things you think would be useful after reading the doc, I'm happy to add more information where possible.


On Tue, Jan 14, 2014 at 1:03 PM, Dean Jackson <dino@apple.com> wrote:

On 15 Jan 2014, at 7:54 am, Brandon Jones <bajones@google.com> wrote:


Did you read the document linked in Jennifer's original post?

Oops! I didn’t! I’m sorry Jennifer!

It contains some examples of the numbers you are talking about. Specifically:

  • ~15% of WebGL sessions have a low performing GPU
    • A GPU was categorized as low-performing when its 70th-percentile framerate was below 30 FPS, meaning at least 70% of sessions on that GPU ran below 30 FPS
  • 80% of sessions with a poorly performing GPU are represented by relatively common GPUs
  • Sessions under 10 FPS are reduced by 55% by blacklisting
  • Sessions under 20 FPS are reduced by 33% by blacklisting

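The blacklisting these figures describe amounts to matching the renderer string against a list of known-slow GPUs and falling back to a non-WebGL code path on a match. A minimal sketch in JavaScript, with entirely made-up blocklist entries (the actual low-performing GPUs were deliberately not shared):

```javascript
// Hypothetical sketch of GPU blacklisting by GL_RENDERER string.
// These patterns are illustrative placeholders, not real data from
// the Maps experiment.
const BLOCKED_RENDERERS = [
  /Example Integrated GPU 9000/i,
  /Hypothetical Mobile GPU/i,
];

// Decide whether to use the WebGL path for this session.
function shouldUseWebGL(rendererString) {
  if (!rendererString) return true; // string unavailable: default to WebGL
  return !BLOCKED_RENDERERS.some((re) => re.test(rendererString));
}
```

In practice an application would pass the string obtained from WEBGL_debug_renderer_info and, on a match, fall back to a 2D-canvas or server-rendered mode rather than serving a sub-10-FPS WebGL session.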
We made a conscious decision not to share data about specific low-performing GPUs, but perhaps we can make some of that information available through private channels if you believe it will help make a stronger case within Apple?

I’ll start with what you shared.




On Tue, Jan 14, 2014 at 12:44 PM, Dean Jackson <dino@apple.com> wrote:
Hi Jennifer,

On 14 Jan 2014, at 10:39 am, Jennifer Maurer <maurerj@google.com> wrote:

The Google Maps and Chrome teams recently conducted an experiment where the GL_RENDERER string was made available via WebGL for 5% of users, so that the performance implications could be quantified. Using this data we have conclusively determined that access to the GL_RENDERER string is required in order to provide a good user experience in the new Maps product, which uses WebGL as its preferred rendering API.

In response, Google Chrome plans to enable access to the WEBGL_debug_renderer_info extension universally, allowing all WebGL applications to access this information. Ideally, we would like all WebGL implementers to follow suit, in order to provide the best user experience for the new Maps and other complex WebGL applications.
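For reference, an application queries the unmasked renderer string through the WEBGL_debug_renderer_info extension roughly as follows (a minimal sketch; the helper name is ours):

```javascript
// Sketch: read the unmasked GL_RENDERER string via WEBGL_debug_renderer_info.
// Returns null when the extension is unavailable or blocked by the browser.
function getRendererString(gl) {
  const ext = gl.getExtension('WEBGL_debug_renderer_info');
  if (!ext) return null;
  return gl.getParameter(ext.UNMASKED_RENDERER_WEBGL);
}
```

A typical result is a string such as "ANGLE (NVIDIA GeForce GTX 660)", which the application can then match against its own performance data per GPU.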

Apple takes a very strong position on user privacy, and is reluctant to expose any user information to third parties (in this case the web page/server). I expect any decision to enable these extensions in Safari will require a lot of internal discussion.

I’m not formally objecting to removing the security warnings, because I see the benefits of a developer knowing the hardware capabilities, but I ask the group to hold off making a decision on this for at least a couple of weeks while I get some people to look at the proposal.

One thing that would really help both Apple and the community would be some specific examples of how you used the GPU identification to improve performance, and to what degree it helped.

e.g. About 10% of Maps users were hitting XYZ bug, because they were using ABC GPU. This caused their system to hang. So we disabled MNO for ABC and found user sessions to last approximately 200% longer.

Without this, it’s hard to evaluate the benefit. One could infer from this message that WebGL itself is requiring too much from the popular hardware out there, and that the specification should be scaled back (obviously not what you’re suggesting).

(When I said “specific”, I meant the examples; the numbers themselves can be blurry. Also, I’m not asking for exhaustive examples, just a few.)



In addition, we are petitioning the WebGL working group to remove the security warnings from this and the WEBGL_debug_shaders extensions, as we believe that exposing this information is not as privacy-sensitive as once feared, and the benefits of making it available to real-world WebGL applications have been clearly demonstrated.

Please send feedback on this plan and the proposal for the WebGL extensions to the mailing list.

For further information, please see http://goo.gl/dxRdGS