I use it to detect and work around known bugs in drivers, such as Apple not supporting row-major matrices ( https://bugs.chromium.org/p/angleproject/issues/detail?id=2273 ), as well as to provide better error messages to users (e.g. when the renderer string reports "Google SwiftShader"). The immediate impact if this extension were removed would be that all Apple devices would fail to render.
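A minimal sketch of the query pattern described here: read the unmasked renderer string via WEBGL_debug_renderer_info (falling back to the masked RENDERER when the extension is unavailable), then match it against known software rasterizers. The detection list is an illustrative assumption, not an exhaustive one.

```javascript
// Query the unmasked renderer string. The extension may be absent,
// so fall back to the (possibly masked) RENDERER parameter.
function getUnmaskedRenderer(gl) {
  const ext = gl.getExtension('WEBGL_debug_renderer_info');
  return ext
    ? gl.getParameter(ext.UNMASKED_RENDERER_WEBGL)
    : gl.getParameter(gl.RENDERER);
}

// Flag known software renderers (patterns are illustrative, not complete).
function isSoftwareRenderer(renderer) {
  return /SwiftShader|llvmpipe|Microsoft Basic Render/i.test(renderer);
}
```

In a page this would typically run once at startup against the context returned by `canvas.getContext('webgl')`, with the result used to show a warning or pick a fallback code path.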
From: email@example.com <firstname.lastname@example.org> on behalf of Evan Nowak (email@example.com) <firstname.lastname@example.org>
Sent: Friday, May 3, 2019 7:56:27 AM
To: Public WebGL
Subject: Re: [Public WebGL] Uses of WEBGL_debug_renderer_info
At Onshape we use WEBGL_debug_renderer_info for a few purposes:

1. To work around driver-specific interactions with our system. More often than not, this involves disabling a feature of our renderer that exhibits poor performance on a particular platform/GPU combination. Where possible, we try to find an alternative solution that performs well on all platforms, but this is not always feasible.
2. To inform users when they may be experiencing reduced performance because an integrated GPU is being used instead of a discrete one. We have seen a significant number of users complain about rendering performance, only to discover later that they have a more powerful discrete GPU going unused. We try to shorten this pain period by detecting systems that may have a discrete GPU alongside an integrated one and notifying the user when degraded rendering performance is detected. We look forward to wider implementation of the “powerPreference” option, which will remove the guesswork about which GPU is actually running.
3. To collect statistics about the hardware makeup of our user base. This is helpful in understanding the spectrum of performance capabilities amongst our users.
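The kind of heuristic point 2 describes can be sketched as a classifier over the unmasked renderer string. The patterns below are assumptions for illustration only, not Onshape's actual detection list, and any real list would need ongoing maintenance as GPU naming changes.

```javascript
// Classify a renderer string so the app can warn when an integrated
// GPU (or a software rasterizer) is in use. Patterns are illustrative.
function classifyGpu(renderer) {
  if (/SwiftShader|llvmpipe|Microsoft Basic Render/i.test(renderer)) {
    return 'software';
  }
  if (/Intel.*(HD|UHD|Iris)/i.test(renderer)) {
    return 'integrated';
  }
  if (/NVIDIA|GeForce|Radeon|Quadro/i.test(renderer)) {
    return 'discrete';
  }
  return 'unknown';
}
```

Once “powerPreference” is widely implemented, much of this guesswork goes away: requesting the context with `canvas.getContext('webgl', { powerPreference: 'high-performance' })` asks the browser to select the discrete GPU directly.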