
Re: [Public WebGL] Is WebGL2 Effectively Dead?



On Wed, Mar 7, 2018 at 7:42 PM, Nicolas Capens <capn@google.com> wrote:
It's not a mystery. 36% of mobile devices running Android don't support OpenGL ES 3.0.

Since most people replace their mobile device every two years or so, it would be interesting to know why, more than 6 years after the introduction of the standard (7 if you count the first draft, 8 if you count the first planning phase), more than a third of handsets are still not compatible with that industry-wide standard.

 
 Similarly, the desktop and laptop GPU market has been dominated by integrated GPUs for the last decade, and a large chunk of them have issues that require blacklisting WebGL 2 support.

Similarly, it would be interesting to know why Intel has been selling garbage hardware en masse (and continues to do so). Surely getting the kinks out of its hardware before production and shipping should be a reasonable expectation, or at least managing to do so once in more than 10 years... They're not helping anybody's bottom line by shipping buggy garbage hardware: not their own, and not that of anybody who writes applications for it.


Need I remind anybody that this is not acceptable? Imagine, if you will, that there were 3 or 4 different brands of CPUs, each of which required application developers to write a different application for it, and each generation of each brand shipped with different bugs, forcing every application developer to tailor their application as best they can around the bugs of those CPUs, and, if they can't, to ruefully inform users of that hardware (with no idea which hardware they actually have) that, for reasons beyond their control, the application will not work on their machine.

Imagine if the Google Play Store, iOS App Store, Windows Store, etc. needed to check, for each application and each CPU, whether it could actually give you that application. Imagine the frustration users would have trying to obtain any application. Why exactly should this be acceptable for GPUs? What do you expect to gain from screwing things up so badly that writing applications for GPUs is basically an exercise in futility, in the year 2018?!

Surely you must realize that there's something very, very wrong here.
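
For concreteness, here is roughly what that "tailor around it or apologize" dance looks like in code today (a minimal sketch; the function name is mine and purely illustrative):

    // Minimal sketch of the detect-and-fall-back dance every WebGL app ends up
    // doing. The function name is illustrative, not from any library.
    function getBestContext(canvas: HTMLCanvasElement) {
      // Try WebGL 2 first; a blacklisted or unsupported GPU simply returns null.
      const gl2 = canvas.getContext('webgl2');
      if (gl2) return { gl: gl2, isWebGL2: true };

      // Fall back to WebGL 1, which ~98% of devices can do.
      const gl1 = canvas.getContext('webgl');
      if (gl1) return { gl: gl1, isWebGL2: false };

      // Neither worked: all that's left is to ruefully inform the user.
      return null;
    }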
 
In both cases support is slowly increasing at a rate between 1-3%/month (a little less on Windows, a bit more on Android). At these rates WebGL2 can be expected to reach WebGL1 levels of support (98%) around:
  • Windows: June 2019 to September 2020
  • Android: February 2019 to August 2019
That would actually be phenomenally soon. Since the issue is GPU hardware, and the prevalence of each generation follows an S-curve, we're going to have a long tail before ~98% of support is reached.

I'm aware that it's an S-curve, I'm just hoping that the part that takes us most of the way is reasonably flat.
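
For the record, the date ranges above are just a straight linear extrapolation from the current numbers; a quick sketch of that arithmetic (assuming Android currently sits around 64%, i.e. 100% minus the 36% cited at the top, and counting months from March 2018):

    // Back-of-the-envelope behind the Android dates above: linear extrapolation
    // only, ignoring the S-curve tail. Starting share of ~64% is 100% minus the
    // 36% quoted at the top; target is WebGL1-level support (98%).
    function monthsToTarget(currentPct: number, ratePctPerMonth: number, targetPct = 98): number {
      return Math.ceil((targetPct - currentPct) / ratePctPerMonth);
    }

    // Counting from March 2018:
    console.log(monthsToTarget(64, 3)); // ~12 months -> early 2019 at the faster rate
    console.log(monthsToTarget(64, 2)); // ~17 months -> mid/late 2019 at a slower rate

The Windows range follows the same arithmetic, just from a lower starting share and slower rate.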
 
That said, SwiftShader has recently reached the milestone of passing the dEQP test suite for OpenGL ES 3.0, running entirely on the CPU, and we're making quick progress on WebGL 2 conformance. So for the needs of casual games and educational content we could have broad support much sooner.

As nice as that is technically (and congratulations on that achievement), for application developers it's very nearly useless. Spanning a performance gap of nearly 100:1 between high-end desktops and low-end (but functioning) mobile GPUs is already pretty challenging. Spanning a performance gap of 1,000,000:1 or thereabouts is just about futile.
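
If SwiftShader-backed WebGL 2 does ship broadly, applications will presumably have to detect it and dial quality way down. One way to do that today is via WEBGL_debug_renderer_info (a sketch only; renderer strings are not standardized across browsers and drivers, so treat this as a heuristic):

    // Heuristic check for a software renderer (SwiftShader, llvmpipe, etc.) so
    // an application can reduce quality or warn the user. Best-effort only:
    // renderer strings vary between browsers and drivers.
    function isLikelySoftwareRenderer(gl: WebGLRenderingContext | WebGL2RenderingContext): boolean {
      const ext = gl.getExtension('WEBGL_debug_renderer_info');
      const renderer = ext
        ? (gl.getParameter(ext.UNMASKED_RENDERER_WEBGL) as string)
        : (gl.getParameter(gl.RENDERER) as string);
      return /swiftshader|software|llvmpipe/i.test(renderer);
    }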