I don’t know anything about why WebGL apparently needs to be blacklisted on a large chunk of integrated GPUs.
It's because of the myriad of exotic driver bugs that remain corner cases until your software runs into them. Since WebGL has an extensive conformance test suite, and since we set a high bar on the web for "things just working", a driver can get blacklisted over exotic bugs your application might never exercise (but somebody else's does). It's not an unreasonable approach to dealing with things that are fundamentally broken.
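As a practical aside, blacklisting is visible to applications at runtime: when WebGL is blocked or unsupported, `canvas.getContext("webgl")` returns `null`, so apps are expected to feature-detect and fall back. A minimal sketch (the fallback path and log messages are illustrative, not any particular app's code):

```javascript
// Try to obtain a WebGL context; returns null when WebGL is
// missing, disabled, or blacklisted for this GPU/driver combo.
function getWebGLContext() {
  // Outside a browser there is no DOM, hence no WebGL at all.
  if (typeof document === "undefined") return null;
  const canvas = document.createElement("canvas");
  // getContext() returns null rather than throwing when the
  // browser has blocklisted WebGL on this configuration.
  return (
    canvas.getContext("webgl") ||
    canvas.getContext("experimental-webgl")
  );
}

const gl = getWebGLContext();
if (gl === null) {
  // Hypothetical fallback: a 2D canvas or static-image path.
  console.log("WebGL unavailable (missing, disabled, or blacklisted)");
} else {
  console.log("WebGL available:", gl.getParameter(gl.VERSION));
}
```

The point is that the blacklist is not an error you can catch and work around; from the application's perspective the feature simply does not exist on that machine.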
The question, though, is why a 14-to-16-year-old, trimmed-down standard is the only thing you can even remotely describe as universally working. There are parallels to x86 here, but unlike x86, new machine features (and by "new" I mean features of machines over a decade old) are inaccessible through that standard. That's a pretty bad situation for application developers and GPU makers alike: GPU makers pour all that money into developing fancier hardware, and application developers are desperate to use the new features, yet use of those features has been relegated to the margins for more than a decade.