Yes, TypedArrays started here but they were correctly passed off to JS. I doubt they'll see much traction, though, for pretty much the same reason nothing like that exists in any other language I know of: it's a niche feature. You can have a C/C++ array of float, short, unsigned short, int, unsigned int, etc., but there's no half type AFAIK.
The fix for this issue exists: it's the color_buffer_float extensions. I've filed bugs for you to implement it. Do you want me to file them again?

This is never going to be fixed for WebGL1, as it would break too much content. The whole issue is that WebGL shipped for over a year with OES_texture_float and the ability to use float textures as framebuffer attachments before Mark Callow pointed out that this was wrong and that we needed EXT_color_buffer_float. So the extension was added, but the original method (make a floating-point texture, attach it, check for FRAMEBUFFER_COMPLETE) was left in place.
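For reference, the "original method" mentioned above looks roughly like this. This is a sketch, not code from the thread; the helper name `canRenderToFloatTexture` is made up for illustration, and it assumes a valid WebGL1 context:

```javascript
// Legacy WebGL1 detection of renderable float textures:
// make a floating-point texture, attach it, check FRAMEBUFFER_COMPLETE.
function canRenderToFloatTexture(gl) {
  // OES_texture_float must be enabled before FLOAT textures are legal at all.
  if (!gl.getExtension('OES_texture_float')) return false;

  var tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 1, 1, 0, gl.RGBA, gl.FLOAT, null);

  var fb = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                          gl.TEXTURE_2D, tex, 0);

  var ok = gl.checkFramebufferStatus(gl.FRAMEBUFFER) === gl.FRAMEBUFFER_COMPLETE;

  // Clean up so the probe leaves no state behind.
  gl.bindFramebuffer(gl.FRAMEBUFFER, null);
  gl.deleteFramebuffer(fb);
  gl.deleteTexture(tex);
  return ok;
}
```

Since so much shipped content relies on this probe, WebGL1 implementations can't start failing it; the extension-based check only became the sanctioned path going forward.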
It's fixed in WebGL2 because that won't break any content, but it can't be fixed in WebGL1 without breaking existing sites.
The specification of color_buffer_float clearly covers that area; it states: "The format and type combination RGBA and FLOAT becomes valid for reading from a floating-point rendering buffer."

Agreed: add a conformance test?
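A minimal sketch of what the quoted spec language permits, assuming a floating-point framebuffer is already bound (the helper name `readFloatPixels` is illustrative, not from the extension):

```javascript
// Read back RGBA/FLOAT pixels from the currently bound framebuffer.
// Per the quoted spec text, this combination is valid when the
// attachment is a floating-point rendering buffer.
function readFloatPixels(gl, width, height) {
  var pixels = new Float32Array(width * height * 4); // 4 channels per pixel
  gl.readPixels(0, 0, width, height, gl.RGBA, gl.FLOAT, pixels);
  return pixels;
}
```

A conformance test along these lines would clear a float framebuffer to a known value, call `readPixels` with RGBA/FLOAT, and assert the values round-trip exactly.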