
Re: [Public WebGL] WEBGL_debug_shader_precision extension proposal



Section 4.5.2 of the OpenGL ES Shading Language specification gives the minimum precisions for the qualifiers, but in principle they could all have the same precision.
Isn't this similar to how it is done in C / C++ with short int, int, long int and long long int?
In principle those could all have the same number of bits, and hardware vendors are free to implement them however they like, as long as each type is at least as big as the preceding one?

But there is no straightforward way to query the bit size of those types in C / C++ (you can figure it out from sizeof and <climits>, or with some tests); in WebGL you can at least query the bit size of the significand and the exponent range
(although most of the time the reported significand bit count is one less than what it should be, because the implicit leading bit is not counted).
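
For reference, here is a minimal sketch of such a query using getShaderPrecisionFormat (assuming a WebGL 1 context from a throwaway canvas; the variable names are just illustrative):

    var gl = document.createElement('canvas').getContext('webgl');
    // precision: bits in the significand, not counting the implicit leading bit.
    // rangeMin / rangeMax: log2 of the absolute values of the minimum and maximum
    // representable values (an IEEE single typically reports 127, 127, 23).
    var fmt = gl.getShaderPrecisionFormat(gl.FRAGMENT_SHADER, gl.HIGH_FLOAT);
    if (fmt) {
      console.log('highp float: precision=' + fmt.precision +
                  ' rangeMin=' + fmt.rangeMin + ' rangeMax=' + fmt.rangeMax);
    }
    // The same call works for MEDIUM_FLOAT, LOW_FLOAT and the *_INT variants.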



2014-11-17 17:42 GMT+01:00 Florian Bösch <pyalot@gmail.com>:
I would, if at all feasible, prefer it to be a library, because then it can be run across every browser. Which is kinda important because the architecture of the backends, through ANGLE and in IE11, is probably quite different, and may lead to different results.

I'd rather see this happen in some fashion, no matter which, than not at all.

As a sidenote, I wanted to comment on precision qualifiers. It has always struck me that the OpenGL specification is deficient in its definition of numerical types, in that it doesn't specify which type has what precision or which numerical standard it has to implement. I understand that this is largely historical, and that the precision modifiers came later, as the GL API was adapted for mobile devices. But the combination of no implementation guarantee, no precision guarantee, no numerical-standard guarantee, and "types" which can change precision depending on a specifier strikes me as a particularly bad idea. I cannot recall any statically typed language that follows the same logic (though there might be one), and that's probably because most designers of statically typed languages thought it would have been a wonky and bad idea.