Re: [Public WebGL] WEBGL_debug_shader_precision extension proposal
Section 4.5.2 of the OpenGL ES Shading Language specification gives the minimum precisions for the qualifiers, but in principle they could all have the same precision.
Isn't this similar to how it is done in C / C++ with short int, int, long int, and long long int?
In principle they could all have the same number of bits; hardware vendors are free to implement them however they like, as long as each type is at least as wide as the preceding one?
But there is no direct way to query the bit width of these types in C / C++ (you can derive it from sizeof and CHAR_BIT, or use std::numeric_limits<T>::digits in C++, or figure it out with some tests); in WebGL, at least, you can query the bit size of the significand and exponent
(although most of the time the reported significand bit count is one less than what it should be, presumably because the implicit leading bit is not counted).
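The WebGL query mentioned above goes through getShaderPrecisionFormat. A minimal sketch; "describePrecision" is my own helper name, not part of the WebGL API:

```javascript
// Query the precision format for one (shader stage, precision qualifier)
// pair. Requires a real WebGL context, so this only runs in a browser.
function describePrecision(gl, shaderType, precisionType) {
  const fmt = gl.getShaderPrecisionFormat(shaderType, precisionType);
  return {
    // Number of bits of precision in the significand; implementations
    // commonly report 23 for highp float (one less than the full
    // 24-bit significand, if the implicit leading bit is not counted).
    significandBits: fmt.precision,
    // log2 of the minimum and maximum representable magnitudes,
    // i.e. the exponent range.
    minExponent: fmt.rangeMin,
    maxExponent: fmt.rangeMax,
  };
}

// Usage (in a browser):
// const gl = document.createElement('canvas').getContext('webgl');
// describePrecision(gl, gl.FRAGMENT_SHADER, gl.HIGH_FLOAT);
// describePrecision(gl, gl.FRAGMENT_SHADER, gl.MEDIUM_FLOAT);
```

For integer precision types (e.g. gl.MEDIUM_INT) the precision field is 0 and rangeMin/rangeMax describe the integer range instead.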