
Re: [Public WebGL] The miserable state of affairs of floating point support



On Fri, Mar 24, 2017 at 6:02 AM, Gregg Tavares <khronos@greggman.com> wrote:

That's a JavaScript issue, not a WebGL issue. This is the WebGL forum. If you want Float16Array you'd need to go petition the JavaScript guys (I see that you have)

Yes, TypedArrays started here but they were correctly passed off to JS. I doubt they'll see any traction though, for pretty much the same reason nothing like that exists in any other language I know of. It's a niche feature. You can have a C/C++ array of float, short, unsigned short, int, unsigned int, etc., but there's no half AFAIK.

These are the numeric data types that GPUs (which come with every consumer device sold on this planet) support through APIs like the GL family:
  • Unsigned byte -> Uint8Array
  • Signed byte -> Int8Array
  • Unsigned short -> Uint16Array
  • Signed short -> Int16Array
  • Unsigned Integer -> Uint32Array
  • Signed Integer -> Int32Array
  • Fixed -> ???
  • Half-Float -> ???
  • Float -> Float32Array
  • Double -> Float64Array
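
Concretely, the supported types map one-to-one onto typed array constructors, while the missing ones leave you handling raw bit patterns (a small illustration):

var floats = new Float32Array([1.0, 0.5, -2.25]); // Float -> first-class support
var halves = new Uint16Array(3);                  // Half-Float -> raw bits only
// halves[i] has to be filled with hand-packed IEEE 754 binary16 bit patterns,
// because there is no Float16Array to do the packing for you.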
All numbers in JS are doubles. JIT code may accelerate things if it detects paths that only deal with certain types, but JIT'ing is not part of the JS specification; it's an implementation optimization.

Data that is passed off to the GPU or read back from it is often processed on the CPU side for various purposes. This is simple to understand: CPUs need to understand the numbers they put into and get back from the GPU in order to work with that data productively.
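
For example, on the upload side the CPU has to produce bytes in exactly the layout the GPU expects. A minimal sketch, assuming a WebGL context gl and a bound ARRAY_BUFFER:

var positions = new Float32Array([
  -1, -1,
   1, -1,
   0,  1
]);
gl.bufferData(gl.ARRAY_BUFFER, positions, gl.STATIC_DRAW);
// The same numbers are often also needed as JS values on the CPU side
// (hit testing, animation, serialization), so the representation matters twice.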

JavaScript is a dynamically typed language, and this has some advantages. For instance, say you want to write some code that modifies a collection of numbers in some way:

// Add n to every element of arr, in place.
var addToArr = function(arr, n){
  for(var i=0; i<arr.length; i++){
    arr[i] += n;
  }
}
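
This works unchanged on a plain Array and on any TypedArray, since both expose the [] subscript operator, e.g. addToArr(new Float32Array([1, 2, 3]), 0.5).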

To deal with half-float and fixed in JS you can write converter functions like toHalf and fromHalf (a sketch of these follows the example below). You can also write an object that pretends to be an Array; however, you cannot provide the subscript operator, which means that if you want your routine to support both built-in numerical arrays and "pretend" arrays you need to do something like:
var addToArr = function(arr, n){
  if((arr.buffer instanceof ArrayBuffer) || (arr instanceof Array)){
    // Plain Arrays and TypedArrays support the [] subscript operator.
    for(var i=0; i<arr.length; i++){
      arr[i] += n;
    }
  }
  else if(typeof(arr.get) == 'function' && typeof(arr.set) == 'function'){
    // "Pretend" arrays have to go through accessor methods instead.
    for(var i=0; i<arr.length; i++){
      arr.set(i, arr.get(i) + n);
    }
  }
  else{
    throw new Error("sigh");
  }
}
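
For completeness, here is roughly what the converter functions mentioned above could look like. This is a minimal sketch: denormals and NaN payloads are only handled approximately, and a production version needs more care.

var floatView = new Float32Array(1);
var intView = new Uint32Array(floatView.buffer);

// Pack a JS number (a double) into an IEEE 754 binary16 bit pattern.
var toHalf = function(value){
  floatView[0] = value;
  var f = intView[0];
  var sign = (f >>> 16) & 0x8000;
  var exponent = ((f >>> 23) & 0xff) - 127 + 15;
  var mantissa = (f >>> 13) & 0x3ff;
  if(exponent <= 0) return sign;             // underflow -> signed zero
  if(exponent >= 31) return sign | 0x7c00;   // overflow  -> infinity
  return sign | (exponent << 10) | mantissa;
};

// Unpack a binary16 bit pattern back into a JS number.
var fromHalf = function(h){
  var sign = (h & 0x8000) ? -1 : 1;
  var exponent = (h & 0x7c00) >>> 10;
  var mantissa = h & 0x03ff;
  if(exponent === 0) return sign * Math.pow(2, -14) * (mantissa / 1024);
  if(exponent === 31) return mantissa ? NaN : sign * Infinity;
  return sign * Math.pow(2, exponent - 15) * (1 + mantissa / 1024);
};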

I don't believe it can be in the interest of ECMA (whether they realize it or not) or of UAs to promote this kind of code everywhere a machine-supported native data type is being dealt with.

Yes, I have asked ECMA to introduce Float16Array, to no avail. By and large it's my impression they don't give a shit about GPUs. From this I can only conclude that WebGL (and the various other forums for GPU-related browser development) should not give a shit about ECMA.

UAs have led the way in introducing machine-compatible numerical arrays in order to be able to work with GPUs. UAs can, in the same vein, complete this support (which should have happened from the start), and ECMA's ass will follow.

So the question stands: "Why not?" Why delegate the problem to people who have neither the interest nor the qualifications to even understand it, much less any stake in solving it? It's not a niche problem when every consumer device sold comes with hardware that supports these types.


The fix for this issue exists: it's the color_buffer_float extensions. I've filed bugs for you to implement them. Do you want me to file them again?

This is never going to be fixed for WebGL1 as it would break too much content. The whole issue was that WebGL shipped for more than a year with OES_texture_float and the ability to use those textures as framebuffer attachments before Mark Callow pointed out that this was wrong and that we needed EXT_color_buffer_float. So it was added, but the original method (make a floating-point texture, attach it, check for FRAMEBUFFER_COMPLETE) was left in place.

It's fixed in WebGL2 because that will not break any content, but it can't be fixed in WebGL1 without breaking sites.

You didn't understand the issue. The issue isn't that these extensions would break things if they're interpreted literally; that's not a problem. The problem is that UAs (such as Chrome) have elected not to support this extension even though they support the functionality it exposes. This is a problem, because the absence of the extension does not indicate the absence of the functionality.

If you want to stay with the traditional behavior, there is nothing preventing you from doing just that: implement the extension and enable it by default, as per usual. But right now their absence is meaningless, which makes everybody jump through idiotic hoops for no gain.
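
Those hoops look roughly like this today (a sketch, assuming a WebGL1 context gl with OES_texture_float already enabled):

// With no color_buffer_float extension exposed, the only way to find out
// whether float render targets actually work is to try one and see.
var tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 1, 1, 0, gl.RGBA, gl.FLOAT, null);
var fbo = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);
var floatRenderable = gl.checkFramebufferStatus(gl.FRAMEBUFFER) === gl.FRAMEBUFFER_COMPLETE;
// (Cleanup omitted.) An implemented and enabled extension would make this probe unnecessary.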

 
The specification of color_buffer_float clearly covers that area; it states: "The format and type combination RGBA and FLOAT becomes valid for reading from a floating-point rendering buffer."

Agreed: Add a conformance test?

The conformance test already exists. It's the one for the color_buffer_float extensions.
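
For reference, the read-back that the quoted spec sentence makes legal looks like this (a sketch; width and height stand for the dimensions of the currently bound floating-point framebuffer):

var pixels = new Float32Array(width * height * 4);
gl.readPixels(0, 0, width, height, gl.RGBA, gl.FLOAT, pixels);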