
Re: [Public WebGL] Proposal: Generate INVALID_VALUE if value >= MAX_TEXTURE_IMAGE_UNITS on uniform1f(v) for samplers

I like this idea. And yes, "x versus gl.TEXTUREx" is at the top of my list of common WebGL mistakes, so anything we can do to help with that is useful.


A developer ran into a bug today where their app was working on Linux but not on Mac or Windows.

The issue was they were calling

   gl.uniform1f(someSamplerLocation, gl.TEXTURE0);

when they should have been calling

   gl.uniform1f(someSamplerLocation, 0);

You could say this is their fault for writing bad code, but the thing is, neither OpenGL ES nor WebGL defines any error for this condition, AFAIK.

It just happens that on Linux calling gl.uniform1f(someSamplerLocation, 33984) uses texture unit 0 and on Mac and Windows it does something else.
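For reference, gl.TEXTURE0 is just the enum 0x84C0 (33984), not a texture unit index; the sampler uniform wants the zero-based index. A minimal sketch of the distinction (gl, samplerLoc, and unit stand in for whatever the real app uses):

```javascript
// gl.TEXTURE0 is the enum 0x84C0, not a unit index.
const GL_TEXTURE0 = 0x84C0;
console.log(GL_TEXTURE0); // 33984 -- the value the buggy call passed

const unit = 0; // zero-based texture unit index

// The usual idiom (gl and samplerLoc assumed from surrounding code):
//   gl.activeTexture(gl.TEXTURE0 + unit); // activeTexture takes the enum
//   gl.uniform1f(samplerLoc, unit);       // the sampler uniform takes the index
```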

Given that uniforms are program-specific, and that at runtime we know whether or not a particular location is a sampler, should we generate an INVALID_VALUE
if the value set for a sampler uniform is greater than or equal to MAX_TEXTURE_IMAGE_UNITS?
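As a rough sketch of what that check could look like inside an implementation (the function name and shape here are illustrative only, not part of any spec):

```javascript
// Hypothetical validation an implementation could apply when the uniform
// at a given location is known to be a sampler type. Per the proposal,
// out-of-range values would generate INVALID_VALUE instead of silently
// doing something driver-dependent.
function isValidSamplerValue(value, maxTextureImageUnits) {
  return value >= 0 && value < maxTextureImageUnits;
}

console.log(isValidSamplerValue(0, 16));     // true  -- texture unit 0
console.log(isValidSamplerValue(33984, 16)); // false -- gl.TEXTURE0 enum
```

This would have turned the developer's cross-platform mystery into an immediate, checkable error at the uniform1f call site.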