
[Public WebGL] Defining GLintptr and GLsizeiptr as long long is incompatible with GLES2 on 32-bit systems



Hi all,

WebGL defines GLintptr and GLsizeiptr to always be 64-bit, but in GLES2 they can actually be either 32-bit or 64-bit depending on the platform. In practice, Chrome and Firefox treat, for example, the GLsizeiptr size parameter of bufferData as 32-bit (tested on the latest 64-bit stable versions on Linux). Changing the spec to define these pointer-sized types as 32-bit would therefore seem to make sense for compatibility. The obvious downside is that buffers would be limited to 2GB, but at least this should not break existing applications, and any practical use of >2GB buffers in WebGL is probably still years away as well. Should the spec be changed, or are there other ideas on how this could be handled?
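For concreteness, the behavior can be observed with a small test along the following lines (a rough sketch rather than the exact test I ran; the 3 GB size is just an arbitrary value that fits in a WebIDL long long but not in a 32-bit GLsizeiptr):

var canvas = document.createElement("canvas");
var gl = canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
var buffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
// 3 GB: representable as a WebIDL long long, but not as a 32-bit GLsizeiptr.
gl.bufferData(gl.ARRAY_BUFFER, 3 * 1024 * 1024 * 1024, gl.STATIC_DRAW);
// The error and the reported buffer size show how the implementation
// handled the 64-bit value.
console.log("getError: " + gl.getError());
console.log("BUFFER_SIZE: " + gl.getBufferParameter(gl.ARRAY_BUFFER, gl.BUFFER_SIZE));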

References below.

WebGL spec:
typedef long long      GLintptr;
typedef long long      GLsizeiptr;

WebIDL spec:
The long long type is a signed integer type that has values in the range [−9223372036854775808, 9223372036854775807]. 

gl2.h:
typedef khronos_intptr_t GLintptr;
typedef khronos_ssize_t  GLsizeiptr;

khrplatform.h (signed long int is typically 32-bit on 32-bit platforms, and typically 64-bit on 64-bit platforms):
typedef signed   long  int     khronos_intptr_t;
typedef signed   long  int     khronos_ssize_t;

-Olli Etuaho