
RE: [Public WebGL] Defining GLintptr and GLsizeiptr as long long is incompatible with GLES2 on 32-bit systems



Oh, my bad, I had mistakenly thought that 1 << 31 would be positive in JS; since JS bitwise operators work on 32-bit signed integers, it actually evaluates to -2147483648, which explains the odd negative-size error messages I was getting. So there is after all support for >2GB buffers in current browsers. And I agree that these kinds of resource constraints would be atypical of JS. If browsers just report OUT_OF_MEMORY on 32-bit architectures when an application tries to allocate a >2GB buffer, no spec changes are needed.
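
To be concrete, the kind of check I have in mind could look roughly like the sketch below on the implementation side. This is only an illustration, not code from any browser; the function name is made up:

/* Sketch: reject bufferData sizes that cannot be represented in the
 * platform's GLsizeiptr before they reach the driver, turning any >2GB
 * request on a 32-bit build into GL_OUT_OF_MEMORY instead of truncating it.
 * The function name is hypothetical. */
#include <stdint.h>
#include <GLES2/gl2.h>

/* size arrives as the 64-bit long long value from the WebGL IDL. */
int BufferSizeFitsPlatform(int64_t size)
{
    /* On 32-bit builds GLsizeiptr is 32-bit (khronos_ssize_t is a signed
     * long int there), so anything above INT32_MAX does not fit. */
    if (sizeof(GLsizeiptr) == 4 && size > (int64_t)INT32_MAX)
        return 0; /* caller should record GL_OUT_OF_MEMORY */
    return 1;
}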

However, there's another issue: it's currently extremely easy to DoS Firefox with a bufferData call, at least on Linux. The normal JS watchdog functionality doesn't seem to catch the runaway allocation and zero-initialization of N gigabytes of data that the browser tries to pass to the GPU. You can try this, varying the number of gigabytes allocated, but be prepared to force-kill the browser:

<html>
    <head>
        <script type="text/javascript">
            function main() {
                var gl = document.getElementById("c").getContext("webgl");
                if (!gl)
                    gl = document.getElementById("c").getContext("experimental-webgl");
 
                var buffer = gl.createBuffer();
                gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
                // Request gigabytes * 2^30 bytes of zero-initialized buffer storage.
                var gigabytes = 4;
                gl.bufferData(gl.ARRAY_BUFFER, (1 << 30) * gigabytes, gl.STATIC_DRAW);
            }
        </script>
    </head>
    <body onload="main()">
        <canvas id="c" width="128" height="128"></canvas>
    </body>
</html>

Chrome suffers from some variations of this as well, though I didn't get it to become completely unresponsive on Linux the way Firefox does. One solution could be to do the initialization of the buffer in smaller pieces and rely on the driver to report out-of-memory conditions when initially reserving the buffer, though I don't know to what extent all drivers can be trusted to do this. On Linux it is also possible to use mmap to get a buffer of zeroes on the CPU side without doing any heap allocations.
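
Roughly what I mean by initializing in pieces, as a sketch only (the chunk size and error handling are picked arbitrarily and the function name is made up; the mmap variant would simply replace the calloc with an anonymous mapping of zero pages):

/* Sketch: reserve the storage first, then zero-fill it in small pieces so
 * the browser never needs a multi-gigabyte CPU-side allocation. */
#include <stdlib.h>
#include <GLES2/gl2.h>

#define CHUNK_SIZE (16 * 1024 * 1024) /* 16 MB of zeroes at a time */

/* Returns 0 on out-of-memory, 1 on success. */
static int BufferDataZeroed(GLenum target, GLsizeiptr size, GLenum usage)
{
    /* Reserve the storage with no data pointer; the driver should report
     * GL_OUT_OF_MEMORY here if it cannot satisfy the request. */
    glBufferData(target, size, NULL, usage);
    if (glGetError() == GL_OUT_OF_MEMORY)
        return 0;

    void *zeroes = calloc(1, CHUNK_SIZE);
    if (!zeroes)
        return 0;

    /* WebGL requires the buffer contents to be zero, so fill it in
     * CHUNK_SIZE pieces instead of allocating the whole size at once. */
    for (GLsizeiptr offset = 0; offset < size; offset += CHUNK_SIZE) {
        GLsizeiptr piece = size - offset < CHUNK_SIZE ? size - offset : CHUNK_SIZE;
        glBufferSubData(target, offset, piece, zeroes);
    }

    free(zeroes);
    return 1;
}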

-Olli
________________________________________
From: owner-public_webgl@khronos.org [owner-public_webgl@khronos.org] On Behalf Of Benoit Jacob [bjacob@mozilla.com]
Sent: Monday, September 02, 2013 5:46 PM
To: public_webgl@khronos.org
Subject: Re: [Public WebGL] Defining GLintptr and GLsizeiptr as long long is incompatible with GLES2 on 32-bit systems

On 13-09-02 10:18 AM, Benoit Jacob wrote:
> On 13-09-02 06:56 AM, Olli Etuaho wrote:
>> Hi all,
>>
>> WebGL defines GLintptr and GLsizeiptr to always be 64-bit, but in GLES2 they can actually be either 32-bit or 64-bit depending on the platform. And in practice Chrome and Firefox treat, for example, the GLsizeiptr size parameter of bufferData as 32-bit (tested on the latest 64-bit stable versions on Linux).
> That sounds like an interesting bug, could you please provide a
> testcase? I'm looking at Gecko code now, WebGLContext::BufferData takes
> a WebGLintptr which is a typedef for int64_t, so I don't see a bug out
> of hand.
>
>>  It would seem like changing the spec to define pointer values as 32-bit would make sense for compatibility.
> That would indeed improve compatibility, by preventing applications from
> taking advantage of the extra memory available on 64-bit systems. But
> that's not really how everything else works; everything else, not just in
> WebGL but generally around JavaScript, allows taking advantage of all the
> memory available on the client system, e.g. you can create a TypedArray
> bigger than 4G.
My bad, the typed arrays spec actually says that the byteLength of an
ArrayBuffer is unsigned long, so the above example is wrong.

Still, many other things in the Web platform don't enforce a 4G byte
limit. For example, ordinary JS arrays can have 4G elements, which is
more than 4G bytes.

Benoit

>
> Benoit
>
>>  The obvious downside is that buffers would be limited to 2GB, but at least this should not break existing applications. Any practical application of >2GB buffers in WebGL is probably still years away as well. Should the spec be changed, or are there any other ideas for how this could be handled?
>>
>> Reference below.
>>
>> WebGL spec:
>> typedef long long      GLintptr;
>> typedef long long      GLsizeiptr;
>>
>> WebIDL spec:
>> The long long type is a signed integer type that has values in the range [−9223372036854775808, 9223372036854775807].
>>
>> gl2.h:
>> typedef khronos_intptr_t GLintptr;
>> typedef khronos_ssize_t  GLsizeiptr;
>>
>> khrplatform.h (signed long int is typically 32-bit on 32-bit platforms, and typically 64-bit on 64-bit platforms):
>> typedef signed   long  int     khronos_intptr_t;
>> typedef signed   long  int     khronos_ssize_t;
>>
>> -Olli Etuaho
>


-----------------------------------------------------------
You are currently subscribed to public_webgl@khronos.org.
To unsubscribe, send an email to majordomo@khronos.org with
the following command in the body of your email:
unsubscribe public_webgl
-----------------------------------------------------------