
Re: [Public WebGL] Addition to WebGLContextLostEvent wrt extensions

On Tue, Apr 6, 2010 at 23:42, Gregg Tavares <gman@google.com> wrote:
On Tue, Apr 6, 2010 at 7:03 PM, Cedric Vivier <cedricv@neonux.com> wrote:
I don't think this is an issue at all in WebGL due to the simple single-threaded event model of JavaScript.
Afaik, contrary to C-like signals which are preemptive, once the loop is started it has to finish before JavaScript can handle any WebGLContextLostEvent or WebGLContextRestored event.
If the context is lost at 120, the loop will continue and quickly generate GL errors on every subsequent call until it stops; then, once the current JavaScript chunk has completed, the registered WebGLContextLostEvent handler will be run, then the WebGLContextRestored handler, which will likely restart the whole loop again (assuming this loop is in e.g. init).
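The ordering described above can be sketched in plain JavaScript (no real WebGL; the event queue and handler names are a simulation of the model, not an API): a loss detected mid-loop only queues the handler, which cannot run until the current chunk finishes.

```javascript
// Minimal sketch of the single-threaded model: a "context loss" during the
// draw loop queues the handler but cannot preempt the running chunk, so all
// remaining draw calls complete (erroring) before the handler runs.
const log = [];
const eventQueue = [];

function onContextLost() { log.push("contextlost handler"); }

function drawLoop() {
  for (let i = 0; i < 5; i++) {
    if (i === 2) eventQueue.push(onContextLost); // loss detected mid-loop
    log.push("draw " + i);                       // calls keep going anyway
  }
}

drawLoop();                    // the whole chunk runs to completion...
eventQueue.forEach(h => h());  // ...then queued handlers are dispatched
console.log(log.join("\n"));
```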

The WebGLContextLostEvent and WebGLContextRestored event may be single threaded but GL itself is not. The user can press the hibernate button at any time. If the model is that WebGL restores itself, which is what WebGLContextRestored suggests, then this is a real case. I can keep coming up with examples where the context going from bad to good without app control is a bad idea.

I was assuming it would be possible for WebGL implementations to restore the context ONLY just before the JavaScript handler is executed (e.g. an internal native event listener that resets the context when the dispatched event is actually ready for execution in the JavaScript context).
Afaik, *if that's possible* (?), then WebGLContextRestored would have no drawbacks but only advantages (simplicity in both concept and user-code implementation) compared with a WebGLContextReady + resetContext() approach, right? (since the order of execution would necessarily be the same in both cases)
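To make the equivalence concrete, here is a sketch in plain JavaScript (no real GL; dispatchContextRestored and contextValid are simulation names, and resetContext() is only the proposal under discussion, not a shipped API): if the implementation restores the context at the moment the queued event is delivered, the handler observes exactly what it would observe after calling resetContext() itself.

```javascript
// Sketch: the implementation restores the context only when the queued
// restore event is actually delivered to JavaScript.
const log = [];
let contextValid = false;

function dispatchContextRestored(handler) {
  contextValid = true;  // internal restore happens just before dispatch
  handler();
}

dispatchContextRestored(function () {
  // From the handler's point of view the context is already usable,
  // as if the app had called a hypothetical resetContext() first.
  log.push("valid on entry: " + contextValid);
});
console.log(log.join("\n"));
```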

The problem with a zero-length array is that it has to be programmed for, and it doesn't match OpenGL. In OpenGL I'd do this:

   static unsigned char buffer[8*8*3];  /* not const: glReadPixels writes into it */

   glReadPixels(0, 0, 8, 8, GL_RGB, GL_UNSIGNED_BYTE, buffer);

   // do something with 8x8 RGB pixels.
   printf("the red value of pixel at 7x7 is %d\n", buffer[(7*8+7)*3]);

Most programs don't care whether there was an error. The buffer will be zero (its initial value) or the contents of the last successful glReadPixels.

In WebGL the same should be true.

   var buf = ctx.readPixels(0, 0, 8, 8, ctx.RGB, ctx.UNSIGNED_BYTE);
   document.write("the red value of pixel at 7x7 is " + buf[(7*8+7)*3]);

It seems to me that, in the spirit of GL, this should neither throw because (7*8+7)*3 is out of range nor return undefined.

True; actually the WebGL spec currently does not rule out returning 0 here... the spec says an implementation can do any of these. Maybe that should be clarified?
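For what it's worth, plain typed arrays are enough to show the two behaviors being compared above (a sketch, not WebGL itself): a zero-filled buffer gives the GL-style 0 for any in-range index, while a zero-length "error" result makes the same index expression yield undefined.

```javascript
// A successful readPixels result: a zero-initialized 8x8 RGB buffer.
const ok = new Uint8Array(8 * 8 * 3);
// A hypothetical error result: a zero-length array.
const failed = new Uint8Array(0);

console.log(ok[(7 * 8 + 7) * 3]);      // 0, the GL-style silent default
console.log(failed[(7 * 8 + 7) * 3]);  // undefined, not 0 and no throw
```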