Re: [Public WebGL] OT: exception handling
On Fri, Apr 6, 2012 at 10:50 AM, Florian Bösch <email@example.com> wrote:
No. The effort required to handle this is stupendous, and in the best case you incur considerable reload times for no reason discernible to the user. WebGL, living in a web page, is particularly disadvantaged here because it cannot control file-system access (I'm aware of the work-arounds/hacks/hopefuls that "solve" this; no need to discuss them).
(This doesn't really make sense. If you can load the data in the first place, you can reload it; WebGL isn't at any particular disadvantage here. It's a pain and WebGL doesn't make it as easy as it might have, but the solutions--reload data as needed, keep HTMLImageElements around--aren't hacks any more than the native equivalents are.)
On Fri, Apr 6, 2012 at 11:09 AM, Gregg Tavares (勤) <firstname.lastname@example.org> wrote:
We've all dealt with C code which carefully checks for errors from each function call, and we learned how easy that is to get wrong. The industry learned from that, and languages moved to the exception model, precisely to avoid those problems. That's why it's so strange to me that WebGL seemed to jump back a decade or two to using the C style of error handling. We have better error handling models now.
This is not about C vs languages with exceptions. This is about OpenGL/WebGL, a rendering API.
You missed the point. This is about error handling models, not languages. We learned--collectively, as an industry, during the many years where C was predominant--that return-value error handling is error-prone and doesn't work well. We switched to exception-based error handling as a result of that experience. That applies equally to all APIs. WebGL isn't "special"; that style of error handling is just as error prone in WebGL as it is everywhere else.
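To make the contrast concrete, here's a minimal sketch of the two models (plain JavaScript, nothing WebGL-specific; `tryParse` and the function names are invented for illustration):

```javascript
// Return-value model: every call site must remember to check for failure.
function tryParse(text) {
  try { return JSON.parse(text); } catch (e) { return null; }
}

function getValueReturnStyle(text) {
  var data = tryParse(text);
  if (data === null) return null; // forget this check once and null propagates
  return data.value;
}

// Exception model: one handler at whatever level can actually respond.
function getValueExceptionStyle(text) {
  return JSON.parse(text).value; // JSON.parse throws on bad input
}
```

The return-value version must repeat its check at every level of the call stack; the exception version needs one try/catch at the top.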
No, you don't have to be careful or do lots of special handling. It's very simple.
For 99% of WebGL programs you only have to do 3 things:
1) Don't attach stuff to WebGLObjects
2) If compiling/linking fails, ignore the failure when the context is lost
3) When calling getActiveAttrib or getActiveUniform, check for null
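Rules 2 and 3 might look like this in code (a sketch; the stubbed `gl` object is invented so this runs standalone -- in a page, `gl` would come from `canvas.getContext("webgl")`):

```javascript
// Stub of a WebGL context, just enough to model lost-context behavior:
// getters return null once the context is lost.
var gl = {
  contextLost: true,
  COMPILE_STATUS: 0x8B81,
  isContextLost: function () { return this.contextLost; },
  getShaderParameter: function () { return this.contextLost ? null : true; },
  getActiveAttrib: function () { return this.contextLost ? null : { name: "a_pos" }; }
};

// Rule 2: a compile "failure" during context loss is not a real failure.
function compileSucceededOrLost(shader) {
  var ok = gl.getShaderParameter(shader, gl.COMPILE_STATUS);
  return !!(ok || gl.isContextLost()); // ignore the failure if the context is lost
}

// Rule 3: check getActiveAttrib for null before using the result.
function attribName(program, index) {
  var info = gl.getActiveAttrib(program, index);
  return info ? info.name : null;
}
```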
You're demonstrating why this is such a problem: in step 3 you forgot about getParameter, getVertexAttrib and getShaderPrecisionFormat, just from a quick random sampling.
That's it. Follow those rules and you're done.
99% of WebGL programs have no reason to call any other get function. If you don't attach things to WebGLObjects, you don't care whether createXXX returns null. The API is designed so that everything is a no-op on a lost context.
create* functions aren't the issue, since we already have a way to eliminate null results from those without adding anything new (which is what we're discussing in the other thread). We're talking about other functions that return data.
You don't need special cases, and you don't put exception handlers right around those functions; you put them around the larger, higher-level functions that call them. This is how exceptions are used.
But I have to deal with the exception, whereas as it is now I have to deal with nothing. The logic of my code doesn't change. With an exception I get into situations where, for example, I expected A, B, C, D, E, F to be called in order; I get an exception at C and now D, E, F are left undone. All kinds of side effects can occur. Maybe A and B pushed work to be done into some queues that D, E and F were expected to process. Maybe A and B created some temporary collision objects that D, E and F were expected to release. With exceptions all of those come into play. With WebGL's design none of them do.
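The sequencing hazard described here can be sketched in a few lines (everything is invented for illustration; "noop" stands in for WebGL's lost-context behavior, "exceptions" for the proposed alternative):

```javascript
// Toy model: six ordered steps A..F, where one step hits a lost context.
// Under the exception model, steps after the failure never run;
// under the no-op model, every step still executes.
var log = [];

function makeStep(name, glFails, model) {
  return function () {
    if (glFails && model === "exceptions") {
      throw new Error("context lost at " + name);
    }
    // Under the no-op model the GL call silently does nothing,
    // so the step still completes and its side effects still happen.
    log.push(name);
  };
}

function run(model, failAt) {
  log = [];
  var steps = ["A", "B", "C", "D", "E", "F"].map(function (n) {
    return makeStep(n, n === failAt, model);
  });
  try { steps.forEach(function (s) { s(); }); } catch (e) {}
  return log.slice();
}
```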
If you don't need to do anything with an exception, then catch and discard it. This is basic exception handling.
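Catch-and-discard at the top level might look like this (a sketch; `ContextLostError`, `drawScene` and `renderFrame` are invented names, not WebGL API -- the point is one handler around the higher-level function, not around each call):

```javascript
// Hypothetical error type for illustration.
function ContextLostError(message) {
  this.name = "ContextLostError";
  this.message = message || "context lost";
}
ContextLostError.prototype = Object.create(Error.prototype);

// Stand-in for code that may be many GL calls deep.
function drawScene(lost) {
  if (lost) throw new ContextLostError();
  return "drawn";
}

function renderFrame(lost) {
  try {
    return drawScene(lost);
  } catch (e) {
    if (e instanceof ContextLostError) return "skipped"; // catch and discard
    throw e; // unrelated errors still propagate
  }
}
```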
You're not really arguing against exceptions in WebGL; what you're saying is that you just don't like exceptions. That's not a reason for WebGL not to use them; that argument ended years ago. Exceptions are the standard way of dealing with errors in modern APIs.
All your code will still run; the WebGL calls just become no-ops. If the context is lost at C, then C, D, E and F still execute. Since 99% of WebGL programs are only pushing data to WebGL, with no need to query or get anything back, it's no different than if the user turned his monitor off or minimized the window. Things don't render, but all the code still executes.
var alpha = gl.getParameter(gl.BLEND_COLOR)[3]; // explodes: getParameter returns null on a lost context
var dims = gl.getParameter(gl.MAX_VIEWPORT_DIMS), maxX = dims[0], maxY = dims[1]; // explodes: dims is null
if (gl.getSupportedExtensions().indexOf("WEBKIT_WEBGL_compressed_textures") != -1) // explodes: returns null
if (gl.getShaderPrecisionFormat(gl.FRAGMENT_SHADER, gl.MEDIUM_FLOAT).rangeMax < expected) // explodes: returns null
As pointed out above, it is exceedingly easy to write this code because you can ignore NULL.
It's exceedingly easy to write *wrong*, subtly broken code exactly because you can ignore NULL. It's very hard to be sure that you've handled it in every case you need to. (It's also hard to test those code paths, but exceptions wouldn't fix that problem.)
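A sketch of that "subtly broken" pattern (the stubbed `gl` is invented so this runs standalone; MAX_VIEWPORT_DIMS stands in for any getter that returns null once the context is lost):

```javascript
// Stub context: the getter returns null after context loss, as in WebGL.
var gl = {
  lost: false,
  getParameter: function (pname) { return this.lost ? null : [8192, 8192]; }
};

// Passes every test run... until the first real context loss in the field.
function unsafeMaxWidth() {
  return gl.getParameter("MAX_VIEWPORT_DIMS")[0]; // null[0] throws when lost
}

// The check itself is trivial -- the hard part is remembering it at every call site.
function safeMaxWidth() {
  var dims = gl.getParameter("MAX_VIEWPORT_DIMS");
  return dims ? dims[0] : 0;
}
```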
If you want this it's easy. There's already a wrapper here (http://www.khronos.org/webgl/wiki/Debugging), and you don't need to call getError; you can just call gl.isContextLost. Though if you want to call getError, there's an example of wrapping it in that code.
It doesn't need to be done after every call, just after the ones that return null on context loss. I think this is probably a practical thing to do in production code, not just for debugging. (Calling getError() like this is only useful for debugging, of course, due to the performance issues.)
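In the spirit of the Khronos debugging wrapper linked above, a wrapper keyed on isContextLost rather than getError might look like this (a sketch; the stubbed `gl` and the `onLost` callback are invented so it runs standalone):

```javascript
// Stub context with one state getter and one state setter, enough to
// demonstrate the wrapping; in a page this would be a real WebGL context.
var gl = {
  lost: false,
  isContextLost: function () { return this.lost; },
  clear: function () {},
  getParameter: function () { return this.lost ? null : 16; }
};

// Wrap every method so each call is followed by an isContextLost check.
// onLost is notified with the name of the call that observed the loss.
function wrapContext(ctx, onLost) {
  var wrapped = {};
  Object.keys(ctx).forEach(function (key) {
    if (typeof ctx[key] !== "function") return;
    wrapped[key] = function () {
      var result = ctx[key].apply(ctx, arguments);
      if (key !== "isContextLost" && ctx.isContextLost()) onLost(key);
      return result;
    };
  });
  return wrapped;
}
```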
On Fri, Apr 6, 2012 at 11:15 AM, Gregg Tavares (勤) <email@example.com> wrote:
That has nothing to do with WebGL. It has to do with the current state of GPUs, drivers and OSes. GPUs are not CPUs. Their memory isn't easily swapped or protected, and currently none of them are preemptible. None of the browser vendors have control over that. Even Apple doesn't write their own drivers or make their own GPUs.
OpenGL on Windows has always handled this transparently, presumably by storing the context state when the context is lost, blocking while it's gone, and then restoring it. Unfortunately, that doesn't happen on all platforms, and it's inherently harder to do on mobile, where you may not have enough memory for it.