
Re: [Public WebGL] Shader validation and limitations

Here are a couple of samples

This one hangs my MacBook Pro for 10-30 seconds. Sometimes the desktop shows garbage but a click usually restores it. On my Windows7 box, after a few seconds the screen flashes several times and then a notification appears that the driver has been reset. Your mileage may vary.

It's using about the simplest textured shader possible but it's drawing 100000 polys at 1024x1024 in 1 draw call.
It might be possible to have WebGL break up the draw calls. Unfortunately that could add a significant slowdown to apps since each call to glDrawXXX has a large overhead. It would be good to test that I guess as a possible solution.
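To make the batching idea concrete, here is a minimal sketch of how a single large indexed draw could be split into smaller chunks. `splitBatches` and `maxPerBatch` are illustrative names, not part of WebGL, and the batch size is an arbitrary example; this doesn't reduce the total work, it just gives the driver more chances to schedule between calls.

```javascript
// Hypothetical helper: split one big indexed draw into smaller batches.
// Returns a list of { offset, count } pairs covering all indices.
function splitBatches(totalIndices, maxPerBatch) {
  const batches = [];
  for (let offset = 0; offset < totalIndices; offset += maxPerBatch) {
    batches.push({
      offset: offset,
      count: Math.min(maxPerBatch, totalIndices - offset),
    });
  }
  return batches;
}

// An implementation could then issue one gl.drawElements per batch instead
// of one huge call, e.g. (assuming 16-bit indices, so 2 bytes per index):
//
//   for (const b of splitBatches(indexCount, 30000)) {
//     gl.drawElements(gl.TRIANGLES, b.count, gl.UNSIGNED_SHORT, b.offset * 2);
//   }
```

As the text notes, each extra `gl.drawElements` call adds per-call overhead, so any such splitting would need to be benchmarked before adopting it.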

Here's another

This one just allocates some large FBOs. It doesn't even have to render anything to basically kill OSX on my MacBook Pro. On my Windows7 box it eventually runs, at least on my machine. Again, your mileage may vary. I suspect this is more of a driver issue than a GPU issue.
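For a rough sense of why a few large FBOs can overwhelm a machine, here is some back-of-the-envelope arithmetic. The 4096x4096 size and byte counts below are assumptions for illustration (the sample's actual dimensions aren't stated here), and `fboBytes` is a hypothetical helper, not a WebGL API.

```javascript
// Illustrative VRAM cost of an FBO with one color attachment and a
// depth buffer: width * height * (color bytes + depth bytes) per pixel.
function fboBytes(width, height, colorBytesPerPixel, depthBytesPerPixel) {
  return width * height * (colorBytesPerPixel + depthBytesPerPixel);
}

// e.g. a 4096x4096 RGBA8 color attachment (4 bytes/pixel) plus a 32-bit
// depth-stencil buffer (4 bytes/pixel) comes to 128 MiB for one FBO:
const bytes = fboBytes(4096, 4096, 4, 4);           // 134217728
const mib = bytes / (1024 * 1024);                  // 128
```

A page that allocates a handful of buffers at this scale can plausibly exhaust the VRAM on a typical laptop GPU, which is consistent with the allocation alone (no rendering) being enough to wedge the machine.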

I'll see if I can make another one with far fewer polygons but a slightly more complex shader. I don't believe any loops are necessary.

Also, the shader toy, which at this point in time is sadly offline, is able to hang or DoS several GPUs.

Note: I only tested those samples on top-of-tree WebKit on OSX and Chromium on Windows7. The first one requires a WebGL implementation that matches the current spec for texImage2D (the updated API version).

I understand the concern about all of this, but I guess I kind of see it as a self-limiting problem. If I go to site xyz.com and it locks up my machine, I'm not going to go back to site xyz.com anymore. You can't use these issues to steal user info. While maybe some kind of cross-site scripting hack could get several sites to serve up something like this, it seems like it's got limited usefulness.

It can't be used to install a virus or execute code, so all it really does is DoS the user's machine and make them not want to go back to that site. It seems like a person with malicious intent would be far more likely to serve some kind of real exploit. Any site that could be hacked to serve this could also have been hacked to serve far more malicious code.

In the short term, some kind of Safe Browsing system is probably the best solution, since it will take time for OSes and drivers to handle this.