
Re: [Public WebGL] WebGL performance...seems like I have no hardware acceleration?!?



I see this performance problem on Chrome and Safari too :|

On Thu, May 27, 2010 at 9:18 AM, Steve Baker <steve@sjbaker.org> wrote:
I've been thinking about porting some of my old OpenGL games over to
WebGL - and sticking them on my website for free to help to promote
WebGL (which is a noble and important cause!) and I'm seeing some weird
performance issues.  (I've been an OpenGL programmer since it was
pronounced "IrisGL" - and I work in the games industry as a senior
graphics programmer - so I'm not entirely clueless).

Now, I expect JavaScript to be slow - but we're using a real hardware
graphics card - right? - so my expectation would be that if I push as
much functionality onto the GPU as possible and keep things simple on
the CPU/JavaScript side, I should be able to get decent frame rates.

To test, I wrote a really minimal application - it sets up matrices,
clears the screen, renders a few simple objects, and uses
setTimeout("draw()",1) to try to get the best framerate it can.  I'm
getting something like 10Hz. :-(
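
A minimal sketch of what I mean (the canvas id and all of the shader
and matrix setup are elided - only the loop structure matters here):

    var canvas = document.getElementById ( "c" ) ;  // assumed canvas id
    var gl = canvas.getContext ( "experimental-webgl" ) ;

    function draw ()
    {
      gl.clearColor ( 0.0, 0.0, 0.0, 1.0 ) ;
      gl.clear ( gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT ) ;
      // ...set up matrices and render a few simple objects here...
      setTimeout ( "draw()", 1 ) ;  // reschedule as fast as possible
    }

    draw () ;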

Since I don't know whether my JavaScript is somehow doing something
nasty, I tossed out all of the 3D rendering and did nothing but clear
the screen.  Doing this test at a number of different canvas sizes
(frame rates measured as sketched below the numbers), I get:

 1280x1024 : 15Hz.
  800x600  : 35Hz.
  200x200  : 90Hz.
  100x100  : 180Hz.
    8x8    : 180Hz.
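
Those rates come from a simple per-second frame counter, along these
lines (a sketch - not the verbatim code; Date.now() timing is assumed):

    var frames = 0 ;
    var start = Date.now () ;

    function clearOnly ()
    {
      gl.clear ( gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT ) ;
      frames++ ;
      if ( Date.now () - start >= 1000 )
      {
        document.title = frames + "Hz" ;  // crude once-per-second readout
        frames = 0 ;
        start = Date.now () ;
      }
      setTimeout ( "clearOnly()", 1 ) ;
    }

    clearOnly () ;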

Commenting out the clear-screen and doing no OpenGL calls at all in my
main loop still gets me 180Hz - so I guess I'm CPU-limited at that frame
rate.

This is exceedingly surprising.  If we have hardware acceleration, then
the screen should clear in WAY under a millisecond on my modern nVidia
graphics card...I'd expect that to be dwarfed by the JavaScript maximum
loop rate of 180Hz even at 1280x1024.
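
To put a rough number on that (assuming a conservative 10Gpixels/sec
fill rate - the theoretical figure for a GTX 285 is far higher):

    1280 x 1024 = ~1.3Mpixels per clear
    1.3Mpixels / 10Gpixels/sec = ~0.13ms per clear

...which is trivially small next to the ~5.6ms frame time at 180Hz.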

So it looks like we're either not running with hardware acceleration -
or there is some kind of software operation on the raster going on
that's crippling the frame rate.  I'm running the latest daily builds of
FireFox "minefield" - and I've double-checked that I have software
rendering disabled in the 'about:config' system.  I'm running Linux on
one machine, WinXP on another and Windows-7 on a third - and getting
pretty consistent results on all three machines.

But even at that - the difference between 200x200 and 100x100 is 30,000
extra pixels costing about 6ms more per frame, or roughly 5Mpixels/sec
...which for a simple:  gl.clear
( gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT )  would be slow even for
software rendering!
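
Spelling that arithmetic out:

    200x200 - 100x100 = 40,000 - 10,000 = 30,000 extra pixels/frame
    1/90Hz  - 1/180Hz = 11.1ms - 5.6ms  = ~5.6ms extra per frame
    30,000 pixels / 5.6ms = ~5.4Mpixels/sec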

The machine I'm rendering on is a 2.8GHz quad-core with a dual nVidia
GeForce GTX 285 GPU - but I get almost identical times on my ancient
2.6GHz single-core with a dusty old GeForce 6800!  An even more ancient
machine with a 1GHz CPU gets roughly half the frame rate across the
board...again, suggesting we're seeing some software performance cap here.

Any ideas?

 -- Steve
