
Re: [Public WebGL] webgl tests seem to require 24/32-bit canvas buffer

The difference between OpenGL and WebGL

OpenGL: Expose a native, platform-dependent graphics API to C
WebGL: Expose as much of OpenGL as possible in a platform-independent API

OpenGL developers are expected to deal with all the issues of every platform.
Web developers are not. They expect to author a webpage once and have it just work in every browser that supports the standards they used.

It's bad enough that there are already so many variables with WebGL, but we've tried to smooth over whatever can be smoothed over.

That's also why there's a call for requiring certain framebuffer attachment combinations to work.

Going to 4/4/4/4 or 5/6/5 works for Unity because it's display only. But it would break any app that calls canvas.toDataURL expecting a 32-bit PNG representing the WebGL content, like (http://evanw.github.com/webgl-filter/) or (http://lkzf.info/main2.html).

And there's no easy recourse. You can't call toDataURL on a framebuffer object, and although I'm sure some devs could re-implement toDataURL in JavaScript using readPixels and then implementing libpng and zlib in JS, that doesn't sound like something we should force on any dev who wants to be cross-platform.
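As a rough sketch of that readPixels workaround (and why it's a burden): readPixels hands rows back bottom-to-top, so the data has to be flipped before it can be drawn into a 2D canvas whose toDataURL produces the PNG. The helper name below is made up for illustration; `gl` is assumed to be a WebGL context.

```javascript
// Hypothetical helper: flip RGBA rows vertically, since gl.readPixels
// returns rows bottom-to-top while 2D canvas ImageData is top-to-bottom.
function flipRowsY(pixels, width, height) {
  const bytesPerRow = width * 4; // 4 bytes per RGBA pixel
  const flipped = new Uint8Array(pixels.length);
  for (let row = 0; row < height; row++) {
    const src = row * bytesPerRow;
    const dst = (height - 1 - row) * bytesPerRow;
    flipped.set(pixels.subarray(src, src + bytesPerRow), dst);
  }
  return flipped;
}

// Browser-only usage sketch (assumes a WebGL context `gl` of size w x h):
// const pixels = new Uint8Array(w * h * 4);
// gl.readPixels(0, 0, w, h, gl.RGBA, gl.UNSIGNED_BYTE, pixels);
// const flipped = flipRowsY(pixels, w, h);
// const c = document.createElement('canvas');
// c.width = w; c.height = h;
// c.getContext('2d').putImageData(
//   new ImageData(new Uint8ClampedArray(flipped.buffer), w, h), 0, 0);
// const dataURL = c.toDataURL('image/png');
```

Note this still only recovers 8-bit data if the drawing buffer actually stores 8 bits per channel, which is exactly the point of the argument above.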

Maybe add a creation attribute? allowLowFidelity: true? require8bitDrawingBuffer: true?

On Wed, Jul 11, 2012 at 6:05 AM, Julian Adams <[email protected]> wrote:
Unity 3D (for example) defaults to fewer than 8 bits per pixel on iOS, with the option to run at 8 bits per pixel. I've not noticed any issues with our game running like that, and neither has our artist. That's across iPhone 3GS, iPhone 4, and iPad 1, 2 and 3. There's some performance hit for the higher bit depths, even on expensive hardware. Personally I'd prefer a compliant WebGL implementation not to force a performance drop compared to native OpenGL, although I can see it makes testing harder.


On 11 July 2012 06:27, Mark Callow <[email protected]> wrote:
On 11/07/2012 04:51, Gregg Tavares (社用) wrote:

Then honestly I'd prefer to see WebGL not on those phones, and hopefully that will be one more reason not to buy them. Whoever made that phone shouldn't be rewarded for making a crappy GPU. Let's not go backward. That's just my opinion though.
It's not about a crappy GPU. It's about memory and bus bandwidth, power, and whether or not (more likely not) the screen can display all those colors.




NOTE: This electronic mail message may contain confidential and privileged information from HI Corporation. If you are not the intended recipient, any disclosure, photocopying, distribution or use of the contents of the received information is prohibited. If you have received this e-mail in error, please notify the sender immediately and permanently delete this message and all related copies.