
Re: [Public WebGL] webgl tests seem to require 24/32-bit canvas buffer



It doesn't seem acceptable to make toDataURL expand 4444 to 8888 as the only solution. The developer is not expecting data to be destroyed.

A context creation flag seems like a better idea.
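Just to sketch what I mean (purely hypothetical; the attribute name below is not in the spec, and unknown attributes are simply ignored by implementations today):

    // Hypothetical sketch: an app that needs exact 8-bit readback opts in at
    // context creation, the same way alpha/antialias/preserveDrawingBuffer work.
    var canvas = document.getElementById('c');   // assumed canvas element
    var gl = canvas.getContext('webgl', {
      require8bitDrawingBuffer: true   // hypothetical flag, not in the WebGL spec
    });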


On Wed, Jul 11, 2012 at 9:57 AM, Benoit Jacob <bjacob@mozilla.com> wrote:



The difference between OpenGL and WebGL

OpenGL: Expose a native, platform-dependent graphics API to C.
WebGL: Expose OpenGL, as much as possible, in a platform-independent API.
The argument could be flipped around: a major goal of WebGL is portability by default; requiring a color depth that is higher than the standard color depth of most mobile devices is not what one would expect from this portability perspective.



OpenGL developers are expected to deal with all the issues of every platform.
Web developers are not. They expect to author a webpage and it should just work in every browser that supports the standards they used.

It's bad enough that there are already so many variables with WebGL, but we've tried to smooth over what can be smoothed over.

That's also why there's a call for requiring certain framebuffer attachment combinations to work.

Going to 4/4/4/4 or 5/6/5 works for Unity because it's display only. But it would break any app that calls canvas.toDataURL expecting a 32-bit PNG representing the WebGL content, like (http://evanw.github.com/webgl-filter/) or (http://lkzf.info/main2.html).
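For reference, the pattern those apps rely on is roughly this (a minimal sketch; the canvas id and the use of preserveDrawingBuffer are my assumptions):

    var canvas = document.getElementById('c');   // assumed canvas element
    // preserveDrawingBuffer keeps the drawing buffer intact until toDataURL runs.
    var gl = canvas.getContext('webgl', { preserveDrawingBuffer: true });
    // ... draw the scene with gl ...
    var png = canvas.toDataURL('image/png');     // apps expect 8 bits per channel here

With a 4444 drawing buffer, that PNG still comes out as 8888 but carries only 4 significant bits per channel.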

And there's no easy recourse. You can't call toDataURL on a framebuffer object, and although I'm sure some devs could re-write toDataURL in JavaScript using readPixels and then implementing libpng and zlib in JS, that doesn't sound like something we should force on any dev who wants to be cross-platform.
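(The closest workaround I can think of is sketched below: readPixels into a typed array and let a 2D canvas do the PNG encoding via its own toDataURL, rather than porting libpng/zlib to JS. The function and argument names are mine, not anything in the spec.)

    // Sketch of the workaround described above. `gl`, `fbo`, `width`, `height`
    // are whatever the app already has.
    function fboToDataURL(gl, fbo, width, height) {
      gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
      var pixels = new Uint8Array(width * height * 4);
      gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, pixels);

      var canvas2d = document.createElement('canvas');
      canvas2d.width = width;
      canvas2d.height = height;
      var ctx = canvas2d.getContext('2d');
      var imageData = ctx.createImageData(width, height);
      // readPixels returns rows bottom-up; flip while copying into the ImageData.
      for (var y = 0; y < height; ++y) {
        var src = (height - 1 - y) * width * 4;
        imageData.data.set(pixels.subarray(src, src + width * 4), y * width * 4);
      }
      ctx.putImageData(imageData, 0, 0);
      return canvas2d.toDataURL('image/png');
    }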

Maybe add a creation attribute? allowLowFidelity: true? require8bitDrawingBuffer: true?
If toDataURL is the main concern, one could simply convert to 8888 in toDataURL.

Meanwhile, we are trying to gather more data about exactly how much this is needed; we will report here when we have that data.

Benoit








On Wed, Jul 11, 2012 at 6:05 AM, Julian Adams <joolsa@gmail.com> wrote:
Unity 3D (for example) defaults to < 8 bits per pixel on iOS, with the option to run at 8 bits per pixel. I've not noticed any issues with our game running like that, and neither has our artist. That's across iPhone 3GS, iPhone 4, and iPad 1, 2 and 3. There's some performance hit for the higher bit depths, even on expensive hardware. Personally I'd prefer a compliant WebGL implementation not to force a performance drop compared to native OpenGL, although I can see it makes testing harder.

Jools


On 11 July 2012 06:27, Mark Callow <callow_mark@hicorp.co.jp> wrote:
On 11/07/2012 04:51, Gregg Tavares (社用) wrote:


Then honestly I'd prefer to see WebGL not on those phones, and hopefully that will be one more reason not to buy them. Whoever made that phone shouldn't be rewarded for making a crappy GPU. Let's not go backward. That's just my opinion though.
It's not about a crappy GPU. It's about memory and bus bandwidth, power, and whether or not (more likely not) the screen can display all those colors.

Regards

-Mark

--

NOTE: This electronic mail message may contain confidential and privileged information from HI Corporation. If you are not the intended recipient, any disclosure, photocopying, distribution or use of the contents of the received information is prohibited. If you have received this e-mail in error, please notify the sender immediately and permanently delete this message and all related copies.