Re: [Public WebGL] webgl tests seem to require 24/32-bit canvas buffer
I'd much prefer being able to specify the format of the buffer vs. 'optimizePerformance'/etc. Unfortunately, that would require querying the context (which hasn't been created yet), and likely constants off the context. Yuck. So here's my idea:
We already have an 'alpha' value, so really what we'd need is a minimum bits per channel. Let's say 'bitDepth' - then when I create contexts where I can accept lower quality I'd pass:
{ alpha: true, bitDepth: 4 } (could pick 4444 or 8888+)
or
{ alpha: false, bitDepth: 4 } or { alpha: false, bitDepth: 5 } (could pick 565 or 888+)
If I wanted high quality:
{ alpha: true|false, bitDepth: 8 } (get what we have today)
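To make that concrete, a hypothetical call site might look like the sketch below. Note that 'bitDepth' is only the attribute proposed here, not anything implementations accept today, and the context name may still be 'experimental-webgl' depending on the browser:

    // Sketch only: ask for a low-quality buffer; 'bitDepth' is the
    // proposed (non-existent) attribute, so the implementation may
    // ignore it and hand back 4444, 8888, or better.
    var canvas = document.getElementById('c');
    var gl = canvas.getContext('experimental-webgl',
                               { alpha: true, bitDepth: 4 });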
By making it a minimum and only a request, an implementation could ignore it entirely, pick what it knows is optimal, and, most importantly, never unexpectedly degrade the quality of an author's content. If I'm building a photo editor, for example, and requested a minimum of 8 bits per channel, I would rather have context creation fail than get back 565. Because it's a minimum, it also allows implementations, in the possibly-not-too-far future, to use 16- or 32-bit depths if that were more efficient or easier for the browser to work with.
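The photo-editor case could then just treat a null return as "this device can't give me what I need" - again purely a sketch against the proposed attribute, and showUnsupportedMessage() is a made-up app-level fallback:

    // Require at least 8 bits per channel; bail out rather than
    // silently accept a 565 buffer.
    var gl = canvas.getContext('experimental-webgl',
                               { alpha: false, bitDepth: 8 });
    if (!gl) {
      showUnsupportedMessage();  // hypothetical application fallback
    }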