
Re: [Public WebGL] ATC texture size computation



Here's the problem in a nutshell:

You expect application developers to "sort it out". Yet we'll only know how to "sort it out" once we've asked the WebGL context for its supported formats. That means pre-transcoding textures into a zillion different formats, branching dynamically after context creation, and using XHR2 to load the right variant as binary data (see the sketch after the list below). There are several things wrong with that:

1) It makes this really difficult to use.
2) While I'm happy to rely on XHR2 binary loading, there may be people who aren't; besides, mobile support for XHR2 isn't overwhelming.
3) Unlike <audio> and <video>, which support automatic content selection, this leaves format selection entirely up to the developer.
4) If we can't convert an <img> or an array buffer to a texture, we can't even do the "simple" fallback of transcoding textures in JS somehow.
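
To make the branching concrete, here's roughly what that dance looks like in practice. This is only a sketch: the asset URLs, the fixed 512x512 size, and the hard-coded ATC token are placeholders I've made up for illustration; a real loader would read dimensions and mip levels from the file's container header.

function loadBestCompressedTexture(gl: WebGLRenderingContext, onReady: (tex: WebGLTexture) => void) {
  // Ask the context which compressed formats it actually supports.
  const s3tc = gl.getExtension("WEBGL_compressed_texture_s3tc");

  // Branch after context creation: pick the pre-transcoded asset that matches.
  let url: string;
  let format: number;
  if (s3tc) {
    url = "textures/diffuse.dxt1";              // hypothetical S3TC variant
    format = s3tc.COMPRESSED_RGB_S3TC_DXT1_EXT;
  } else {
    url = "textures/diffuse.atc";               // hypothetical ATC variant
    format = 0x8C92;                            // ATC_RGB_AMD token, assumed supported here
  }

  // XHR2 binary load -- the part the developer has to wire up by hand.
  const xhr = new XMLHttpRequest();
  xhr.open("GET", url, true);
  xhr.responseType = "arraybuffer";
  xhr.onload = () => {
    const data = new Uint8Array(xhr.response as ArrayBuffer);
    const tex = gl.createTexture()!;
    gl.bindTexture(gl.TEXTURE_2D, tex);
    // Width/height are hard-coded here; in practice they come from the file header.
    gl.compressedTexImage2D(gl.TEXTURE_2D, 0, format, 512, 512, 0, data);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
    onReady(tex);
  };
  xhr.send();
}

And that's the happy path for two formats; every additional format multiplies the assets you have to ship and the branches you have to test.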

So "suboptimal" only describes this solution in the sense that it is worse than:
- <audio>
- <video>
- CSS Shaders
- vendor prefix hell
- all of the above combined 

On Fri, Sep 7, 2012 at 6:58 PM, Brandon Jones <bajones@google.com> wrote:

On Fri, Sep 7, 2012 at 9:34 AM, Florian Bösch <pyalot@gmail.com> wrote:
But how're you gonna gloss over the fact that the format is not specified, making it impossible to convert an <img> to a texture or a texture back to bytes or an <img>? 

I'll agree that it's a less-than-ideal situation, but I'm not sure that there's a lot of utility in ensuring textures can always be represented in <img> tags. The currently implemented S3TC textures must be manually unpacked and rendered to a canvas or data URL before they can be shown in the DOM, but I've only once seen anybody bother. As long as the hardware can unpack the format properly, I have a hard time seeing why we shouldn't consider exposing it to give us wider hardware coverage. We don't have a whole lot of alternatives if we want compressed texture support, sadly.
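
For reference, that canvas round-trip looks roughly like the sketch below. It assumes the compressed texture has already been uploaded as a WebGLTexture, that the read-back happens in the same frame (or that the context was created with preserveDrawingBuffer: true), and that a trip through a PNG data URL is acceptable; the helper name and shaders are mine, not anything in the spec.

const VS = `
attribute vec2 pos;
varying vec2 uv;
void main() {
  uv = pos * 0.5 + 0.5;
  gl_Position = vec4(pos, 0.0, 1.0);
}`;
const FS = `
precision mediump float;
varying vec2 uv;
uniform sampler2D tex;
void main() {
  gl_FragColor = texture2D(tex, uv);
}`;

function textureToDataURL(canvas: HTMLCanvasElement, gl: WebGLRenderingContext, tex: WebGLTexture): string {
  // Compile a trivial blit program.
  const compile = (type: number, src: string) => {
    const s = gl.createShader(type)!;
    gl.shaderSource(s, src);
    gl.compileShader(s);
    return s;
  };
  const prog = gl.createProgram()!;
  gl.attachShader(prog, compile(gl.VERTEX_SHADER, VS));
  gl.attachShader(prog, compile(gl.FRAGMENT_SHADER, FS));
  gl.linkProgram(prog);
  gl.useProgram(prog);

  // Fullscreen quad covering clip space.
  const buf = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, buf);
  gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([-1, -1, 1, -1, -1, 1, 1, 1]), gl.STATIC_DRAW);
  const loc = gl.getAttribLocation(prog, "pos");
  gl.enableVertexAttribArray(loc);
  gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);

  // Let the GPU decode the compressed texture by drawing it to the canvas.
  gl.activeTexture(gl.TEXTURE0);
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.uniform1i(gl.getUniformLocation(prog, "tex"), 0);
  gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);

  // The canvas now holds the decoded pixels; hand them to the DOM as an <img> source.
  return canvas.toDataURL("image/png");
}

The catch, of course, is that this decompresses on the GPU and re-encodes the result as a PNG, which throws away the benefit of shipping compressed data in the first place; presumably that's why almost nobody bothers.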