
Re: [Public WebGL] ETC texture compression.

Oliver Hunt wrote:
> On Aug 26, 2010, at 5:25 PM, Steve Baker wrote:
>> Chris Marrin wrote:
>>> Assuming the copyright issues for ETC1 get sorted out I think we
>>> should just make it a part of the spec rather than an extension. The
>>> availability of a software decoder would allow it to be implemented on
>>> platforms without ETC1.
>> The problem with resorting to software decoders is that most (if not
>> all) desktop systems don't provide support for ETC1.  You'd package your
>> textures into ETC1, suffer the horrific loss of image quality...and then
>> discover that you're actually not saving any video memory or improving
>> texture cache coherency at all!   ETC1 does help your network bandwidth
>> (the files are 6:1 compressed) - but the average file compression rate I
>> get from PNG's zlib compression is 3.6:1 (averaged over all the maps in
>> my game) - so the network bandwidth savings from ETC are less than a
>> factor of two over PNG.  Given that, I can't imagine many people
>> preferring ETC1 over PNG for desktop systems.
> How does it compare to jpeg?
JPEG is incredibly compact - something like 10:1 on a fairly high
quality setting, maybe even 100:1 on the lower quality settings!  Vastly
more compact than any of the compression schemes designed specifically
for textures.  But despite that, JPEG is a simply awful format for textures!

The problem is that it's based on a human perceptual model that makes
assumptions about the amplitude and frequency response of your eye - it
assumes you're viewing the image square-on, under normal room lighting,
at roughly typical screen resolution, and with the screen's gamma set
correctly.

But none of those special conditions hold for textures - we squash them,
stretch them, MIP them, lighten and darken them.  If you look closely at
a JPEG image you'll see that you tend to get odd random texels that are
wildly "wrong" in hue - bright green or magenta or something.  When
you're viewing under optimal conditions, those colors are displayed at
higher resolution than the color-perception cells in your eye can
resolve, so they blend nicely into an intermediate hue and
brightness.  Its assumption of correct gamma presentation means that it
can shave bits off of some brightness ranges and pack more precision
into others.  But when you stretch and squash and illuminate that, you
get REALLY weird shit coming out of it.
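A toy numeric sketch of that last point (this is illustrative arithmetic, not anything from the JPEG spec - the gamma value, quantization level count, and lighting gain are all made-up parameters): quantize a dark gradient coarsely in gamma (storage) space, then brighten it in linear space, and the quantization steps blow up into visible banding.

```javascript
// Sketch: why coarse quantization in gamma space misbehaves when the
// texture is later brightened by lighting.  All parameters here are
// illustrative, not taken from any real codec.

const GAMMA = 2.2;
const LEVELS = 32;    // coarse quantization of a "perceptually cheap" range
const GAIN = 8;       // strong in-scene lighting boost

const quantize = (v, levels) => Math.round(v * (levels - 1)) / (levels - 1);

// A smooth dark gradient in gamma (storage) space:
const stored = Array.from({ length: 256 }, (_, i) => (i / 255) * 0.25);

// Decode to linear light and apply the lighting gain, with and
// without the quantization step:
const lit = v => Math.min(1, Math.pow(v, GAMMA) * GAIN);
const exact = stored.map(lit);
const banded = stored.map(v => lit(quantize(v, LEVELS)));

// Largest jump between adjacent output values -- i.e. visible banding:
const maxStep = xs =>
  Math.max(...xs.slice(1).map((x, i) => Math.abs(x - xs[i])));

// The quantized-then-brightened gradient has far larger steps:
console.log(maxStep(exact), maxStep(banded));
```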

Lossy "texture" compression systems are careful to avoid such
assumptions - and that's why they can't get to such high densities.

So "Just Say No" to JPEG.
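The ETC1-versus-PNG numbers quoted earlier in the thread can be sanity-checked with a little arithmetic.  The 6:1 and 3.6:1 ratios are the ones from the message; the texture size is an illustrative example, not measured data:

```javascript
// Back-of-envelope check of the bandwidth and memory claims above.

const texels = 1024 * 1024;      // one 1024x1024 texture (illustrative)
const rgbBytes = texels * 3;     // 24-bit RGB, uncompressed

const etc1File = rgbBytes / 6;   // ETC1 is a fixed 6:1 over 24-bit RGB
const pngFile = rgbBytes / 3.6;  // average PNG ratio quoted in the thread

// Network savings of ETC1 over PNG -- "less than a factor of two":
const savings = pngFile / etc1File;          // 6 / 3.6, about 1.67x

// Video memory: ETC1 stays at 4 bits/texel only if the GPU decodes it
// natively; a software decoder expands it to full 32-bit RGBA, so the
// VRAM and texture-cache savings vanish entirely.
const etc1Vram = texels / 2;     // 4 bits per texel
const decodedVram = texels * 4;  // decoded to 32-bit RGBA

console.log(savings, decodedVram / etc1Vram);
```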
>   Given the current  texture loading model is to load from Image (or Canvas) objects you need to transmit your data to the UA in a form the UA understands.
> In all honesty I find myself wondering if the API should simply be something akin to telling the WebGL implementation to use a compressed texture if possible, then leaving it up to the implementation to determine the best format on the current platform.
Yeah - but it's not just a platform decision.  You might choose only to
compress your largest textures (on the grounds that they are 90% of the
problem) - and you certainly don't want to compress normal maps using
ETC1.   There are also many reasons for using textures that are utterly
unrelated to RGB data - or even "texels" in the conventional sense.  For
those kinds of things, you CERTAINLY don't want the hardware messing with
your data.
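A small sketch of why lossy compression is so bad for normal maps in particular (the 4-bit precision here is a made-up stand-in for a lossy codec's quantization, not ETC1's actual block scheme): normals are directions, not colors, so small per-component errors turn into sizable angular errors after renormalization.

```javascript
// Illustrative sketch: coarse per-component quantization, of the sort
// lossy texture codecs apply to color, versus a normal map.  The bit
// depth is a hypothetical stand-in, not any real codec's layout.

const BITS = 4;
const q = v => Math.round(v * (2 ** BITS - 1)) / (2 ** BITS - 1);

// Normal maps store each component of a unit vector remapped to [0,1]:
const encode = n => n.map(c => (c + 1) / 2);
const decode = e => e.map(c => c * 2 - 1);

const normalize = n => {
  const len = Math.hypot(...n);
  return n.map(c => c / len);
};

// A normal tilted slightly away from straight-up (0, 0, 1):
const n = normalize([0.05, 0.02, 1.0]);
const roundTripped = normalize(decode(encode(n).map(q)));

// Angular error in degrees between the original and quantized normal --
// enough to visibly shift specular highlights:
const dot = n.reduce((s, c, i) => s + c * roundTripped[i], 0);
const errDeg = Math.acos(Math.min(1, dot)) * 180 / Math.PI;
console.log(errDeg);
```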

  -- Steve
