On Mon, Oct 24, 2011 at 10:48 AM, Kornmann, Ralf <firstname.lastname@example.org> wrote:
I know that image compression formats can give better compression results, but there is another problem that we already face with our 2D HTML5 Canvas games: the number of files keeps increasing, so we would like to pack assets (geometry data, textures, and whatever else we need) into blobs. I may have overlooked it, but so far I haven't seen an API that allows decoding compressed images delivered as a blob.

That should be possible. Download your data with XMLHttpRequest with responseType set to "arraybuffer". This gives you binary. Split/decompress that binary any way you want into the File API, and use File API URLs with images. (Note: I have not tried it.)
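A minimal sketch of that flow, assuming a hypothetical pack layout where a manifest maps asset names to offset/length pairs inside the downloaded buffer (the pack format and the `sliceAsset` helper are illustrative assumptions, not a standard API):

```javascript
// Sketch only: cut one asset out of a downloaded ArrayBuffer, given a
// hypothetical manifest entry of the form { offset, length }.
function sliceAsset(buffer, entry) {
  return buffer.slice(entry.offset, entry.offset + entry.length);
}

// Browser usage (not runnable outside a browser):
//   var xhr = new XMLHttpRequest();
//   xhr.open("GET", "assets.pack", true);
//   xhr.responseType = "arraybuffer";   // gives us raw binary
//   xhr.onload = function () {
//     var bytes = sliceAsset(xhr.response, manifest["hero.png"]);
//     // Wrap the slice in a Blob and let the browser decode it as an image:
//     var url = URL.createObjectURL(new Blob([bytes], { type: "image/png" }));
//     var img = new Image();
//     img.src = url;
//   };
//   xhr.send();
```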
We already make heavy use of this for JSON and basE91-encoded data. (I know this is not your area, as we are talking about WebGL here.)
Please don't get me wrong on the format side. As I already said, as professional game developers we are very used to working with different hardware that supports different formats. We solve such problems with asset pipelines, so we would have no problem generating assets for different formats if necessary.
Based on my DXT experience, artists are in general not very happy with the results of runtime compression. They prefer to do it offline with heavier calculations that produce better results. Besides this, runtime compression increases the startup time: maybe not the first time, when we still need to fetch data from the network, but when you play again from the cache.

If there were in-browser compression, the results would be cached, or at least possible to cache, so there wouldn't be a startup cost the second time.
In the web-game business it is quite common for people to play games multiple times if they like them. We have optimized our file handling to the point that we can reach a 100% local file cache hit rate for static data without a single request to the CDN. OK, this is no magic, but we have seen many websites that don't care about this area at all. This startup-time problem might occur with shaders, too, but that is another thing to consider when moving the standard to the next level.
So what we need is just a list of supported formats that we can use to load the correct asset from the CDN. If we don't have something that matches, we can raise an alert in the system that we need to extend the asset pipeline, and go with uncompressed assets until that is done. That's what I was thinking of when I called it generic. Maybe you could just forward what the driver reports as supported.
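One way this could look in practice, sketched under assumptions: the extension strings below are real WebGL compressed-texture extension names, but the variant naming and the CDN URL scheme are hypothetical, and real code would also have to handle vendor-prefixed extension names:

```javascript
// Sketch only: map what the driver reports to an asset variant on the CDN.
// The fallback chain here (DXT, then ETC1, then uncompressed) is just an
// illustration of the idea, not a recommendation.
function pickAssetVariant(supportedExtensions) {
  if (supportedExtensions.indexOf("WEBGL_compressed_texture_s3tc") !== -1)
    return "dxt";
  if (supportedExtensions.indexOf("WEBGL_compressed_texture_etc1") !== -1)
    return "etc1";
  return "rgba"; // nothing matched: fall back to an uncompressed asset
}

// Browser usage (not runnable outside a browser):
//   var gl = canvas.getContext("experimental-webgl");
//   var variant = pickAssetVariant(gl.getSupportedExtensions());
//   var url = "http://cdn.example.com/textures/hero." + variant;
```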
PS: this is just a game developer's view. I am pretty sure that people doing other kinds of websites/applications may have other requests.

Just FYI, I'm not arguing against getting a list of supported formats. In fact, considering the options, I personally feel that's the best way forward. I just wanted to point out that there are solutions to some of the issues you brought up that are not really arguments for or against the various texture compression options.
From: Chris Marrin [email@example.com]
Sent: Monday, 24 October 2011 18:59
An: Kornmann, Ralf
Cc: Gregg Tavares (wrk); public webgl
Subject: Re: [Public WebGL] Texture Compression in WebGL
On Oct 23, 2011, at 12:30 AM, Kornmann, Ralf wrote:
Thank you for this opportunity. While evaluating different techniques for bringing 3D games to the browser, the missing texture compression support was one of the cons for WebGL. There are three primary reasons why we would like to see compression support:

1. Reducing the download time for players. We still have many customers with slower (~1 MBit/s) connections.
2. Reducing the bandwidth cost. This may not be a problem for many sites today, but if we deliver game content to millions of people it requires quite an amount of bandwidth, which is not free.

But ETC texture compression only gives you 6x. You can easily beat that with high quality using JPEG. Downloading JPEG (and PNG and GIF) compressed images is supported today.
3. Improving performance for people with lower-class GFX hardware. To be honest, I am not sure whether we will run into memory limitations before we hit the limit of the script engine, but since script engines improve faster than the hardware, it is likely.

I understand that this is the biggest issue for some hardware. The problem I see is that I don't think there is any texture compression format that is supported universally.
Therefore I would like to see two features for compressed texture support:

1. Allow downloading data for the different compression methods separately. We already can (and need to) handle this for different sound formats, so there is no magic behind it. A combined storage for all formats could be fine for people who don't care that much about bandwidth, but as said, we prefer to transfer only data to the client that can be used there.
2. Allow separate transfer of the different mip levels. This way we could stream the lower resolutions first.

Aside from the compression issue, you can download different mip levels separately today.
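A rough sketch of per-level uploads, under assumptions: the `mipLevelSize` helper just computes each level's dimensions, while `fetchLevel` and the upload loop are assumed application code, not a standard API, and a real streaming scheme would still have to deal with texture-completeness rules while levels are missing:

```javascript
// Sketch only: dimensions of mip level `level` for a texture whose base
// level is `baseSize` pixels on a side (halved per level, clamped at 1).
function mipLevelSize(baseSize, level) {
  return Math.max(1, baseSize >> level);
}

// Browser usage (not runnable outside a browser): upload each level with its
// own texImage2D call as its data arrives, smallest levels first, so a
// low-resolution version is available while the rest still downloads.
//   for (var level = maxLevel; level >= 0; level--) {
//     var size = mipLevelSize(256, level);
//     gl.texImage2D(gl.TEXTURE_2D, level, gl.RGBA, size, size, 0,
//                   gl.RGBA, gl.UNSIGNED_BYTE, fetchLevel(level));
//   }
```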
3. Make it somewhat generic. I think the standard doesn't need to list the supported formats itself; just a way to tell the application what is supported would be enough. This way there would be no need to update the standard if a new format shows up.

There's an issue we've discussed before, but I don't think we had the benefit of game developer input. Would it be reasonable to add a hint that would say "please use texture compression if available, and I understand this may result in somewhat lower quality or performance issues for the initial runtime compression"? Or is the actual compression of the texture too finely tuned for a runtime algorithm to do a reasonable job? Because if we were able to do that, we could use the texture compression capabilities of the hardware without the author needing to deal with the details.