Re: [Public WebGL] Re: WebGL spec clarifications
- To: Chris Marrin <email@example.com>
- Subject: Re: [Public WebGL] Re: WebGL spec clarifications
- From: Cedric Vivier <firstname.lastname@example.org>
- Date: Tue, 16 Nov 2010 10:13:49 +0800
- Cc: public webgl <email@example.com>, "Ligon, David" <firstname.lastname@example.org>
- In-reply-to: <D5AEB09F-82BF-4053-9640-3ABEA70005E2@apple.com>
- References: <AANLkTi=ct1dm1GnkQD3GBL1ZkGSQkNqOFDy1Jy74f+zV@mail.gmail.com> <22D217E8-EB35-4EE1-A5C9-D30F998722AA@apple.com> <DE2360FAC5837B49A7168D259F83B60B548EE610A1@NASANEXMB05.na.qualcomm.com> <D5AEB09F-82BF-4053-9640-3ABEA70005E2@apple.com>
- Sender: email@example.com
On Tue, Nov 16, 2010 at 09:14, Chris Marrin <firstname.lastname@example.org> wrote:
> My comment is, yes, this is an inconsistency in the spec. As of now we don't support any compressed texture formats.
The stock OpenGL ES 2.0 spec doesn't mandate any compressed texture
formats either... It just provides the entry points that
implementations can use to support one or more compressed formats.
>We have discussed supporting ETC1 (and maybe ETC2?) since it seems like they may be moving in the direction of being IP free. But for now I believe we plan to ship WebGL 1.0 without any specified support for compressed textures.
Yes, so we should probably remove these enums in 1.0.
> There are really a couple of issues here. One is that it is currently not easy to download binary data, so it's difficult to get compressed image data into the system. This is being solved by making it possible to get an ArrayBuffer (our binary data object) back from XMLHttpRequest (HTML's generic data fetching mechanism).
This is a nice incremental improvement, but there are already plenty
of examples of downloading binary data (eg. for meshes) that actually
run quite well.
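To make the XHR-plus-ArrayBuffer path concrete, here is a minimal sketch; the URL handling, the header layout, and the helper names are illustrative assumptions, not part of any spec. It requests binary data with responseType 'arraybuffer' (a browser API) and reads the result through a DataView:

```javascript
// Sketch only: assumes an XMLHttpRequest implementation that supports
// responseType = 'arraybuffer' (i.e. a browser environment).
function fetchBinary(url, onLoad) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', url, true);
  xhr.responseType = 'arraybuffer';
  xhr.onload = function () { onLoad(xhr.response); }; // ArrayBuffer
  xhr.send();
}

// Example parser for a hypothetical little-endian mesh header:
// [uint32 vertexCount][uint32 indexCount], followed by the payload.
function parseMeshHeader(buffer) {
  var view = new DataView(buffer);
  return {
    vertexCount: view.getUint32(0, true), // true = little-endian
    indexCount: view.getUint32(4, true)
  };
}
```

The parsed counts would then drive how many bytes to slice out of the buffer for vertex and index data before handing them to bufferData.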
> The other question is, when we can get binary data in and when we decide on the compressed format(s) to support, what calls do we make? My preference, and I believe the solution we have discussed, is to use the current texImage2D call.
I do not see any reason to diverge from OpenGL here. A simple
getExtension("some_format") test should be enough. The real issue is
whether we want to make a policy decision in order to enforce
compatibility, by waiting for an IP-safe format that is actually in
use everywhere and that can be decompressed by the browser on
non-supporting GPUs (which would make such a format *useless*
compared to, say, a PNG)... or just follow OpenGL's way, which
incidentally is the same choice that was made for <video>: keeping
the format choice open, as there are always new possibilities. In the
case of WebGL it is actually less of a problem, because the
'uncompressed' (eg. PNG) textures are always available and usable by
developers.
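As a sketch of the getExtension approach above (the extension names here are hypothetical placeholders, since no compressed format is actually specified yet):

```javascript
// Sketch: probe a list of (hypothetical) compressed-texture extension
// names in preference order; return the first one the implementation
// supports, or null so the caller can fall back to uncompressed data.
function pickCompressedFormat(gl, candidates) {
  for (var i = 0; i < candidates.length; i++) {
    var ext = gl.getExtension(candidates[i]);
    if (ext) {
      return { name: candidates[i], ext: ext };
    }
  }
  return null; // caller falls back to eg. PNG via texImage2D
}
```

A caller would then use whatever format enums the returned extension object exposes when uploading the compressed data, and upload an ordinary PNG through texImage2D when null comes back.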
You are currently subscribed to email@example.com.
To unsubscribe, send an email to firstname.lastname@example.org with
the following command in the body of your email: