Re: [Public WebGL] Adding internalformat param to all texImage2d variants
On 18 May 2010, at 23:28, Chris Marrin wrote:
> On May 18, 2010, at 1:31 PM, Johannes Behr wrote:
>>> ...In thinking about this some more, I agree with Cedric and Gregg that returning the actual format is not really necessary. If we're able to support gl.NONE (or some custom enum as Mark suggests) as an internalformat and it loads the image in the original source format, I think that satisfies the requirements of X3DOM. So I think we should eliminate getTexLevelParameter() and just add internalformat to texImage2D() and add an enum which allows us to add the image in the original format.
>> The difference is that we can no longer query the format but must supply the "internalFormat" with every texture. Correct?
> I don't think it's any different. You would always pass NO_CONVERSION (or whatever we end up calling it), which would use whatever was the input format.
But how could we know what the format was?
> I think that would do the same thing that X3D does today. There's no additional X3D requirement to know what the format is, right?
Again: the lighting model works differently depending on whether the input format is e.g. L/LA or RGB/RGBA.
We can easily live with everything being converted to RGB(A), as long as we know what the original
format was so we can select the right shader.
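To make the point concrete, a minimal sketch of what X3DOM needs here: select a lighting shader variant from the texture's original source format. The enum values are the standard GL constants; the helper name and shader names are hypothetical, not X3DOM's actual API:

```javascript
// Standard GL format enums (values from the OpenGL ES 2.0 spec).
const GL_LUMINANCE       = 0x1909;
const GL_LUMINANCE_ALPHA = 0x190A;
const GL_RGB             = 0x1907;
const GL_RGBA            = 0x1908;

// Hypothetical helper: map the image's original format to a shader variant.
// This is the query that goes away if the format cannot be retrieved.
function pickLightingShader(sourceFormat) {
  switch (sourceFormat) {
    case GL_LUMINANCE:
    case GL_LUMINANCE_ALPHA:
      // Grayscale input: treat the single channel as intensity.
      return "lighting-luminance";
    case GL_RGB:
    case GL_RGBA:
      return "lighting-rgb";
    default:
      throw new Error("unknown source format: 0x" + sourceFormat.toString(16));
  }
}
```

Even if the driver stores everything as RGB(A) internally, this selection only needs the *source* format, which is why some way to learn it (a query, or a known-at-upload format) matters.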
Dr. Johannes Behr
Leiter Projektbereich VR
Fraunhofer-Institut für Graphische Datenverarbeitung IGD
Fraunhoferstr. 5 | 64283 Darmstadt | Germany
Tel +49 6151 155-510 | Fax +49 6151 155-196
email@example.com | www.igd.fraunhofer.de