format, type, and data specify the format of the image data. Textures with integer internal formats (see table 8.12) require integer data. An INVALID_OPERATION error is generated if the internal format is integer and format is not one of the integer formats listed in table 8.3, or if the internal format is not integer and format is an integer format.
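For concreteness, here is a minimal sketch of that rule in WebGL 2 terms, which adopted the ES 3.0 behavior quoted above (the canvas variable and the placeholder pixel data are assumptions for illustration):

const gl = canvas.getContext('webgl2');
const tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);

// Legal: integer internalformat (RGBA8UI) paired with an integer format (RGBA_INTEGER).
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA8UI, 1, 1, 0, gl.RGBA_INTEGER, gl.UNSIGNED_BYTE, new Uint8Array(4));

// Generates INVALID_OPERATION: integer internalformat paired with the non-integer format RGBA.
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA8UI, 1, 1, 0, gl.RGBA, gl.UNSIGNED_BYTE, new Uint8Array(4));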
At least we agree about something.

On 2012/11/16 12:40, Gregg Tavares (work) wrote:
I'm not following you.

type + format = the format of the data I'm passing to texImage2D
internal_format = the format I want the driver to store the texture in, except that in ES, type also influences that storage.

No disagreement here either. My point is that type influences the storage format only because ES essentially uses the external format as the storage format.
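A minimal WebGL 1.0 sketch of that point, assuming a gl context and placeholder data sizes: both calls use the same unsized internalformat, but the type determines the effective storage format, nominally RGBA8 for the first call and RGBA4 for the second (a driver may store at higher precision):

gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 1, 1, 0, gl.RGBA, gl.UNSIGNED_BYTE, new Uint8Array(4));            // nominally stored as RGBA8
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 1, 1, 0, gl.RGBA, gl.UNSIGNED_SHORT_4_4_4_4, new Uint16Array(1));  // nominally stored as RGBA4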
This is an unrenderable texture according to ES:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 2, 2, 0, GL_RGBA, GL_UNSIGNED_BYTE, dataFor4RGBA8Pixels);          // level 0: 2x2, UNSIGNED_BYTE
glTexImage2D(GL_TEXTURE_2D, 1, GL_RGBA, 1, 1, 0, GL_RGBA, GL_UNSIGNED_SHORT_4_4_4_4, dataFor1RGBA4Pixel);  // level 1: 1x1, UNSIGNED_SHORT_4_4_4_4
The types are different, therefore the mips don't match, therefore the texture is not "texture complete" and will not render, per the spec.
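For contrast, a sketch of the same chain made texture complete by using a single type for both levels (dataFor1RGBA8Pixel is a placeholder name, not from the original example):

gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 2, 2, 0, gl.RGBA, gl.UNSIGNED_BYTE, dataFor4RGBA8Pixels);  // level 0: 2x2 RGBA8
gl.texImage2D(gl.TEXTURE_2D, 1, gl.RGBA, 1, 1, 0, gl.RGBA, gl.UNSIGNED_BYTE, dataFor1RGBA8Pixel);   // level 1: 1x1 RGBA8, same format/type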
If WebGL implementations are allowed to choose any random format to upload an image, then it would be impossible to specify the second mip level:
gl.texImage2D(GL_TEXTURE_2D, 0, GL_RGBA, image);  // implementation chooses the type
gl.texImage2D(GL_TEXTURE_2D, 1, GL_RGBA, 1, 1, 0, GL_RGBA, unknown_type!!!!!, dataForPixels);
This would also fail:
gl.texImage2D(GL_TEXTURE_2D, 0, GL_RGBA, image);  // implementation chooses the type
gl.texSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 1, 1, GL_RGBA, unknown_type!!!!, dataForPixels);
What type is the developer supposed to supply for this second call to gl.texImage2D or to gl.texSubImage2D? If the implementation chose the type, the developer wouldn't know what type WebGL supplied. They could maybe guess what type WebGL chose if they supplied the image themselves, but the image could be user-supplied, and then the developer would have no way of knowing what type WebGL created.

What I am suggesting is something like this:
gl.texImage2D(GL_TEXTURE_2D, 0, GL_RGBA8888, /* no <format>, no <type>, */ image);
gl.texImage2D(GL_TEXTURE_2D, 1, GL_RGBA8888, 1, 1, 0, GL_RGBA, GL_UNSIGNED_BYTE, dataForPixels);
gl.texImage2D(GL_TEXTURE_2D, 0, GL_RGBA8888, /* no format param, no type param, */ image);
gl.texSubImage2D(GL_TEXTURE_2D, 1, 0, 0, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, dataForPixels);
There would be a table that gives the valid combinations of <format>, <type> and <internalformat>, and another table giving the valid <internalformat> values that can be used with ImageData and HTML{Image,Canvas,Video}Element. <format> and <type> don't serve any purpose for these elements when internalformat controls the storage format.

The definition of texSubImage2D would have to be modified to say that <format> and <type> must make a valid combination with the <internalformat> specified for the original texture.
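As a sketch only, the two tables and the texSubImage2D check might look something like this inside an implementation (everything here is hypothetical, including the RGBA8888-style names carried over from the suggestion above):

// Hypothetical table: valid <format>/<type> pairs per <internalformat>.
const validCombos = {
  RGBA8888: [['RGBA', 'UNSIGNED_BYTE']],
  RGBA4444: [['RGBA', 'UNSIGNED_SHORT_4_4_4_4']],
  RGB565:   [['RGB', 'UNSIGNED_SHORT_5_6_5']],
};

// Hypothetical table: <internalformat> values accepted for ImageData and
// HTML{Image,Canvas,Video}Element uploads; no <format>/<type> is taken.
const validForDomSources = ['RGBA8888', 'RGBA4444', 'RGB565'];

// texSubImage2D would check <format>/<type> against the <internalformat>
// recorded when the level was created by texImage2D.
function texSubImage2DAllowed(recordedInternalformat, format, type) {
  const combos = validCombos[recordedInternalformat] || [];
  return combos.some(([f, t]) => f === format && t === type);
}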
Yes, I realize this would be a big difference from ES 2.0 and therefore likely unacceptable. I'm looking ahead.
Regards
-Mark