
Re: [Public WebGL] Moving float color buffer proposals to draft (was Re: Fixing the OES_texture_float mess)



The OpenGL specification (4.2 core) states in section 8.5:

format, type, and data specify the format of the image data
 
and it also states that

Textures with integer internal formats (see table 8.12) require integer data.
An INVALID_OPERATION error is generated if the internal format is integer
and format is not one of the integer formats listed in table 8.3, or if the internal
format is not integer and format is an integer format.

The OpenGL ES 2.0 specification seems to skip over any real description of internalformat, format, and type, except that section 3.7.1 states that no conversion is performed and that the channels of format and internalformat have to match up.
The OpenGL extension http://www.opengl.org/registry/specs/ARB/texture_float.txt specifies new internalformat types (such as RGBA32F_ARB).
The OpenGL ES extension http://www.khronos.org/registry/gles/extensions/OES/OES_texture_float.txt only introduces two new types (FLOAT and HALF_FLOAT) but no new internal format.
The OpenGL ES extension http://www.khronos.org/registry/gles/extensions/EXT/EXT_color_buffer_half_float.txt does not introduce a new internalformat for textures (only for renderbuffers).

For OpenGL, the type parameter is superfluous because it *has* to match the internal format's type, and the data buffer *has* to match the type.
For OpenGL ES, the type parameter is the only way to completely "specify" an internal format, and the data (presumably) equally *has* to match the type.
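
To make that concrete, here is a minimal WebGL 1 sketch (the buffer sizes are illustrative) of how the same internalformat ends up in different storage depending only on type:

  // internalformat RGBA + type UNSIGNED_BYTE -> effectively RGBA8 storage
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 2, 2, 0,
                gl.RGBA, gl.UNSIGNED_BYTE, new Uint8Array(2 * 2 * 4));

  // internalformat RGBA + type UNSIGNED_SHORT_4_4_4_4 -> effectively RGBA4 storage
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 2, 2, 0,
                gl.RGBA, gl.UNSIGNED_SHORT_4_4_4_4, new Uint16Array(2 * 2));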

Therefore, since WebGL is modelled after OpenGL ES:
1) It should not be possible to create a floating point texture by passing an unsigned byte array (which by extension rules out <img> objects).
2) The only way to specify a floating point texture is with the second flavor of the call, the one that accepts a type (see the sketch after this list).
3) WebGL's current interpretation of the ES spec is correct.
4) WebGL's OES_texture_float is correct.
5) GL_RGBA32F and the like are desktop OpenGL terminology that does not apply to OpenGL ES, and introducing new internal format types is supported neither by the ES spec nor by any extension.
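
As a minimal sketch of point 2, assuming OES_texture_float is available, the only spelling that can produce a float texture is:

  var ext = gl.getExtension('OES_texture_float');
  if (ext) {
    // Only the flavor that takes an explicit type can select float storage;
    // an <img> upload (UNSIGNED_BYTE data) can never produce this.
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 2, 2, 0,
                  gl.RGBA, gl.FLOAT, new Float32Array(2 * 2 * 4));
  }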




On Fri, Nov 16, 2012 at 10:34 AM, Mark Callow <callow.mark@artspark.co.jp> wrote:

On 2012/11/16 12:40, Gregg Tavares (社用) wrote:

I'm not following you

type + format = the format of the data I'm passing to texImage2D
internal_format = the format I want the driver to use to store the texture, except that in ES type also influences that storage
At least we agree about something.


This is an unrenderable texture according to ES

  glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 2, 2, 0, GL_RGBA, GL_UNSIGNED_BYTE, dataFor4RGBA8Pixels);
  glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1, 1, 0, GL_RGBA, GL_UNSIGNED_SHORT_4_4_4_4, dataFor1RGBA4Pixel);

The types are different, therefore the mips don't match, therefore the texture is not "texture complete" and will not render. From the spec:
No disagreement here either. My point is that type influences the storage format only because ES essentially uses the external format as the storage format.



If WebGL implementations are allowed to choose any random format to upload an image, then it would be impossible to specify the second level:

  gl.texImage2D(GL_TEXTURE_2D, 0, GL_RGBA, image)   // implementation chooses the type.
  gl.texImage2D(GL_TEXTURE_2D, 1, GL_RGBA, 1, 1, 0, GL_RGBA, unknown_type!!!!!, dataForPixels);

This would also fail

  gl.texImage2D(GL_TEXTURE_2D, 0, GL_RGBA, image)   // implementation chooses the type.
  gl.texSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 1, 1, GL_RGBA, unknown_type!!!!, dataForPixels);

What type is the developer supposed to supply for this second call to gl.texImage2D or to gl.texSubImage2D? If the implementation chose the type, the developer wouldn't know what type WebGL supplied. They could maybe guess when they supplied the image themselves, but the image could be user supplied, and then the developer would have no way of knowing what WebGL created.
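
In the current API the caller always passes format and type explicitly, so the later texSubImage2D call is never a guess. A minimal sketch (image and otherImage stand in for any HTMLImageElement):

  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA,
                gl.RGBA, gl.UNSIGNED_BYTE, image);   // caller picks the type
  // ...later the same format/type pair is known, so this is well defined:
  gl.texSubImage2D(gl.TEXTURE_2D, 0, 0, 0,
                   gl.RGBA, gl.UNSIGNED_BYTE, otherImage);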
What I am suggesting is something like this:

gl.texImage2D(GL_TEXTURE_2D, 0, GL_RGBA8888, /* no <format>, no <type>, */ image);
gl.texImage2D(GL_TEXTURE_2D, 1, GL_RGBA8888, 1, 1, 0, GL_RGBA, GL_UNSIGNED_BYTE, dataForPixels);

gl.texImage2D(GL_TEXTURE_2D, 0, GL_RGBA8888, /* no format param, no type param, */ image);
gl.texSubImage2D(GL_TEXTURE_2D, 1, 0, 0, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, dataForPixels);

There would be a table that gives you the valid combinations of <format>, <type> and <internalformat> and another table giving the valid <internalformat> values that can be used with ImageData and HTML{Image,Canvas,Video}Element. <format> and <type> don't serve any purpose for these elements when internalformat controls the storage format.

The definition of texSubImage2D would have to be modified to say format and type must make a valid combination with the internalformat specified for the original texture.
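
A sketch of what such a table and check might look like (the internalformat names and the helper here are hypothetical, illustrating the proposal, not any existing API):

  // Hypothetical table: each internalformat lists the format/type
  // pairs that texSubImage2D would accept for it.
  var VALID_COMBINATIONS = {
    RGBA8888: [{ format: 'RGBA', type: 'UNSIGNED_BYTE' }],
    RGBA4444: [{ format: 'RGBA', type: 'UNSIGNED_SHORT_4_4_4_4' }],
    RGBA32F:  [{ format: 'RGBA', type: 'FLOAT' }]
  };

  // texSubImage2D would validate against the texture's stored
  // internalformat instead of trusting the call in isolation.
  function isValidCombination(internalformat, format, type) {
    return (VALID_COMBINATIONS[internalformat] || []).some(function (e) {
      return e.format === format && e.type === type;
    });
  }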

Yes, I realize this would be a big difference from ES 2.0 and therefore likely unacceptable. I'm looking ahead.

Regards

    -Mark
