
Re: [Public WebGL] Moving float color buffer proposals to draft (was Re: Fixing the OES_texture_float mess)



Surely the ES specification mirrors this, but the actual internal format for floating-point textures is GL_RGBA32F_EXT, and *that* is what determines the internal format. The type parameter and the format parameter describe the input data.
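For context, a minimal sketch of the upload being discussed (variable names are hypothetical): under OES_texture_float a WebGL 1 application passes type = FLOAT to describe the data it supplies, and the driver stores the texture in a float format such as GL_RGBA32F_EXT:

  var ext = gl.getExtension('OES_texture_float');         // enables FLOAT as a texture type
  var data = new Float32Array(256 * 256 * 4);             // 4 floats per pixel = 16 bytes per pixel
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 256, 256, 0,   // internalformat must equal format in ES 2.0 / WebGL 1
                gl.RGBA, gl.FLOAT, data);                  // format + type describe the supplied data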


On Fri, Nov 16, 2012 at 4:40 AM, Gregg Tavares (社用) <gman@google.com> wrote:



On Fri, Nov 16, 2012 at 11:57 AM, Mark Callow <callow.mark@artspark.co.jp> wrote:

This discussion is rather pointless as it is too late to change this version of WebGL but it is important that people understand the intent of the API design to prevent cock-ups like this in the future.

There is no cockup. The way it is is the only way it can possibly work.
 

In OpenGL {,ES} textures have an external and an internal format. The format & type parameters specify the external format. The internalformat parameter specifies the internal format. In ES, the external and internal formats are the same, modulo allowable precision changes, which I don't think any recent implementation makes. The reason the type parameter influences the internal format in ES is that the internal format is the external format.

When I wrote "GL_UNSIGNED_SHORT_4_4_4_4 and GL_RGBA4 are the same format" I was referring to the memory layout not the enum token values and that is also what I refer to when I say the internal and external formats are the same.
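To make the memory-layout point concrete, a minimal sketch (the data arrays are hypothetical, and think of the two calls as targeting two separate textures): both pass the same unsized internalformat, GL_RGBA, and it is the type that selects the actual storage layout, which is why the internal format effectively equals the external one in ES:

  var byteData  = new Uint8Array(4 * 4 * 4);   // RGBA8888 source data
  var shortData = new Uint16Array(4 * 4);      // RGBA4444 source data
  // Same internalformat enum in both calls; the type picks the layout.
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 4, 4, 0, gl.RGBA, gl.UNSIGNED_BYTE, byteData);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 4, 4, 0, gl.RGBA, gl.UNSIGNED_SHORT_4_4_4_4, shortData);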


On 2012/11/15 18:46, Gregg Tavares (社用) wrote:

In ES if you specified

texImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 256, 256, 0, GL_RGBA, GL_FLOAT, data)

where data points to unsigned byte data (which is in effect what WebGL's OES_texture_float is telling the app to do)

No it's not. The format and type parameters tell it what kind of data you are supplying. WebGL will supply that data. If you tell it format = GL_RGBA, type = GL_FLOAT, WebGL will supply GL_RGBA, GL_FLOAT data (16 bytes per pixel).
You are viewing this from the perspective of a WebGL implementer. You need to pass type=FLOAT to ES because that is the external format of the data you are providing to ES.

From the perspective of a WebGL application programmer, the external format is whatever the format of the HTML element is (most likely RGB888 or RGBA8888) and the internal format he/she wants is FLOAT, but, according to the real meaning of the type parameter, OES_texture_float is telling him/her to misclassify his/her external data as FLOAT. Hence the example I gave above.


I'm not following you

type + format = the format of the data I'm passing to texImage2D
internal_format = the format I want the driver to use to store the texture, except that in ES the type also influences that storage

This is an unrenderable texture according to ES

  glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 2, 2, 0, GL_RGBA, GL_UNSIGNED_BYTE, dataFor4RGBA8Pixels);
  glTexImage2D(GL_TEXTURE_2D, 1, GL_RGBA, 1, 1, 0, GL_RGBA, GL_UNSIGNED_SHORT_4_4_4_4, dataFor1RGBA4Pixel);

The types are different, therefore the mips don't match, therefore the texture is not "texture complete" and will not render. From the spec:

3.7.10

A two-dimensional texture is complete if the following conditions all hold true:

* The set of mipmap arrays zero through q (where q is defined in the Mipmapping discussion of section 3.7.7) were each specified with the same format, internal format, and type.

...
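For contrast, a minimal sketch of a version that is texture complete, because every level is specified with the same format and type (the pixel-data variables follow the naming of the example above):

  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 2, 2, 0, gl.RGBA, gl.UNSIGNED_BYTE, dataFor4RGBA8Pixels);
  gl.texImage2D(gl.TEXTURE_2D, 1, gl.RGBA, 1, 1, 0, gl.RGBA, gl.UNSIGNED_BYTE, dataFor1RGBA8Pixel);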

If WebGL implementations were allowed to choose any random format to upload an image then it would be impossible to specify the second level

  gl.texImage2D(GL_TEXTURE_2D, 0, GL_RGBA, image)   // implementation chooses the type.
  gl.texImage2D(GL_TEXTURE_2D, 1, GL_RGBA, 1, 1, 0, GL_RGBA, unknown_type!!!!!, dataForPixels);

This would also fail

  gl.texImage2D(GL_TEXTURE_2D, 0, GL_RGBA, image)   // implementation chooses the type.
  gl.texSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 1, 1, GL_RGBA, unknown_type!!!!, dataForPixels);

What type is the developer supposed to supply for this second call to gl.texImage2D or to gl.texSubImage2D? If the implementation chose the type, the developer wouldn't know what type WebGL supplied. They could maybe guess what WebGL chose if they supplied the image themselves, but the image could be user-supplied and the developer would have no way of knowing what WebGL created.
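In other words, the only workable pattern is the current one, where the application itself names the format and type and can therefore keep every level and every subimage consistent. A minimal sketch (the image and pixel-data variables are hypothetical; the image is assumed to be 2x2):

  // The application states format and type for the element upload...
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
  // ...so it knows exactly what to pass for the other levels and for texSubImage2D.
  gl.texImage2D(gl.TEXTURE_2D, 1, gl.RGBA, 1, 1, 0, gl.RGBA, gl.UNSIGNED_BYTE, dataForPixels);
  gl.texSubImage2D(gl.TEXTURE_2D, 0, 0, 0, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, dataForPixels);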
 

The texImage2D commands that take elements should not even have a type parameter.

Again, it's impossible to use the API correctly without the type parameter. See above.

It is unnecessary as the element has a known (to the WebGL implementation) external format. internalformat should be used to tell WebGL what internal format the application would like used to store the resulting texture. That is its purpose in GL and ES. See OES_required_internalformat or the OpenGL ES 3.0 specification.
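For reference, a sketch of the ES 3.0-style usage being described here, where a sized internalformat is what selects the storage (written in the later WebGL 2 / ES 3.0 form, not something a WebGL 1.0 implementation accepts; the image variable is hypothetical):

  // The sized internalformat RGBA32F selects float storage on its own;
  // format and type still describe the source data the implementation already knows.
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA32F, gl.RGBA, gl.FLOAT, image);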

Regards

    -Mark
