
Re: [Public WebGL] Moving float color buffer proposals to draft (was Re: Fixing the OES_texture_float mess)

On 2012/11/15 16:05, Gregg Tavares (社用) wrote:

I just noticed the following in OES_texture{_half,}_float ...

The texImage2D and texSubImage2D entry points taking ImageData, HTMLImageElement, HTMLCanvasElement and HTMLVideoElement are extended to accept the pixel type FLOAT

What is this supposed to do? The type parameter is used to tell the implementation what type of data is being passed in.

The type parameter tells WebGL what format to convert the ImageData/HTMLImageElement/HTMLCanvasElement/HTMLVideoElement into before calling texImage2D. Allowing GL_FLOAT there means you can upload images into float textures (e.g. so you can edit them with floating-point precision).
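The conversion described above can be sketched in plain JavaScript. This is an illustration of what an implementation effectively does with an 8-bit image source when type == FLOAT, not actual browser code; imagePixelsToFloat is an illustrative name:

```javascript
// Sketch: when type == FLOAT is passed with an 8-bit image source, the
// implementation converts each channel to a normalized float before the
// underlying texImage2D upload. (Illustrative helper, not part of any spec.)
function imagePixelsToFloat(bytes) {
  const out = new Float32Array(bytes.length);
  for (let i = 0; i < bytes.length; i++) {
    out[i] = bytes[i] / 255; // normalize 0..255 to 0.0..1.0
  }
  return out;
}
```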
This is abusing the type parameter and will cause any experienced GL hand to say "what the f*%&?" as I did. It is the purpose of the internalformat parameter to tell the GL what format you want the data converted to.

Safari actually supports loading FLOAT TIFF files in image tags, so arguably it should allow uploading those directly into FLOAT textures without loss of data. Maybe you should add language to that effect to the spec, like the main spec has for 8-bit lossless formats. I'm happy to add a test.
There should actually not be a type parameter for the texImage2D commands that take an HTML element. The type is determined by data contained in the element. There is no need for the programmer to specify it separately.

If the intention is for the implementation to convert the incoming data to {half-,}float then what is needed is for the internalformat parameter to accept one of the tokens from the new extensions: RGBA16F, RGBA32F etc.

Why? ES 2.0 doesn't specify stuff that way.
Because that is the purpose of the internalformat parameter since the year dot. Misusing type for this gives a GL programmer a headache.

Can we rename the parameter to internalformat? As I said, these commands do not need a type parameter and renaming it will not break any existing applications.

It's broken, but ES 2.0 infers the internal format from the combination of format and type.

ES 2.0 requires format == internalformat because we did not want to have to generate additional components (e.g. format = LUMINANCE, internalformat = RGBA), and because we wanted the implementation to have flexibility in choosing the internal format. Yes, it has issues, which is why there is now an OES_sized_internal_formats extension that lets you specify the internal format via the internalformat parameter and has a set of required formats. You still can't use combinations that would require generating components, though.
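The format/type-to-sized-format relationship that ES 3.0 and OES_sized_internal_formats make explicit can be illustrated with a small lookup. The table below is an assumed subset, and sizedInternalFormat is an illustrative helper, not a real API:

```javascript
// Illustrative subset of how an unsized format/type pair maps to an
// ES 3.0-style sized internal format. Not a real API; combinations that
// would require generating components are simply absent from the table.
const SIZED_FORMAT = {
  'RGBA/UNSIGNED_BYTE': 'RGBA8',
  'RGBA/HALF_FLOAT':    'RGBA16F',
  'RGBA/FLOAT':         'RGBA32F',
  'RGB/UNSIGNED_BYTE':  'RGB8',
};

function sizedInternalFormat(format, type) {
  const key = `${format}/${type}`;
  if (!(key in SIZED_FORMAT)) {
    throw new Error(`unsupported format/type combination: ${key}`);
  }
  return SIZED_FORMAT[key];
}
```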

So in WebGL, 
     texImage2D(...., GL_RGBA, GL_RGBA, GL_FLOAT, image)  

Says: take the data in 'image', convert it to RGBA32F, then create an RGBA32F texture. There are tests for this.
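In context, that call might look like the sketch below. uploadFloatTexture is an illustrative helper (not part of any spec); it only assumes the OES_texture_float extension and the standard WebGL 1 DOM overload of texImage2D:

```javascript
// Sketch: uploading an image element into a float texture under
// OES_texture_float, using the extended texImage2D signature under
// discussion. (Illustrative helper, not part of any spec.)
function uploadFloatTexture(gl, image) {
  if (!gl.getExtension('OES_texture_float')) {
    throw new Error('OES_texture_float not supported');
  }
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  // type = FLOAT asks the implementation to convert the image's pixels
  // to floats before storing them -- the behavior debated in this thread.
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.FLOAT, image);
  return tex;
}
```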
Is it actually written somewhere in the WebGL spec that this is the expected behavior? It is not in the extension.

Yes, internalformat/format/type in ES 2.0 is busted. I believe this has been fixed in ES 3.0, but I'm not sure it's backward compatible (I haven't looked).
ES 3.0 and the OES_sized_internal_formats extension provide what is needed, and ES 3.0 is backward compatible. Ideally we would have liked to require the app to specify its desired internal format, but that would not have been backward compatible.




NOTE: This electronic mail message may contain confidential and privileged information from HI Corporation. If you are not the intended recipient, any disclosure, photocopying, distribution or use of the contents of the received information is prohibited. If you have received this e-mail in error, please notify the sender immediately and permanently delete this message and all related copies.