
Re: [Public WebGL] backwards compatibility handling

The texture image enum combo jungle is one of the worst of the tedious mini-changes between desktop/mobile/web GL versions, especially for engines designed and written around a "use the latest GL version available everywhere" runtime initialization. Specifically for this case, each engine (e.g. Unity3D, UE4) internally knows it wants to say "initialize me a texture of format R32G32B32A32_FLOAT_LITTLE_ENDIAN", and what it has to do is maintain a switch-case mapping from the initialized GL version and flavor to the GL enum triplet that yields the desired format. This is horrible, and time is often wasted when this mapping is not correct. The fix always amounts to a simple "oh, right, this GL version and GL flavor need that enum here instead", so it's not a huge deal to change, but it wastes time over a pointless API inconsistency. Like you mention, there is no backwards compatibility, no forwards compatibility, and no compatibility across GL flavors, so it is quite hairy indeed.
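To illustrate the kind of mapping an engine ends up maintaining, here is a minimal sketch. The function name, the engine-level format name "RGBA32_FLOAT", and the two-flavor table are illustrative assumptions, not taken from any actual engine; a real table covers many formats and more GL flavors.

```javascript
// Hypothetical sketch of the per-flavor enum-triplet mapping an engine
// maintains. Only one engine format and two GL flavors are shown.
function getTexImageTriplet(gl, isWebGL2, engineFormat) {
  switch (engineFormat) {
    case "RGBA32_FLOAT":
      return isWebGL2
        // WebGL 2 / ES 3: sized internalformat is required for FLOAT uploads.
        ? { internalformat: gl.RGBA32F, format: gl.RGBA, type: gl.FLOAT }
        // WebGL 1 / ES 2 (+ OES_texture_float): only unsized RGBA is valid.
        : { internalformat: gl.RGBA, format: gl.RGBA, type: gl.FLOAT };
    default:
      throw new Error("unmapped engine format: " + engineFormat);
  }
}
```

Getting one cell of this table wrong is exactly the "oh, right, this flavor needs that enum here instead" class of bug described above.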

I am not sure there is a good overall solution to this, but what I would prefer is that WebGL strictly follow the GLES specs to minimize the number of disruptions on this front. Option 1 therefore feels easiest from the perspective of GLES<->WebGL parity and of engines that multi-target WebGL, GLES and desktop GL.

2015-04-02 9:38 GMT+03:00 Florian Bösch <pyalot@gmail.com>:
It is possible that successive versions of WebGL wish to remove functionality that was contained previously in order to get rid of cruft. To some extent this is already the case with WebGL 2 as the example below illustrates:

The texImage family of calls in ES 2 (and WebGL 1) accept unsized internalformat parameters such as RGB, RGBA, LUMINANCE_ALPHA, LUMINANCE and ALPHA.

In ES 3 (and WebGL 2) a host of new sized formats is introduced, for example: R8, R8_SNORM, R16F, R32F etc.

If you enable OES_texture_float in WebGL 1 this call becomes valid:

texImage2D(TEXTURE_2D, 0, RGBA, x, y, 0, RGBA, FLOAT)

However, ES 3 has floating point texturing support built in (so OES_texture_float will not be offered as an extension), and you will also have to change the call to account for the fact that only sized internalformats are valid to pass for this type:

texImage2D(TEXTURE_2D, 0, RGBA32F, x, y, 0, RGBA, FLOAT)
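Putting the two calls together, a portable upload path has to branch on the context version at runtime. This is a sketch under the assumption that `gl` is a valid context and `data` is a Float32Array of length w*h*4; the function name is made up for illustration.

```javascript
// Sketch: upload a floating-point RGBA texture on either WebGL version.
function uploadFloatTexture(gl, isWebGL2, w, h, data) {
  if (!isWebGL2 && !gl.getExtension("OES_texture_float")) {
    throw new Error("float textures unsupported on this WebGL 1 context");
  }
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  // WebGL 2 (ES 3) requires the sized RGBA32F internalformat for FLOAT data;
  // WebGL 1 (ES 2) only accepts the unsized RGBA.
  const internalformat = isWebGL2 ? gl.RGBA32F : gl.RGBA;
  gl.texImage2D(gl.TEXTURE_2D, 0, internalformat, w, h, 0, gl.RGBA, gl.FLOAT, data);
  return tex;
}
```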

At this time most of the backwards compatibility breaking changes are related to extensions. However future revisions of the standard might also have core incompatibilities.

The way this has been handled in OpenGL is by defaulting to something called a "compatibility profile", which contains everything from previous revisions of the standard in addition to the new functionality. This is undesirable for obvious reasons: implementations have to carry every deprecated feature forever.

However, the problem this solves should not be ignored: it lets people deploy applications written against different versions of the standard during transitional periods, when support for a new standard isn't yet widespread and older profiles still need to be supported. Obviously it would be bad if you had to maintain two separate implementations, and even writing cleanly separable code can be a challenge sometimes.

In a nutshell, it makes it easier (for the right kind of programmer) to migrate to newer profiles. Unfortunately it also makes it easier (for the wrong kind of programmer) to produce messy, dysfunctional GL code, which has led to various initiatives/libraries attempting to provide clean core profiles. Fortunately WebGL doesn't have this problem (yet).

I feel it's important to have a discussion about how this should be handled in WebGL, and I see 3 alternatives (examples):
  1. Do not handle it; features in WebGL are liberally removed in backwards-compatibility-breaking ways. This will make it a bit harder to adopt newer WebGL profiles, but it will prevent mixed spaghetti code and it probably makes life easier for vendors.
  2. Introduce the concept of a compatibility profile which you default to (i.e. gl.getContext('webgl2') -> webgl2 compatibility profile, gl.getContext('webgl2-core') -> webgl2 core profile). This comes with all the drawbacks of the traditional OpenGL solution to the problem.
  3. Introduce the concept of a compatibility profile, but default to core (i.e. gl.getContext('webgl2') -> webgl2 core, gl.getContext('webgl2-compatibility') -> webgl2 compatibility profile).
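Under option 3, an application wanting the widest reach might probe profiles from newest to oldest. This is a purely hypothetical sketch: of the context names below, only 'webgl2' and 'webgl' actually exist; 'webgl2-compatibility' is the proposed name from this thread.

```javascript
// Hypothetical sketch of profile fallback under option 3.
// getContext returns null for names the browser does not support,
// so unknown proposed profiles fall through harmlessly.
function getBestContext(canvas) {
  const names = ["webgl2-compatibility", "webgl2", "webgl"];
  for (const name of names) {
    const gl = canvas.getContext(name);
    if (gl) return { name: name, gl: gl };
  }
  return null;
}
```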