Re: [Public WebGL] backwards compatibility handling
The plan has been to adhere to the ES 3.0 semantics during the upgrade
from WebGL 1.0 to 2.0 (option #1). It's been known for a while that
certain features don't behave the same way in OpenGL ES 2.0 +
extensions as they do in the OpenGL ES 3.0 core. draw_buffers, an
important piece of functionality, is a prime example -- in ES 3.0, in
order to access this feature, shaders must be written using ESSL 3.00
syntax, and auto-upgrading shaders seems infeasible. Applications
upgrading from WebGL 1.0 to 2.0 will inevitably require code changes,
so it was felt at the beginning of WebGL 2.0 spec development that it
should be considered a mostly-compatible but still incompatible
upgrade. It's for this reason that WebGL2RenderingContext does not
inherit from WebGLRenderingContext -- mostly a symbolic gesture.
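To make the shader-dialect gap concrete, here is a hedged sketch (the shader sources are illustrative, not taken from any spec text) of a fragment shader writing to two draw buffers in each dialect. The differences in the version pragma, the extension pragma, and the output declarations are why mechanically auto-upgrading shaders is impractical:

```javascript
// WebGL 1 / ESSL 1.00: draw buffers come from an extension, and outputs
// are written through the built-in gl_FragData array.
const fragESSL100 = `
#extension GL_EXT_draw_buffers : require
precision mediump float;
void main() {
  gl_FragData[0] = vec4(1.0, 0.0, 0.0, 1.0);
  gl_FragData[1] = vec4(0.0, 1.0, 0.0, 1.0);
}`;

// WebGL 2 / ESSL 3.00: draw buffers are core, but the shader must opt in
// to the new dialect ("#version 300 es" must be the very first line) and
// declare explicit out variables instead of using gl_FragData.
const fragESSL300 = `#version 300 es
precision mediump float;
layout(location = 0) out vec4 color0;
layout(location = 1) out vec4 color1;
void main() {
  color0 = vec4(1.0, 0.0, 0.0, 1.0);
  color1 = vec4(0.0, 1.0, 0.0, 1.0);
}`;
```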
On Thu, Apr 2, 2015 at 4:04 AM, Jukka Jylänki <firstname.lastname@example.org> wrote:
> The texture image enum combo jungle is one of the worst aspects of tedious
> mini changes between desktop/mobile/web GL versions, especially for engines
> that are designed and written with a "use latest GL version available
> everywhere" runtime initialization. Specifically for this case, each of the
> engines (e.g. Unity3D, UE4) internally knows and is interested in saying
> "initialize me a texture of format R32G32B32A32_FLOAT_LITTLE_ENDIAN", and
> what they have to do is manage a switch-case mapping of initialized GL
> version and flavor -> the GL enum triplet that will give the desired format.
> This is horrible, and time is often wasted when this mapping is not correct.
> This always amounts to just a simple "oh, right, this GL version × GL flavor
> combination needs that enum here instead", so it's not a huge deal to change,
> but it wastes time due to a pointless API inconsistency. Like you mention, there is
> no backwards compatibility, and there is no forwards compatibility, and
> there is no compatibility across GL flavors, so it is quite hairy indeed.
> I am not sure if there is a good overall solution to this, but what I would
> prefer is that WebGL strictly follow the GLES specs to minimize the number
> of disruptions on this front, so option 1 feels like it would be the easiest
> from the perspective of GLES<->WebGL parity and of engines that
> multi-target WebGL, GLES and desktop GL.
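The switch-case mapping described above might be sketched as follows. This is a hypothetical helper: the engine-side format name and the function are made up for illustration, and the numeric values are the standard GL enum constants so the sketch stands alone without a live context:

```javascript
// Standard GL enum values, spelled out so no GL context is needed here.
const GL = {
  RGBA: 0x1908,    // unsized internalformat (ES 2 / WebGL 1 style)
  RGBA32F: 0x8814, // sized internalformat (ES 3 / WebGL 2 style)
  FLOAT: 0x1406,
};

// Resolve an engine-internal format name to the (internalformat, format,
// type) triplet that texImage2D expects for the GL flavor in use.
// isWebGL2 would normally be derived from the created context, e.g.
// (gl instanceof WebGL2RenderingContext).
function texFormatTriplet(engineFormat, isWebGL2) {
  switch (engineFormat) {
    case "R32G32B32A32_FLOAT":
      return isWebGL2
        ? { internalformat: GL.RGBA32F, format: GL.RGBA, type: GL.FLOAT }
        : { internalformat: GL.RGBA, format: GL.RGBA, type: GL.FLOAT };
    default:
      throw new Error("unmapped engine format: " + engineFormat);
  }
}
```

An engine would then feed the triplet straight into the upload call, e.g. `gl.texImage2D(gl.TEXTURE_2D, 0, t.internalformat, w, h, 0, t.format, t.type, null)`.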
> 2015-04-02 9:38 GMT+03:00 Florian Bösch <email@example.com>:
>> It is possible that successive versions of WebGL wish to remove
>> functionality that was contained previously in order to get rid of cruft. To
>> some extent this is already the case with WebGL 2, as the example below shows.
>> The texImage family of calls in ES 2 (and WebGL 1) accept unsized
>> internalformat parameters such as RGB, RGBA, LUMINANCE_ALPHA, LUMINANCE and ALPHA.
>> In ES 3 (and WebGL 2) a host of new sized formats is introduced, for
>> example: R8, R8_SNORM, R16F, R32F etc.
>> If you enable OES_texture_float in WebGL 1 this call becomes valid:
>> texImage2D(TEXTURE_2D, 0, RGBA, x, y, 0, RGBA, FLOAT)
>> However, ES 3 has floating point texturing support built in (and so
>> OES_texture_float will not be offered as an extension), and you will also
>> have to change the call because only sized internalformats are valid for
>> this type:
>> texImage2D(TEXTURE_2D, 0, RGBA32F, x, y, 0, RGBA, FLOAT)
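A version-aware wrapper around the two calls above might look like this. This is an illustrative sketch, not established practice: createFloatTexture is a made-up helper, and error handling is minimal:

```javascript
// Allocate an empty floating-point RGBA texture on either context type.
// `gl` is assumed to be a WebGLRenderingContext or WebGL2RenderingContext.
function createFloatTexture(gl, width, height) {
  const isWebGL2 =
    typeof WebGL2RenderingContext !== "undefined" &&
    gl instanceof WebGL2RenderingContext;
  // On WebGL 1, float textures require the OES_texture_float extension.
  if (!isWebGL2 && !gl.getExtension("OES_texture_float")) {
    throw new Error("float textures unsupported");
  }
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  // WebGL 2 requires the sized internalformat RGBA32F for FLOAT data;
  // WebGL 1 only accepts the unsized RGBA here.
  const internalformat = isWebGL2 ? gl.RGBA32F : gl.RGBA;
  gl.texImage2D(gl.TEXTURE_2D, 0, internalformat, width, height, 0,
                gl.RGBA, gl.FLOAT, null);
  return tex;
}
```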
>> At this time most of the backwards compatibility breaking changes are
>> related to extensions. However future revisions of the standard might also
>> have core incompatibilities.
>> The way this has been handled in OpenGL is by defaulting to something
>> called a "compatibility profile" which contained everything from previous
>> revisions of the standard in addition to the new stuff. This is undesirable
>> for obvious reasons.
>> However the problem this solves should not be ignored, it's how people can
>> deploy applications written against different versions of the standard in
>> transitionary periods where support for a new standard isn't as widespread
>> and so older profiles need to be supported as well. Obviously it would be
>> bad if you had to maintain two separate implementations, and even writing
>> cleanly separable code can be a challenge sometimes.
>> In a nutshell, it makes it easier (for the right kind of programmer) to
>> migrate to newer profiles. Unfortunately it also makes it easier (for the
>> wrong kind of programmer) to produce messy dysfunctional GL code, which has
>> led to various initiatives/libraries attempting to provide clean core
>> profiles. Fortunately WebGL doesn't have this problem (yet).
>> I feel it's important to have a discussion about how this should be
>> handled in WebGL, and I see 3 alternatives (examples)
>> Do not handle it, features in WebGL are liberally removed in backwards
>> compatibility breaking ways. This will make it a bit harder to adopt newer
>> WebGL profiles, but it will prevent mixed spagetti code and it probably
>> makes life easier for vendors.
>> Introduce the concept of a compatibility profile which you default to
>> (i.e. gl.getContext('webgl2') -> webgl2 compatibility profile,
>> gl.getContext('webgl2-core') -> webgl2 core profile. This comes with all the
>> drawbacks of the traditional OpenGL solution to the problem.
>> Introduce the concept of a compatibility profile, but default to core
>> (i.e. gl.getContext('webgl2') -> webgl2 core,
>> gl.getContext('webgl2-compatibility') -> webgl2 compatibility profile.
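Whichever alternative is chosen, during a transitionary period applications will likely probe for the newest context name and fall back, roughly as below. This is a minimal sketch: getBestContext is a made-up helper, and only 'webgl2' and 'webgl' are real context names today; profile names like 'webgl2-core' are hypothetical ones from the proposal:

```javascript
// Try context names from newest to oldest and return the first one the
// browser provides. Hypothetical profile names such as 'webgl2-core'
// could simply be added to the front of the list.
function getBestContext(canvas, names) {
  for (const name of names) {
    const ctx = canvas.getContext(name);
    if (ctx) return { name, ctx };
  }
  return null; // none of the requested contexts is supported
}
```

Usage would be e.g. `const best = getBestContext(canvas, ["webgl2", "webgl"]);` followed by a branch on `best.name` to pick the right code path.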