Re: [Public WebGL] NPOT (non-power-of-two) textures
On Jan 14, 2010, at 12:26 PM, Kenneth Russell wrote:
> On Thu, Jan 14, 2010 at 12:08 PM, Chris Marrin <firstname.lastname@example.org> wrote:
>> On Jan 14, 2010, at 11:52 AM, Kenneth Russell wrote:
>>> On Thu, Jan 14, 2010 at 10:48 AM, Chris Marrin <email@example.com> wrote:
>>>> OpenGL supports NPOT textures in two ways. The first is called "Rectangle
>>>> Textures" (RT), which can be any size, but can't be repeating, mip-mapped or
>>>> have borders. And rather than using 0-1 texture coordinates, they use 0-w,
>>>> 0-h. OpenGL also supports true NPOT textures, which have similar constraints
>>>> to RT, but which use the normal 0-1 texture coordinates.
>>>> The issue is that some older hardware (and when I say "older" I mean
>>>> hardware from 2005) only supports RT, not true NPOT. It's not possible to
>>>> emulate NPOT when you just have RT support because in GLSL you use a
>>>> different sampler for RT (sampler2DRect vs sampler2D).
>>>> OpenGL ES only supports NPOT, not RT. So does that mean we leave these
>>>> older cards in the cold? The only options I see are:
>>>> 1) Add optional support (as some sort of queryable extension) for RT
>>>> 2) Remove support for any type of NPOT from WebGL (double ick!)
>>>> 3) Leave these older cards in the dust.
>>>> Now, I can say that using NPOT on a Mac with one of these older cards
>>>> still works, it just uses software rendering and is therefore much slower.
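The coordinate-convention difference described above is the crux of why RT can't transparently substitute for NPOT: rectangle textures are addressed in texel units (0-w, 0-h) rather than normalized units (0-1). A minimal sketch of the mapping between the two conventions (the helper name is mine, not from any GL API):

```python
def rect_to_normalized(u, v, width, height):
    """Map rectangle-texture texel coordinates (0..w, 0..h) to the
    normalized (0..1) coordinates a regular sampler2D expects."""
    return (u / width, v / height)

# Texel (50, 25) of a 100x100 rectangle texture corresponds to
# normalized coordinates (0.5, 0.25) in an NPOT texture.
print(rect_to_normalized(50, 25, 100, 100))
```

The conversion itself is trivial; the problem is that it has to happen inside the shader, and GLSL forces the choice at compile time via the sampler type (sampler2DRect vs sampler2D), so an implementation can't silently swap one for the other.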
>>> A WebGL implementation can scale up NPOT texture data to the next
>>> highest power of two dimension during texImage2D and texSubImage2D
>>> calls. This wouldn't involve any API changes. O3D does this in some
>>> cases as proof that the technique can work without the end user
>>> knowing. I think it would be a bad idea to expose rectangular textures
>>> in the WebGL API; they are definitely not the path forward.
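The scale-up technique can be sketched as follows (function names and the nearest-neighbor resampling choice are my assumptions for illustration, not O3D's actual implementation):

```python
def next_pot(n):
    """Smallest power of two >= n, e.g. next_pot(100) == 128."""
    p = 1
    while p < n:
        p *= 2
    return p

def upscale_to_pot(pixels, width, height):
    """Nearest-neighbor upscale of a row-major pixel list to POT
    dimensions -- a sketch of the hidden rescale an implementation
    might perform inside texImage2D."""
    pw, ph = next_pot(width), next_pot(height)
    out = []
    for y in range(ph):
        sy = y * height // ph   # source row for this destination row
        for x in range(pw):
            sx = x * width // pw  # source column for this destination column
            out.append(pixels[sy * width + sx])
    return out, pw, ph

# A 3x2 texture is silently stored as 4x2; texture coordinates still
# span 0..1, so LINEAR-filtered sampling looks roughly the same.
scaled, pw, ph = upscale_to_pot([1, 2, 3, 4, 5, 6], 3, 2)
```

texSubImage2D is where this gets hairy: a sub-rectangle update in the user's coordinate space no longer maps to a whole-texel rectangle in the stored POT texture, so the implementation must rescale and merge the update region.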
>> I'm not sure that works in the presence of shaders. What about the use of
>> textures as shader data? If a shader is using a 100x100 texture as numeric
>> data, scaling that data to 128x128 would break that shader, right? There are
>> also issues with stride and texSubImage2D. Seems like it would at least be
>> very difficult to make it transparent to the user.
> You're correct; if the user expects precise sampling using
> NEAREST min/mag filters, rescaling will break the app. However, for
> the majority of use cases this technique can work, and be completely
> hidden from the programmer with enough machinery behind the scenes.
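A quick arithmetic sketch of the breakage in question (the floor-based upscale here is an assumed implementation detail, purely for illustration): an app that fetches the center of texel i in a 100-texel-wide data texture does not always get texel i's data back after a hidden upscale to 128 texels with NEAREST filtering:

```python
def next_pot(n):
    """Smallest power of two >= n."""
    p = 1
    while p < n:
        p *= 2
    return p

def upscale_source(dest_x, src_size, dest_size):
    """Source texel copied into destination texel dest_x by a
    floor-based nearest-neighbor upscale (an assumed scheme)."""
    return dest_x * src_size // dest_size

def nearest_fetch(u, size):
    """Texel index a NEAREST texture fetch selects at coordinate u."""
    return min(int(u * size), size - 1)

src = 100
dst = next_pot(src)  # 128
mismatches = []
for i in range(src):
    u = (i + 0.5) / src          # center of source texel i, as the app wrote it
    j = nearest_fetch(u, dst)    # texel actually fetched from the POT texture
    if upscale_source(j, src, dst) != i:
        mismatches.append(i)     # the app silently reads the wrong texel

# mismatches is non-empty: e.g. fetching texel 1's center lands on a
# destination texel that holds texel 0's data.
```

So code that treats a texture as a numeric array sees silently corrupted reads, which is why the rescale can't be fully transparent.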
But then how does the user know whether or not their app will break? I would hate to recommend that authors not use NPOT textures for certain purposes just because some older hardware might not work as they expect. I think we have to decide between OpenGL ES purity and wide hardware coverage. It's becoming clear that OpenGL ES doesn't just mean "less capable than OpenGL". It also means "leaving legacy behind". That means both legacy OpenGL APIs and hardware that doesn't support the most modern features. We've already made this decision by leaving out all non-shader hardware. This may be another case where we just need to leave this unsupported.
Let me say though that the hardware in question plays the WebGL content just fine. It just falls back to a software renderer. So I think WebKit is doing the right thing in this case. The content runs, just more slowly.
The nice thing about problems like this is that they tend to eventually go away. As time goes on these older machines get replaced with machines that don't have the issue. Of course by then we will want to support features of newer machines and the cycle will start over. I guess that's what extensions will be all about.