Re: [Public WebGL] gl.enable(gl.TEXTURE_2D)
On Wed, Jan 20, 2010 at 3:48 PM, Kenneth Russell <email@example.com> wrote:
> On Tue, Jan 19, 2010 at 8:37 AM, Giles Thomas <firstname.lastname@example.org> wrote:
>> Hi all,
>> A quick follow-up question:
>> 2010/1/13 Kenneth Russell <email@example.com>
>>> TEXTURE_2D is not a valid enable bit in OpenGL ES 2.0 or, consequently,
>>> WebGL. The enum was in the EnableCap section only because it was there in
>>> the OpenGL ES 2.0 headers. I've moved its definition to the TextureTarget section.
>> Have there been versions of Chrome (or perhaps WebKit?) where
>> gl.enable(gl.TEXTURE_2D) was erroneously required to enable textures? A
>> reader of my blog reports that Chrome 4.0.295 seems to need it with a
>> specific OS/graphics card combination -- details here:
> If there is a sample which only works with that call in place, then it
> isn't Chrome or WebKit imposing that requirement but the OpenGL
> driver. It should not be necessary to enable the TEXTURE_2D bit on a
> particular texture unit in order to sample it in a shader. You should
> report this issue to the graphics card vendor.
> By the way, because the code is evolving quickly, at this point I only
> recommend using WebGL inside of the latest Chromium builds. See
> http://khronos.org/webgl/wiki/Getting_a_WebGL_Implementation for
> instructions on downloading and running them.
To clarify, I should have said, if you're trying to use WebGL inside
of Chrome, use the Chromium continuous builds rather than the Dev
Channel builds for the time being.
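To restate the point above in code form: in WebGL, as in OpenGL ES 2.0, a texture becomes visible to a shader simply by binding it to a texture unit and pointing a sampler uniform at that unit; there is no enable bit involved. A minimal sketch (the helper name is illustrative, and `gl`, `texture`, and `samplerLoc` are assumed to come from the caller's setup code):

```javascript
// Make a texture available to a fragment shader's sampler uniform.
// Note there is no gl.enable(gl.TEXTURE_2D) anywhere -- that call is
// invalid in WebGL, and a conformant implementation does not need it.
function bindTextureForSampling(gl, texture, samplerLoc, unit) {
  gl.activeTexture(gl.TEXTURE0 + unit);   // select the texture unit
  gl.bindTexture(gl.TEXTURE_2D, texture); // attach the texture to that unit
  gl.uniform1i(samplerLoc, unit);         // point the sampler at the unit
}
```

If a sample only renders correctly when gl.enable(gl.TEXTURE_2D) is added on top of calls like these, that is a driver bug, not intended WebGL behavior.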
You are currently subscribed to firstname.lastname@example.org.
To unsubscribe, send an email to email@example.com with
the following command in the body of your email: