I'm still very focused on native GL applications. Since GL does
not have blend (compositing mode) shaders, an extra pass is needed
to "fix gamma using a shader in the compositing stage." This will
cause a performance drop compared to rendering directly to a
window surface. But as you point out, WebGL is already performing
this extra pass to composite the canvas with the other page
contents and it makes a great deal of sense to perform color space
conversion during that pass.
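Roughly, such a gamma-correcting compositing shader might look like this (a sketch only: the 2.2 power is an approximation of the exact sRGB transfer function, and the sampler/varying names are just illustrative):

    // WebGL / ESSL 1.00 fragment shader for the compositing pass: draw a
    // full-screen textured quad that samples the canvas's linear-light
    // colours and encodes them for the display at the very end.
    precision mediump float;
    uniform sampler2D u_canvas;   // off-screen render target, linear light
    varying vec2 v_texCoord;      // from the full-screen quad's vertices
    void main() {
        vec4 c = texture2D(u_canvas, v_texCoord);
        gl_FragColor = vec4(pow(c.rgb, vec3(1.0 / 2.2)), c.a);
    }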
Regards
-Mark
On 03/09/2010 12:51, Steve Baker wrote:
>>> ... 3) You are doing no lighting/blending/mipmapping/fog/etc and (for some reason) you have also chosen not to do gamma correction at the end. In that case and ONLY in that case, you should gamma-correct your textures on input. I maintain that very few WebGL applications will do (3).
>>
>> I think that the number of mobile devices which have "gamma correctors" is approaching 0 and, with the exception of doing the "correction" in a shader (which will screw up blending), control of any such "correctors" is outside OpenGL. So I suspect the number of applications doing 3 is quite large.
>
> I strongly disagree with every single thing you just said!
>
> 1) EVERY mobile device that can support WebGL is capable of rendering the final image to the screen (the "compositing" stage) by drawing a textured quadrilateral using a gamma-correcting shader. There is no need for custom gamma-correcting CLUT hardware anymore... that's what we have shaders for.
>
> 2) If you decided to put the gamma correction at the end of the shader(s) that you use for rendering your 3D scene (which I most certainly don't advocate!), it would indeed "screw up blending" - but less so than applying the gamma to the textures before the shader runs. Gamma is a non-linear effect and as such has to come after all of the linear effects in the rendering pipeline.
>
> 3) You say that control of external gamma correctors is outside of OpenGL - that's true, but I didn't suggest that we have to use an external gamma corrector. I specifically said that we can fix gamma using a shader in the compositing stage.
>
> 4) You can't say how many applications are "doing 3" because there are (by definition) no finished WebGL applications yet (because the specification isn't 100% finished). The only applications that might fall into class (3) are the ones that don't do ANY lighting/anti-aliasing/MIPmapping/texture-magnification/fogging/alpha-blending or translucent-canvas compositing. Basically, every single 3D application is class (1) or (2) - and preferably class (2), because (1) is an ugly kludge. True class (3) applications should probably be using <canvas> directly.
>
> If the specification were to say that the compositor does gamma correction by default (possibly with the option to turn that off for people who don't want it for some very specific reason) then everyone should be happy and we do things correctly without any nasty kludges hardwired into the system.
>
>   -- Steve
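To put a number on Steve's point (2): a 50/50 blend of black and white done in gamma-encoded space gives 0.5 per channel, while blending in linear light and encoding afterwards gives roughly 0.5^(1/2.2) ≈ 0.73, which is the mid-grey a viewer actually expects. A minimal sketch comparing the two orderings (illustrative only; it again approximates the sRGB curve with a 2.2 power):

    // ESSL 1.00 fragment shader: 50/50 blend of black and white, two ways.
    precision mediump float;
    const vec3 invGamma = vec3(1.0 / 2.2);
    void main() {
        vec3 black = vec3(0.0);
        vec3 white = vec3(1.0);
        // Blend in linear light, encode at the very end: ~0.73 per channel.
        vec3 blendThenEncode = pow(mix(black, white, 0.5), invGamma);
        // Encode first, then blend the encoded values: 0.5 per channel.
        vec3 encodeThenBlend = mix(pow(black, invGamma), pow(white, invGamma), 0.5);
        // The results differ by ~0.23, so blending must happen before gamma.
        gl_FragColor = vec4(blendThenEncode - encodeThenBlend, 1.0);
    }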