
Re: [Public WebGL] HI-DPI displays and WebGL issues




On Jun 21, 2012, at 12:55 AM, Florian Bösch wrote:

On Thu, Jun 21, 2012 at 2:09 AM, Chris Marrin <cmarrin@apple.com> wrote:
Whatever solution we provide, it should not be an "HD" solution or a "2x" solution. It should be a solution which separates, once and for all, the notion of actual pixel dimensions of the buffers used by the GPU from everything outside the WebGL context. We just need to get rid of the notion that the current ways of specifying image and canvas sizes have anything to do with the actual pixel dimensions of the incoming media. 
I vehemently disagree with the notion that everything should be CSS pixels. Textures are used for many things in 3D applications, and only a subset of those uses is about which texels to display on screen. Other subsets concern themselves with postprocessing, GPGPU, rendering geometry from textures, generating noise, etc. It is non-negotiably mandatory that image pixels == real pixels == backing store pixels == viewport pixels == gl_FragCoord pixels. 

I'm not sure if you're vehemently disagreeing with something I said, but I agree with all of that. All the concepts you mention are internal to WebGL, so they deal with real pixels. It is only in communicating with the outside world (the page) that we must face the fact that CSS pixels are not always equal to real pixels. 
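
As a concrete illustration (just a sketch, not a spec proposal; it assumes window.devicePixelRatio reports the device scale and uses a hypothetical canvas id "c"), an author can already decouple the backing store size from the CSS size of the canvas:

    // Size in CSS pixels on the page vs. real pixels in the backing store.
    var canvas = document.getElementById("c");       // hypothetical canvas
    var cssWidth = 300, cssHeight = 200;
    var scale = window.devicePixelRatio || 1;

    canvas.style.width  = cssWidth + "px";           // CSS pixels: how big it appears
    canvas.style.height = cssHeight + "px";
    canvas.width  = Math.round(cssWidth * scale);    // real pixels: backing store size
    canvas.height = Math.round(cssHeight * scale);

    var gl = canvas.getContext("webgl") ||
             canvas.getContext("experimental-webgl");

    // Everything the GL sees from here on is in real pixels.
    gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);

After that, viewport, gl_FragCoord and readPixels all speak in real pixels; only style.width/height speak in CSS pixels, which is exactly the boundary I mean.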

For instance, if an image comes in that has a "width" and "height" of 300 x 200, then when it is made into a texture it must be 300 pixels by 200 pixels. If the image is actually larger because it was expressed in "CSS Pixels" then it needs to be scaled down. There also needs to be a way for an author to know the actual pixel dimensions of the image so the full resolution can be used if desired. But that has to be opt-in. The same goes for getting pixels out of WebGL and back to the page.
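
For example (again only a sketch, with a hypothetical image URL), naturalWidth/naturalHeight already give an author the actual pixel dimensions of an image independent of any CSS size, so a full-resolution upload could be the opt-in path:

    // gl: a WebGLRenderingContext obtained as in the earlier sketch.
    var img = new Image();
    img.onload = function () {
        // img.width/height may reflect CSS pixels;
        // naturalWidth/naturalHeight are the real pixel dimensions.
        var realW = img.naturalWidth, realH = img.naturalHeight;

        var tex = gl.createTexture();
        gl.bindTexture(gl.TEXTURE_2D, tex);
        gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, img);

        // NPOT-safe parameters, since realW x realH need not be powers of two.
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
    };
    img.src = "photo.png";   // hypothetical URL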

-----
~Chris