
Re: [Public WebGL] How to set a canvas backing store to display units?

On Thu, Jun 14, 2012 at 12:33 AM, Mark Callow <callow_mark@hicorp.co.jp> wrote:

On 14/06/2012 15:13, Gregg Tavares (社用) wrote:


On Wed, Jun 13, 2012 at 10:46 PM, Mark Callow <callow_mark@hicorp.co.jp> wrote:

On 14/06/2012 12:08, Gregg Tavares (社用) wrote:



Example:

    // fragment shader
    precision mediump float;  // required by GLSL ES in fragment shaders

    void main(void) {
        // gl_FragCoord is in backbuffer (device) pixels; 20.0 is the square size
        float r = mod(gl_FragCoord.x / 20.0, 2.0) < 1.0 ? 0.0 : 1.0;
        float g = mod(gl_FragCoord.y / 20.0, 2.0) < 1.0 ? 0.0 : 1.0;
        gl_FragColor = vec4(r, g, 0.0, 1.0);
    }

This shader generates a plaid pattern of 20x20-pixel squares. It takes no inputs that could be adjusted. Having it suddenly render at 1/4 size on a high-DPI display doesn't seem acceptable.

Similarly, having every particle in three.js suddenly draw at 1/2 size also doesn't seem acceptable.

Or having all the lines in MapsGL suddenly become 1/2 as thick.
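Both of those examples come down to the same fact: point sizes and line widths are specified in device pixels, so a 2x backbuffer shown at the same physical size halves their apparent size. A minimal sketch (illustrative code, not actual three.js or MapsGL source):

    // vertex shader: point sprites are sized in device pixels
    void main() {
        gl_Position = vec4(0.0, 0.0, 0.0, 1.0);
        gl_PointSize = 16.0;  // 16 device pixels, whatever the backbuffer size
    }

    // JavaScript: line width is likewise in device pixels
    // (many WebGL implementations clamp it to 1.0)
    gl.lineWidth(2.0);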

If the author's intention is that the pattern always approximates some particular size in human terms, then the shader above is buggy. Given the wide range of screen resolutions out there (in the proper sense of resolution), a pattern like that is obviously going to render at many different actual sizes. Perhaps the author intends it to vary in actual size.

No, it wouldn't. This has nothing to do with display size; it has to do with rendered pixels. No OpenGL API lets you request a backbuffer of width by height pixels and then hands you some other size. All OpenGL programs either explicitly request a size, or leave it up to the OS and then query what they got. These units are always in device pixels. It can work no other way.
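In WebGL terms (a sketch; gl is assumed to be a WebGLRenderingContext):

    // WebGL lets the program query the size it actually got, in device pixels
    var w = gl.drawingBufferWidth;
    var h = gl.drawingBufferHeight;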
Who mentioned display size? I wrote "in the proper sense of resolution".

As I understand it, the shader draws a 20 pixel by 20 pixel pattern. The case where it would "suddenly go to 1/4 size" is when the display resolution (DPI) increases while the display size is unchanged.

If the author does not intend the pattern to change size as the DPI increases, then instead of hard-coding 20, he or she needs to calculate a divisor based on the display resolution.
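One way to implement that suggestion (a sketch; it assumes window.devicePixelRatio reflects the display's density and a compiled, linked program object; the uniform and variable names are illustrative):

    // fragment shader: the hard-coded 20.0 becomes a uniform
    precision mediump float;
    uniform float cellSize;

    void main(void) {
        float r = mod(gl_FragCoord.x / cellSize, 2.0) < 1.0 ? 0.0 : 1.0;
        float g = mod(gl_FragCoord.y / cellSize, 2.0) < 1.0 ? 0.0 : 1.0;
        gl_FragColor = vec4(r, g, 0.0, 1.0);
    }

    // JavaScript: scale the 20-pixel design size by the device pixel ratio
    var cellSizeLoc = gl.getUniformLocation(program, "cellSize");
    gl.uniform1f(cellSizeLoc, 20.0 * (window.devicePixelRatio || 1));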

To respond specifically to this issue: it's perfectly fine for the developer to hard-code a 20x20 pixel pattern. Say the author makes a canvas with canvas.width = 640; canvas.height = 480, then applies this 20x20 pattern. They expect to see that 20x20 pattern repeated exactly 32 by 24 times.
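A minimal sketch of that expectation:

    var canvas = document.createElement("canvas");
    var gl = canvas.getContext("webgl");
    canvas.width = 640;   // request a backbuffer exactly 640 device pixels wide
    canvas.height = 480;  // and exactly 480 device pixels tall

    // With 20x20 squares the author counts on exact, pixel-addressed tiling:
    // 640 / 20 = 32 repeats across, 480 / 20 = 24 repeats down.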

If WebGL magically turned a canvas requested at 640x480 into a 1280x960 backbuffer on high-DPI displays, then the author would not get the result they expect. Every app not tested on a high-DPI display would potentially break or produce incorrect rendering results.

That's what's wrong with Glenn's suggestion: many apps will break, and authors won't know their code is wrong until they test on high-DPI displays. This is already true for all 2600+ examples on glsl.heroku.com and several more on Shadertoy, since they all effectively do this math:

    vec2 texCoord = gl_FragCoord.xy / resolution;

where resolution is a vec2 uniform set to (canvas.width, canvas.height).

That produces values that go from 0.0 to 1.0 across the screen.

If the real backbuffer is magically 2x canvas.width and canvas.height on a high-DPI display, then that calculation will go from 0.0 to 2.0 and all the samples break.
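A sketch of the failure, assuming the usual setup where the uniform is filled in from the canvas attributes (as above, gl and program are assumed; the location name is illustrative):

    var resolutionLoc = gl.getUniformLocation(program, "resolution");
    gl.uniform2f(resolutionLoc, canvas.width, canvas.height);  // e.g. 640, 480

    // If the browser silently allocated a 1280x960 backbuffer,
    // gl_FragCoord.xy would span 0..1280 by 0..960, so
    // gl_FragCoord.xy / resolution would span 0.0..2.0 instead of 0.0..1.0.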


Either the code is buggy, or you are making an unwarranted assumption when you say having it go to 1/4 size on a high-DPI display is not acceptable.

I'm asserting that the code is fine, and that having the browser make the backbuffer a different resolution on high-DPI displays will break web pages.
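The non-magical alternative keeps the author in control: anyone who wants a high-DPI backbuffer can request one explicitly. A sketch, assuming window.devicePixelRatio is available:

    var dpr = window.devicePixelRatio || 1;

    // CSS size stays in display units; the backbuffer is explicitly
    // requested in device pixels, so nothing changes behind the author's back.
    canvas.style.width = "640px";
    canvas.style.height = "480px";
    canvas.width = Math.round(640 * dpr);
    canvas.height = Math.round(480 * dpr);
    gl.viewport(0, 0, canvas.width, canvas.height);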

Regards
   
    -Mark