
Re: [Public WebGL] about the VENDOR, RENDERER, and VERSION strings



----- Original Message -----
> So if we think that there are ~10 bits of information leaked through
> the VENDOR/RENDERER/VERSION strings - the important question is how
> much of that is also leaked through gl.getParameter and other easily
> testable means?

I agree that's the next very important question. I don't expect to be able to bring the leakage down to 0 bits, but I don't have to, as a certain amount of information can be obtained by simple benchmarking anyway. I want to find a useful compromise, as I explained in earlier emails.
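To make the "bits" framing concrete, here is a small sketch (not from this thread; the uniform distribution is just an illustration) of how one would estimate leaked bits from a survey of how often each string occurs, using Shannon entropy:

```javascript
// Estimate how many bits of identifying information a string leaks,
// given survey counts of how often each distinct string occurs.
function entropyBits(counts) {
  const total = counts.reduce((a, b) => a + b, 0);
  return counts.reduce((bits, c) => {
    const p = c / total;
    return bits - p * Math.log2(p);
  }, 0);
}

// A uniform spread over 1024 equally common strings leaks 10 bits;
// real RENDERER strings are far from uniform, so the true figure
// would have to come from a large-scale survey.
const uniform = new Array(1024).fill(1);
console.log(entropyBits(uniform)); // 10
```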

But there is still a quite large immediate benefit in disabling RENDERER alone, as RENDERER gives two important pieces of information that can't be obtained in any other way (AFAIK):
 1) RENDERER tells whether the machine is a desktop or a laptop.
 2) RENDERER tells whether the machine is a hardcore gamer's machine (or was when it was purchased). This was noted earlier by Oliver. Even if a GeForce 8800 Ultra is not powerful by today's standards, it indicates a gaming machine.
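For point 1, a rough sketch of the kind of heuristic an attacker could use (my own illustration, not anything specified by WebGL - mobile NVIDIA parts often carry an "M" suffix and integrated mobile parts often say "Mobile"):

```javascript
// Heuristic guess at laptop vs. desktop from a RENDERER string.
// Real strings vary a lot; this only illustrates that the string
// carries the distinction at all.
function looksLikeLaptopGPU(renderer) {
  return /mobile/i.test(renderer) ||      // e.g. "Mobile Intel(R) 4 Series"
         /\b\d{3,4}M\b/.test(renderer);   // e.g. "GeForce 8600M GT"
}

console.log(looksLikeLaptopGPU("GeForce 8600M GT"));   // true
console.log(looksLikeLaptopGPU("GeForce 8800 Ultra")); // false
```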

The VERSION string still gives away its ~4 bits of information, in a way that's completely orthogonal to what other getParameter calls give.

So disabling VENDOR/RENDERER/VERSION readily solves half of the problem. It's not a useless thing to do. Then I agree with the need to look further into the other getParameter calls to further reduce the problem.
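To show what "the other getParameter calls" leak, here is a sketch of how the reported limits combine into a fingerprint. The `gl` object below is a stand-in with made-up values (the enum numbers are the real GL ones); in a browser it would be an actual WebGLRenderingContext:

```javascript
// Stand-in for a WebGL context, with illustrative limit values.
const gl = {
  MAX_TEXTURE_SIZE: 0x0D33,
  MAX_VERTEX_ATTRIBS: 0x8869,
  MAX_VARYING_VECTORS: 0x8DFC,
  getParameter(pname) {
    const fake = { 0x0D33: 8192, 0x8869: 16, 0x8DFC: 15 };
    return fake[pname];
  },
};

// Concatenate the reported limits; every distinct combination is one
// "bucket" a fingerprinting script can put a machine in.
function fingerprint(gl, pnames) {
  return pnames.map((p) => gl.getParameter(gl[p])).join("/");
}

console.log(fingerprint(gl, [
  "MAX_TEXTURE_SIZE", "MAX_VERTEX_ATTRIBS", "MAX_VARYING_VECTORS",
])); // "8192/16/15"
```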

Cheers,
Benoit


> 
> If that number is anywhere close to 10 bits then simply disabling the
> VENDOR/RENDERER/VERSION strings just makes life inconvenient for the
> good guys without buying us anything in terms of privacy from the bad
> guys. The only way to fix THAT would be to dumb everything down to the
> lowest common denominator...and IMHO, that's suicide for WebGL...or at
> least for whichever browser does it if others do not.
> 
> There certainly COULD be 10 bits of information here - but it's hard
> to tell how much variability there is out there without doing some
> kind of large-scale survey. Also, we don't know how well that
> correlates with the other bits of data that can already be obtained.
> We'd also have to worry that every time we introduce a WebGL extension
> (some of which are too important to miss out on) or provide any other
> kind of optional feature, we're leaking more data. Adding extensions
> to WebGL would probably be the strongest 'signal' we could be sending.
> 
> The values that seem most useful are:
> 
> ALIASED_LINE_WIDTH_RANGE, ALIASED_POINT_SIZE_RANGE, ALPHA_BITS,
> COMPRESSED_TEXTURE_FORMATS, DEPTH_BITS,
> MAX_COMBINED_TEXTURE_IMAGE_UNITS, MAX_CUBE_MAP_TEXTURE_SIZE,
> MAX_FRAGMENT_UNIFORM_VECTORS, MAX_RENDERBUFFER_SIZE,
> MAX_TEXTURE_IMAGE_UNITS, MAX_TEXTURE_SIZE, MAX_VARYING_VECTORS,
> MAX_VERTEX_ATTRIBS, MAX_VERTEX_TEXTURE_IMAGE_UNITS,
> MAX_VERTEX_UNIFORM_VECTORS, MAX_VIEWPORT_DIMS,
> NUM_COMPRESSED_TEXTURE_FORMATS, NUM_SHADER_BINARY_FORMATS,
> SAMPLE_BUFFERS, STENCIL_BITS, SUBPIXEL_BITS
> 
> I still think this is too tough. Dumbing everything down to lowest
> common denominator is too painful when that lowest common denominator
> is a cellphone.