Re: [Public WebGL] about the VENDOR, RENDERER, and VERSION strings
----- Original Message -----
> So if we think that there are ~10 bits of information leaked through
> VENDOR/RENDERER/VERSION strings - the important question is how much
> that is also leaked through gl.GetParameter and other easily testable
I agree that's the next very important question. I don't expect to be able to bring the leakage down to 0 bits, but I don't have to, as a certain amount of information can be obtained by simple benchmarking anyway. I want to find a useful compromise, as I explained in earlier emails.
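To make the "bits of information" framing concrete, here is a small sketch of the arithmetic. The renderer-share numbers below are invented purely for illustration (not survey data): a value shared by half the population identifies only 1 bit, while a rare value identifies far more.

```javascript
// Hypothetical shares of users exhibiting each RENDERER string.
// These numbers are made up for illustration only.
const rendererShare = {
  "common integrated GPU": 0.5,
  "popular discrete GPU": 0.25,
  "older discrete GPU": 0.2,
  "rare workstation GPU": 0.05,
};

// Surprisal: how many identifying bits one observed value reveals.
function surprisalBits(probability) {
  return -Math.log2(probability);
}

// Shannon entropy: average bits revealed across the population.
function entropyBits(shares) {
  return Object.values(shares).reduce(
    (sum, p) => sum + p * surprisalBits(p),
    0
  );
}

console.log(surprisalBits(0.05).toFixed(2)); // a rare GPU reveals ~4.32 bits
console.log(entropyBits(rendererShare).toFixed(2));
```

The point of the sketch: the harm of a string depends on how its values are distributed, which is exactly why a large-scale survey (as suggested below in the quoted text) would be needed to settle the question.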
But there is still a quite large immediate benefit in disabling RENDERER alone, as RENDERER gives two important pieces of information that can't be obtained in any other way (afaik):
1) RENDERER tells whether the machine is a desktop or a laptop
2) RENDERER tells whether the machine is a hardcore gamer's machine (or was when it was purchased). This was noted earlier by Oliver. Even if a GeForce 8800 Ultra is not powerful by today's standards, it still indicates a gaming machine.
The VERSION string still gives away its ~4 bits of information, in a way that's completely orthogonal to what other getParameter calls give.
So disabling VENDOR/RENDERER/VERSION readily solves half of the problem. It's not a useless thing to do. Then I agree with the need to look further into the other getParameter calls to further reduce the problem.
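For reference, this is all it takes for a page to read those three strings today. A minimal sketch, assuming a browser with a WebGL context; the enum values and `getParameter` call are from the WebGL 1.0 API, the function name is our own:

```javascript
// Read the three driver-identifying strings from a WebGL context.
// VENDOR (0x1F00), RENDERER (0x1F01) and VERSION (0x1F02) are the
// standard WebGL/GLES enums, queried via getParameter in WebGL.
function readDriverStrings(gl) {
  return {
    vendor: gl.getParameter(gl.VENDOR),
    renderer: gl.getParameter(gl.RENDERER),
    version: gl.getParameter(gl.VERSION),
  };
}

// In a browser:
//   const gl = document.createElement("canvas").getContext("webgl");
//   console.log(readDriverStrings(gl));
```

Disabling the strings means these three calls stop returning anything hardware-specific, which is the half of the problem addressed above.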
> If that number is anywhere close to 10 bits, then simply disabling the
> VENDOR/RENDERER/VERSION strings just makes life inconvenient for the good
> guys without buying us anything in terms of privacy from the bad guys.
> The only way to fix THAT would be to dumb everything down to the
> common denominator...and IMHO, that's suicide for WebGL...or at least
> whichever browser does it if others do not.
> There certainly COULD be 10 bits of information here - but it's hard to
> tell how much variability there is out there without doing some kind of
> large-scale survey. Also, we don't know how well that correlates with
> other bits of data that can already be obtained. We'd also have to accept
> that every time we introduce a WebGL extension (some of which are too
> important to miss out on) or provide any other kind of optional feature,
> we're leaking more data. Adding extensions to WebGL would
> be the strongest 'signal' we could be sending.
> The values that seem most useful are:
> ALIASED_LINE_WIDTH_RANGE, ALIASED_POINT_SIZE_RANGE, ALPHA_BITS,
> COMPRESSED_TEXTURE_FORMATS, DEPTH_BITS,
> MAX_CUBE_MAP_TEXTURE_SIZE, MAX_FRAGMENT_UNIFORM_VECTORS,
> MAX_RENDERBUFFER_SIZE, MAX_TEXTURE_IMAGE_UNITS, MAX_TEXTURE_SIZE,
> MAX_VARYING_VECTORS, MAX_VERTEX_ATTRIBS,
> MAX_VERTEX_UNIFORM_VECTORS, MAX_VIEWPORT_DIMS,
> NUM_COMPRESSED_TEXTURE_FORMATS, NUM_SHADER_BINARY_FORMATS,
> STENCIL_BITS, SUBPIXEL_BITS
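To illustrate the concern in the quoted list: even with the driver strings disabled, these values remain queryable and can be folded into a fingerprint. A minimal sketch using a subset of the list (the simple numeric and range-valued parameters; the serialization format is an arbitrary choice of ours, not anything specified):

```javascript
// A subset of the implementation-dependent WebGL 1.0 limits from the
// list above; each is queryable via getParameter.
const FINGERPRINT_PARAMS = [
  "ALIASED_LINE_WIDTH_RANGE", "ALIASED_POINT_SIZE_RANGE", "ALPHA_BITS",
  "DEPTH_BITS", "MAX_CUBE_MAP_TEXTURE_SIZE",
  "MAX_FRAGMENT_UNIFORM_VECTORS", "MAX_RENDERBUFFER_SIZE",
  "MAX_TEXTURE_IMAGE_UNITS", "MAX_TEXTURE_SIZE", "MAX_VARYING_VECTORS",
  "MAX_VERTEX_ATTRIBS", "MAX_VERTEX_UNIFORM_VECTORS",
  "MAX_VIEWPORT_DIMS", "STENCIL_BITS", "SUBPIXEL_BITS",
];

// Serialize every queried limit into one string; array-valued
// parameters (e.g. MAX_VIEWPORT_DIMS) stringify to comma-separated
// lists. The resulting string is the fingerprint.
function capabilityFingerprint(gl) {
  return FINGERPRINT_PARAMS
    .map((name) => `${name}=${String(gl.getParameter(gl[name]))}`)
    .join(";");
}
```

How many bits this string actually carries in practice is exactly the open question: it depends on how many distinct value combinations exist across real hardware, which only a survey could answer.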
> I still think this is too tough. Dumbing everything down to the lowest
> common denominator is too painful when that lowest common denominator is a