
Re: [Public WebGL] Information leakage and the extension registry.



> ----- Original Message -----
>> If we are really serious about stopping information leakage then
>> having
>> even a few individual queryable extensions is a terrible idea -
>
> Please, please, let's not have this discussion again!

We don't have an acceptable answer yet...it's not "again".

> So I'm *not* trying to have zero leakage. I'm just trying to keep the
> leakage limited. In this respect, it's relatively OK to have some
> extensions and getParameter()'s. I'd be surprised if one could extract
> more than 5 or 6 bits from all that, because these things are mutually
> correlated.

Look at this numerically:

World Internet user population: 2 billion - roughly 31 bits.
Current leakage: 20 bits.
Margin left to avoid server-side evercookies: about 11 bits.

You're proposing to give up 5 or 6 bits of that margin - meaning that the
bad guys can narrow an individual down to one person in 32 to 64 or so
(right now, it's one person in 2048 or so).  I believe even that is an
underestimate today...but let's suppose you're right.
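
To make the arithmetic explicit, here's a back-of-the-envelope sketch in
TypeScript (the population and leakage figures are the same rough
estimates as above, nothing more precise):

    // Rough anonymity-set arithmetic: leaking N bits of fingerprint
    // splits a population of P users into buckets of about P / 2^N.
    function anonymitySetSize(population: number, leakedBits: number): number {
      return population / Math.pow(2, leakedBits);
    }

    const population = 2e9;               // ~2^31 users
    anonymitySetSize(population, 20);     // ~1900: today's "one in 2048 or so"
    anonymitySetSize(population, 20 + 6); // ~30: after another 5-6 bits leak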

As the years go by, that margin will be eroded to zero by a combination of
(a) new WebGL extensions and (b) other browser features that leak
information.  Some of those - such as geometry shaders and the Mozilla
extensions for touch-screen detection - are already coming, so pretty
soon it won't be 5 or 6 bits but 7 or 8, other browser improvements will
add several more bits, and the game is over.

If we continue to work this way, we WILL lose the battle against
server-side evercookies within a year.

So we need to take a cold, dispassionate view here, and either admit
defeat and stop with this business of shutting off important information
to the application - or come up with a different way to represent
hardware capabilities, one that hides MUCH more information and permits
future growth without more leakage.  One such trick is to group a bunch of
capabilities into a single "level of performance" indicator that your
hardware either meets or doesn't meet - much as Microsoft do with DirectX
versioning and "shader model N" - and to mark old hardware as "obsolete"
so that we don't have to describe it to the application at all.  A sketch
of that idea follows.
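
Here's a hypothetical sketch of what that could look like.  The particular
limits queried and the threshold values are invented purely for
illustration - this is not a proposal for the actual spec:

    // Hypothetical: collapse many individually-queryable limits into one
    // coarse "capability level", so the application learns a single small
    // number instead of a dozen fingerprintable values.
    // The thresholds below are invented for illustration only.
    function capabilityLevel(gl: WebGLRenderingContext): number {
      const maxTex      = gl.getParameter(gl.MAX_TEXTURE_SIZE) as number;
      const maxVaryings = gl.getParameter(gl.MAX_VARYING_VECTORS) as number;
      const maxUniforms = gl.getParameter(gl.MAX_FRAGMENT_UNIFORM_VECTORS) as number;

      // Each level is a bundle of minimums the hardware either meets or
      // doesn't - like DirectX's "shader model N".
      if (maxTex >= 8192 && maxVaryings >= 16 && maxUniforms >= 256) return 3;
      if (maxTex >= 4096 && maxVaryings >= 8  && maxUniforms >= 64)  return 2;
      return 1; // everything weaker is lumped together as "obsolete"
    }

The nice property: with three levels, the whole capability story leaks at
most two bits, no matter how many individual limits feed into it.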

> By contrast, the RENDERER/VERSION strings give 13 bits right
> away, and moreover allow user categorization as Oliver noted.

But once you've leaked more than another couple of bits, that's
irrelevant, because you've already leaked enough to give away the farm.
Rationally, we don't give a damn whether the total leakage is 31 bits or a
thousand bits...but we (perhaps) do care whether we can keep it below
(say) 24 bits.

> Furthermore,
> as Vladimir noted, the RENDERER string had the (to many people, even more
> important) problem that it was going to create a new user-agent-string
> epic.

Exposing renderer strings is the way both DirectX and OpenGL have always
worked on the desktop - and it hasn't caused any terrible grief there.

> So I don't have a particular problem with extensions as they currently are
> looking like.

I don't see how you can reconcile that view with the simple numerical
truth above.

  -- Steve


