
Re: [Public WebGL] String constructors for arrays?

On Tue, Dec 22, 2009 at 7:08 AM, Vladimir Vukicevic
<vladimir@mozilla.com> wrote:
> I was reading Ilmari's post here --
> http://fhtr.blogspot.com/2009/12/3d-models-and-parsing-binary-data-with.html
> -- and thought that it might be interesting to support constructors on
> the WebGL array types that take a string as input data.  The 8-bit types
> would take only the low 8 bits of each character value, and for completeness
> the 32-bit types could sign-extend to 32 bits.  This would make it much
> easier to work with data read from the net until we have better native
> support for interacting with truly binary data, because dealing with it in
> string form is pretty common.  Feature creep or useful?
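
For reference, the proposed semantics could be approximated today in plain JS. This is an illustrative sketch only; `stringToBytes` is a made-up name, not part of any spec, and the low-8-bits masking is as described in the proposal above:

```javascript
// Build an array of byte values from a string by keeping only the
// low 8 bits of each character code, as the proposal describes.
function stringToBytes(str) {
  var bytes = new Array(str.length);
  for (var i = 0; i < str.length; i++) {
    bytes[i] = str.charCodeAt(i) & 0xFF; // discard everything above bit 7
  }
  return bytes;
}
```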

What about big-endian vs little-endian? Signed vs unsigned? Floats,
half-floats, truncated 16-bit floats, fixed point numbers? It seems
like this API would be either very specific to a handful of data
representations (and therefore useless when people want their data to
be even slightly different), or it would be a large complex
general-purpose binary number parsing system (and therefore harder to
understand and use, and it shouldn't be defined as part of a canvas
context because it's out of scope).
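
To make the ambiguity concrete, here is a hedged sketch (the function name and signature are mine, purely for illustration) showing how the same two characters decode to different values depending on the endianness and signedness a reader assumes:

```javascript
// Read a 16-bit integer from two characters of a string, treating each
// character's low 8 bits as one byte. The caller must choose byte order;
// the result is sign-extended to a signed value.
function readInt16(str, offset, littleEndian) {
  var lo = str.charCodeAt(offset) & 0xFF;
  var hi = str.charCodeAt(offset + 1) & 0xFF;
  var u = littleEndian ? (hi << 8) | lo : (lo << 8) | hi;
  return u >= 0x8000 ? u - 0x10000 : u; // sign-extend from 16 bits
}
```

The same input "\u0001\u0080" yields -32767 as little-endian signed, but 384 as big-endian signed, which is exactly why a single fixed interpretation baked into a constructor would fit some formats and not others.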

With a very rough test (parsing and summing 4-byte ints), I see
speeds of about 2 million
ints/sec in Firefox 3.5, and 14 million ints/sec in Opera 10.50 - so
it's possible for pure JS to be pretty fast, and the bottleneck is
likely to be network bandwidth.
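
A sketch of the kind of test described, assuming little-endian 4-byte ints packed into a string one byte per character (the exact benchmark code isn't given above, so this is only a plausible reconstruction):

```javascript
// Parse a string as a sequence of little-endian 32-bit integers
// (one byte per character, low 8 bits) and sum them in pure JS.
function sumInt32LE(str) {
  var sum = 0;
  for (var i = 0; i + 3 < str.length; i += 4) {
    var v = (str.charCodeAt(i) & 0xFF) |
            ((str.charCodeAt(i + 1) & 0xFF) << 8) |
            ((str.charCodeAt(i + 2) & 0xFF) << 16) |
            ((str.charCodeAt(i + 3) & 0xFF) << 24); // << 24 makes v signed
    sum += v;
  }
  return sum;
}
```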

So I'd be happier to keep binary parsing as plain JS code, and focus
any browser implementation effort on optimising JS engines for this
kind of use, since that minimises spec complexity and maximises
flexibility for authors.

Philip Taylor
