Re: [Public WebGL] Loading WebGLBuffer from an HTMLImageElement
On 04/14/2011 08:56 PM, Kenneth Russell wrote:
> On Tue, Apr 12, 2011 at 12:34 PM, Glenn Maynard <firstname.lastname@example.org> wrote:
>> On Tue, Apr 12, 2011 at 1:48 PM, Kenneth Russell <email@example.com> wrote:
>>> Uploading an image to be used as vertex data is a hack. You'll only be
>>> guaranteed lossless data transfer with PNGs. I don't think we would
>>> want to add these entry points in the general case as they would
>>> promote poor application development practices. Why do you not want to
>>> download binary data?
>> In this case, I want to load a user-provided image to generate a
>> histogram. The user-supplied image may be a remote URL, which I can
>> load and manipulate but not access directly due to same-origin restrictions.
>> The first pass of that (per channel) would be to use the pixel data to
>> generate points. It could use vertex shader texture sampling, but
>> there's no guarantee that any vertex texture units exist (and most
>> current implementations don't implement them at all), so using a
>> vertex buffer here eliminates the dependency on optional requirements.
>> (It's also simpler to implement, and probably faster.)
>> (I'm not interested in weird hacks like, say, loading mesh data from a
>> PNG. That's not very useful--PNG won't be good at compressing it, and
>> you'd need to compound the hack badly to get anything other than 8-bit
>> integers out of it.)
> Thanks, I understand your use case now. However, I am still concerned
> that adding APIs to upload images, canvases and video elements into
> (vertex) buffers will encourage gross misuse.
> I think it should be possible for you to achieve your use case by
> accumulating summary statistics in a fragment shader while repeatedly
> downsampling the image. I've spoken with colleagues that have computed
> mean and standard deviation of an image's brightness in a fragment
> shader using this technique. Could you give this approach some thought?
That technique is used almost universally in high dynamic range (HDR)
lighting applications - which includes most games. It works very well.
I've personally used it to construct histograms of image brightness (we
were making a simulation of an advanced thermal imaging camera for the
US Air Force).
The difficulty in WebGL is that we only have access to a single "render
target" - which means we can write out only four numbers (R, G, B, A)
from the fragment shader in each pass. So if you want a four band
histogram, it's easy. If you needed (say) 16 bands, then you'd have to
do the calculation four times - gathering 4 bands of the histogram on
each pass.
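To make the "four bands per pass" constraint concrete, here is a minimal sketch (names like `passesForBands`, `u_bandLo` and `u_bandHi` are my own, not from the thread) of a per-pass fragment shader that lights up one RGBA channel per band, plus the trivial pass-count arithmetic:

```javascript
// With a single render target, one pass can gather at most 4 bands (R,G,B,A).
function passesForBands(bands) {
  return Math.ceil(bands / 4);
}

// Hypothetical fragment shader for one pass: each output channel is 1.0
// iff the sample's luminance falls inside that channel's band range.
// Summing these outputs (via blending or the downsampling passes) yields
// the per-band counts.
const bandPassShader = `
  precision mediump float;
  uniform sampler2D u_image;
  uniform vec4 u_bandLo;   // lower edges of the 4 bands for this pass
  uniform vec4 u_bandHi;   // upper edges of the 4 bands for this pass
  varying vec2 v_texCoord;
  void main() {
    vec3 rgb = texture2D(u_image, v_texCoord).rgb;
    float luma = dot(rgb, vec3(0.299, 0.587, 0.114));
    // step(lo, x) * (1 - step(hi, x)) is 1.0 exactly when lo <= x < hi
    gl_FragColor = step(u_bandLo, vec4(luma)) *
                   (vec4(1.0) - step(u_bandHi, vec4(luma)));
  }
`;
```

So a 16-band histogram needs `passesForBands(16)` = 4 passes over the image, each with a different set of band edges.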
Also, the precision of your results will be limited if you're running on
a cellphone or other device with only 16-bit pixels - you really need to
use the floating-point texture extension (OES_texture_float).
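A small sketch of that precision fallback (the helper name `pickAccumulationType` is hypothetical; `OES_texture_float` is the standard WebGL 1 extension string):

```javascript
// Choose the texture type for the accumulation render targets: float
// textures when the extension is available, otherwise fall back to 8-bit
// textures and accept reduced histogram precision.
function pickAccumulationType(gl) {
  const floatExt = gl.getExtension('OES_texture_float');
  return floatExt ? gl.FLOAT : gl.UNSIGNED_BYTE;
}
```

You would then pass the returned type to `gl.texImage2D` when creating the intermediate textures for each reduction stage.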
However, rendering images that are progressively smaller than the input
image (maybe 4x4=16 times smaller with each reduction stage) means that
you can do a LOT of passes and still get superb frame-rates. If your
input image is 1024x1024 - then you render quads at 256x256, 64x64,
16x16, 4x4 and finally 1x1 - so you get your results after drawing just
5 quadrilaterals per four histogram samples. If you 'atlas' the results
at the lower resolutions, you can batch the quads together - and it's
AMAZINGLY fast, even if you want a 256 band histogram.
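The reduction schedule above can be sketched in a few lines (a hypothetical helper, shrinking by 4x per side, i.e. 16x fewer pixels, per stage):

```javascript
// List the render-target sizes for the downsampling chain: each stage is
// `factor` times smaller per side than the previous one, ending at 1x1.
function reductionChain(size, factor = 4) {
  const sizes = [];
  for (let s = size; s >= 1; s = Math.floor(s / factor)) {
    sizes.push(s);
    if (s === 1) break;
  }
  return sizes;
}
// For a 1024x1024 input this yields stages 1024, 256, 64, 16, 4, 1 -
// i.e. five rendered quads after the full-size input.
```

Each stage's fragment shader sums (or otherwise accumulates) the 4x4 block of texels from the previous stage, so the 1x1 result holds the totals for that pass's four bands.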