
Re: [Public WebGL] WebGL-next

I'm somewhat in the same boat. I've got a lot of different possible shader configurations that are assembled and compiled at run time, usually when the user changes some options. It seems infeasible to compile/assemble all of them ahead of time.
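To illustrate what "assembled and compiled at run time" looks like in practice, here is a minimal sketch (the function, option, and define names are hypothetical, not from any real codebase): a variant is selected by prepending #define lines to a single shader template, and the assembled source goes straight to the GLSL compiler whenever the user changes an option.

```javascript
// Assemble a fragment shader variant from user-selected options by
// prepending #define lines to a shared template (hypothetical example).
const FRAGMENT_TEMPLATE = `
precision mediump float;
varying vec2 vUv;
uniform sampler2D uTex;
void main() {
#ifdef USE_BILINEAR
  // ... bilinear interpolation path ...
#else
  // ... nearest-neighbour path ...
#endif
  gl_FragColor = texture2D(uTex, vUv);
}`;

function buildFragmentSource(options) {
  const defines = [];
  if (options.bilinear) defines.push('#define USE_BILINEAR');
  if (options.fade)     defines.push('#define FADE_' + options.fade.toUpperCase());
  return defines.join('\n') + '\n' + FRAGMENT_TEMPLATE;
}

// At run time the assembled source goes straight to the driver's compiler:
function compileFragment(gl, options) {
  const shader = gl.createShader(gl.FRAGMENT_SHADER);
  gl.shaderSource(shader, buildFragmentSource(options));
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    throw new Error(gl.getShaderInfoLog(shader));
  }
  return shader;
}
```

Under a bytecode-only model, the `buildFragmentSource` step would additionally need a source-to-bytecode compiler available in the page before `shaderSource` could be called.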
Besides Shadertoy, another application I've come to like is the Firefox shader debugger. I guess that one wouldn't work with intermediate bytecode?
Personally I'd love to see both options: the GLSL compilers as they are now, and intermediate bytecode, letting developers choose what's best for the task at hand.

On 18/01/2015 at 08:24, Florian Bösch wrote:
On Sun, Jan 18, 2015 at 7:41 AM, Brandon Jones <bajones@google.com> wrote:
Regarding intermediate bytecode for shaders, I don't view this as a problem and actually view it as a positive for the web in general. Aside from being more efficient to process and easier to distribute, by having a bytecode, users can develop shaders in whatever language has a compiler available, while the web only has to understand one format. That puts us in roughly the same position as we're in with GLSL today, but with a more performance-friendly format.
Well, I don't mean a problem as in "it shouldn't be done". I think it should be done, and it's a good idea. It's a problem because it introduces a (hitherto unheard of) precompilation requirement on web technology (and so will surprise a whole lot of people).
It may prove to be a concern for Shadertoy-like sites that want to compile their shaders on the fly, but with the advent of emscripten we can be reasonably certain that any native compilers that get built can have a JS interface. Also, for the sake of clarity: the compile process shouldn't require a GPU. The bytecode will be standardized and not rely on any GPU specific instructions, so it should be perfectly feasible to run the compilers on any device with a CPU.
I think that characterizing it as "a concern for Shadertoy-like sites" isn't correct. Being able to deliver bytecode is beneficial, no two ways about it. However, the facility to provide source (and compile it to bytecode on the client) has to be present from day 0, no ifs, whens, or tooling. This is because many sites (among them Shadertoy-like ones, but also fractal explorers etc.) assemble a shader based on performance requirements and user settings. For instance, I recently engaged a client from the geomapping domain with a complex configuration requirement, where a shader needs to perform one of a dozen different flavors of interpolation, using one of 3 possible fading effects, one of 2 possible texture layout interpretations, and one of 2 possible texture packing formats, leading to (drumroll) a grand total of 12 × 3 × 2 × 2 = 144 possible shader combinations (only 2-3 of which are in actual use at any point in time). It would be a huge drag on bandwidth, build times, and code complexity to pre-assemble all combinations instead of assembling them in situ. And requiring an emscripten-compiled compiler as a dependency just to get back something that is hugely beneficial is a suggestion I find unsavory.
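The combinatorics above can be sketched as follows (the dimension counts come from the example in the text; the cache helper and its names are illustrative): the variant space multiplies out to 144, but a lazy cache means only the 2-3 combinations actually in use ever get assembled and compiled.

```javascript
// Dimensions of the hypothetical geomapping shader configuration.
const INTERPOLATIONS = 12; // flavors of interpolation
const FADES = 3;           // fading effects
const LAYOUTS = 2;         // texture layout interpretations
const PACKINGS = 2;        // texture packing formats

// The full variant space: 12 * 3 * 2 * 2 = 144 combinations.
const totalVariants = INTERPOLATIONS * FADES * LAYOUTS * PACKINGS;

// Compile variants lazily and cache them by configuration key, so only
// the handful of combinations in active use are ever compiled.
const cache = new Map();
function getProgram(key, compile) {
  if (!cache.has(key)) cache.set(key, compile(key));
  return cache.get(key);
}
```

Pre-building all 144 variants offline would mean shipping, versioning, and cache-busting 144 artifacts; the in-situ approach ships one template plus a few lines of assembly code.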