
Re: [Public WebGL] EXT_disjoint_timer_query disabled

On Sat, May 19, 2018 at 6:35 PM, Ken Russell <kbr@google.com> wrote:
Chrome has always returned microsecond resolution for these queries rather than nanosecond resolution. In discussion with the GLitch researchers, it seems likely that this reduction in precision is sufficient – and since no WebGL developer has ever complained about the low resolution of Chrome's timer queries, there's no need to make any changes to the precision.
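As a rough sketch of the kind of precision reduction Ken describes (this is only an illustration of quantizing a nanosecond value to microsecond granularity, not Chrome's actual implementation):

```javascript
// Sketch only: truncate a nanosecond timer-query result to microsecond
// resolution, discarding all sub-microsecond detail. Chrome's real
// clamping logic may differ; this just illustrates the idea.
function quantizeToMicroseconds(ns) {
  const NS_PER_US = 1000;
  return Math.floor(ns / NS_PER_US) * NS_PER_US;
}

console.log(quantizeToMicroseconds(1234567)); // 1234000
console.log(quantizeToMicroseconds(999));     // 0
```

Any two GPU operations whose true durations differ by less than a microsecond become indistinguishable after this step, which is exactly why sub-microsecond benchmarks would read as noise.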

Actually Ken...

I was running some tests on instruction speeds a while back, trying to figure out optimal codepaths with disjoint timer query, and I got nothing but noise. So I assumed it was all the same to the GPU and that drivers were smart enough to optimize even badly written GLSL into something good. Is it possible I was just measuring how much you fuzzed my results? Could you please not fuzz my results?