
Re: [Public WebGL] WEBGL_dynamic_texture extension proposal




On Jul 5, 2012, at 12:47 AM, Mark Callow <callow_mark@hicorp.co.jp> wrote:

I have completed the initial draft of a proposed WEBGL_dynamic_texture extension. The purpose of this is to provide an API for handling textures with dynamic images such as video that can be implemented at the maximum efficiency possible on a given platform, including with zero copies. 

Please look at the issues list before posting any questions as it contains many of the questions you are likely to have and several answers.

You can find the proposal at:

http://www.khronos.org/registry/webgl/extensions/proposals/WEBGL_dynamic_texture

I welcome your comments and help to resolve all the listed issues.

Good first pass, Mark. Thanks.

There are a couple of inconsistencies:

1) The IDL says "dynamicTextureAcquireImage" and the description says "dynamicTextureAcquireFrame". Same for release.

2) dynamicTextureSetSource passes the texture in the IDL and description, but not in the example. Given the nature of OpenGL, it seems like you should not pass the texture, but rather operate on the currently bound texture. The same would be true of all the other API calls.
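
To illustrate with a rough sketch (the extension object and the exact parameter lists here are guesses for discussion, not the proposal's actual IDL): the existing WebGL texture calls all operate on the texture bound to the active texture unit, and the extension's calls could follow the same convention:

    var dtx = gl.getExtension("WEBGL_dynamic_texture");
    var tex = gl.createTexture();

    // Standard WebGL style: bind first, then every call operates on the
    // currently bound texture.
    gl.bindTexture(gl.TEXTURE_2D, tex);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);

    // Suggested bound-texture style for the extension. The names follow
    // the proposal; dropping the explicit texture argument is the
    // suggestion, and the remaining arguments are assumed.
    dtx.dynamicTextureSetSource(video);    // video: an HTMLVideoElement
    dtx.dynamicTextureAcquireImage();      // latch the current frame
    // ... draw calls that sample tex ...
    dtx.dynamicTextureReleaseImage();      // let the frame be recycled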


I am also concerned about timestamping. iOS and OS X have the concept of image queues, where a series of images is stored, each with a timestamp indicating when it should be displayed. So, for instance, a queue might contain several video frames, each with a timestamp 1/30th of a second after the previous one (assuming a 30 Hz frame rate). This allows the decoder to place images in the queue at a different (and possibly non-uniform) rate relative to the times at which each image is displayed.
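
To make the idea concrete, here is a toy sketch of such a queue (illustrative only; this is not any real iOS/OS X API). The decoder enqueues frames at its own pace, tagged with their intended display times, and the renderer later picks whichever frame is closest to the time it actually wants to present:

    var queue = [];

    // Called at the decoder's rate, which may be bursty or non-uniform.
    function enqueueFrame(image, displayTime) {
      queue.push({ image: image, time: displayTime });
    }

    // Called at the display's rate; returns the frame whose timestamp is
    // closest to the requested presentation time.
    function frameClosestTo(targetTime) {
      var best = null;
      for (var i = 0; i < queue.length; ++i) {
        if (best === null ||
            Math.abs(queue[i].time - targetTime) < Math.abs(best.time - targetTime)) {
          best = queue[i];
        }
      }
      return best;
    }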

Your proposal doesn't deal with timestamps, so it's possible to get out of sync with the decode rate, causing strobing or other undesirable effects. Maybe this can be solved by passing the desired timestamp via dynamicTextureAcquireImage? When passed, the frame with the closest timestamp would be acquired and locked. You'd probably also want to return the actual timestamp acquired to allow for throttling. Or maybe it would be better to return the timestamp of the next frame after the one acquired. That would make it easy to sync rendering with the video frame rate.
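
Something like the following, perhaps (purely hypothetical: the timestamp parameter and the return value are not in the current proposal, and the names are only for discussion):

    // Acquire the frame closest to wantedTime, draw with it, release it,
    // and report when the next frame is due so the caller can throttle.
    function drawFrameNear(gl, dtx, wantedTime) {
      var info = dtx.dynamicTextureAcquireImage(wantedTime);
      // info.acquiredTime: timestamp of the frame actually locked
      // info.nextTime:     timestamp of the frame that follows it
      // ... draw calls that sample the dynamic texture ...
      dtx.dynamicTextureReleaseImage();
      return info.nextTime;
    }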

Either way, there would need to be rules about frame availability. If I ask for a frame in the distant past, would it fail, or give me the oldest frame it has? After I release a frame with a given timestamp, are that frame and all older frames no longer available? Or is the lifetime of a frame something that is hidden?

At any rate, I think something like this will be needed to deal with timed media.

-----
~Chris Marrin
cmarrin@apple.com