Re: [Public WebGL] WEBGL_dynamic_texture extension proposal
On 07/07/2012 06:17, Chris Marrin wrote:
I agree with the goal. Before I can specify this I need to
understand a complete picture of what the application needs to do.
If synchronizing with audio was not a consideration, I think the
triple buffering approach works just fine. The app would always get
the most recent frame available. If the renderer is running slower
than the video, a few frames will be missed; if it is running
faster, a few will be repeated. Either way the difference is
unlikely to be visible to the viewer. But we must deal
with synchronizing to audio.
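The triple-buffering policy described above can be sketched as follows. This is purely illustrative pseudocode for the discussion, not part of any proposed WebGL API: the class name, the `msc` field, and the push/latest methods are all assumptions made here for clarity.

```javascript
// Hypothetical sketch of the triple-buffering scheme discussed above:
// the decoder fills three slots with timestamped frames; the renderer
// always takes the most recently completed one, so frames may be
// skipped (renderer slower than video) or repeated (renderer faster).
class TripleBuffer {
  constructor() {
    this.slots = [null, null, null]; // decoded frames awaiting display
    this.writeIndex = 0;
  }
  // Decoder side: overwrite the oldest slot with a new frame.
  push(frame) {
    this.slots[this.writeIndex] = frame;
    this.writeIndex = (this.writeIndex + 1) % 3;
  }
  // Renderer side: return the newest frame available.
  latest() {
    let newest = null;
    for (const f of this.slots) {
      if (f && (!newest || f.msc > newest.msc)) newest = f;
    }
    return newest;
  }
}
```

Note that nothing here consults the audio clock; that is exactly the gap the rest of this message is about.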
I think our goal should be to allow full frame rate playback
of perfectly synced and paced video in a 3D scene. But to do
that we need to do better than just letting the video decoder
give us whatever frame happens to be ready. The renderer really
needs full awareness of media timing, which needs to be
communicated in both directions between the media provider and
the renderer.
You mentioned in your first message that iOS stores the images with
a timestamp of when they should be displayed. Are you sure? ISTM
that the application needs to control when the image will be
displayed. Are you sure the timestamp isn't what we used to call at
SGI (and in OpenML) the media stream counter (MSC). This indicates
the time of the frame relative to the beginning of the media
sequence. The audio stream has a similar counter and it is the
application's responsibility to make sure that video and audio
samples with matching MSCs emerge from the screen & speaker at
the same time.
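The MSC pairing described above amounts to simple rate arithmetic: each stream's MSC counts samples from the start of the sequence, so converting an MSC to media time via the stream's sample rate lets you find the video frame that belongs with a given audio sample. A minimal sketch, with function names invented here for illustration (this is not the OpenML API):

```javascript
// Convert a media stream counter (MSC) to media time in seconds,
// given the stream's rate (audio samples/s or video frames/s).
function mediaTime(msc, samplesPerSecond) {
  return msc / samplesPerSecond;
}

// Find the video MSC whose media time matches a given audio MSC,
// so the two samples can be scheduled to emerge together.
function videoMscForAudioMsc(audioMsc, audioRate, videoRate) {
  return Math.round(mediaTime(audioMsc, audioRate) * videoRate);
}
```

For example, at 48 kHz audio and 30 fps video, audio MSC 48000 (one second in) pairs with video MSC 30.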
To enable this, the application needs access to the MSCs. It also
needs a way to specify to the playback functions the time at which
the matching samples should emerge and it needs to know the delay
from scheduling the output to when it appears. To do this for
graphics, we added an extension to OpenGL. The name is mentioned in
the OpenML specification but I've not been able to find the
extension specification itself.
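The scheduling arithmetic this implies is straightforward once the output delays are known. A sketch, assuming (hypothetically) that the implementation can report a fixed latency for each output path, i.e. the delay between scheduling a sample and it emerging from the screen or speaker:

```javascript
// To make matching audio and video samples emerge at the same time,
// each side schedules its sample at the shared target emergence time
// minus that path's own output latency. All times in milliseconds;
// the latency values are assumed to be queryable, which is exactly
// the capability the text says the application needs.
function scheduleTimes(targetEmergenceTime, videoLatency, audioLatency) {
  return {
    video: targetEmergenceTime - videoLatency,
    audio: targetEmergenceTime - audioLatency,
  };
}
```

If video takes 50 ms from schedule to glass and audio 20 ms to the speaker, a frame/sample pair targeted to emerge at t=1000 must be scheduled at 950 and 980 respectively.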
When playing a <video> element the browser or some lower-level
playback engine will be handling synchronization. The WebGL
application needs some control over that in order to account for the
delays incurred by the 3D rendering.
I'll talk to some experts I know and see if I can get a handle on a
way to deal with this.