
Re: [Public WebGL] WEBGL_dynamic_texture redux



Hi Mark,

Thanks for putting together this sample. A few thoughts.

  - Keeping the video stream separate from the video element seems cleaner. I think we should avoid APIs which require mutating HTML elements, and in particular, adding new properties.

  - When thinking about creating streams for arbitrary canvases, not just video elements, how will the producerFrame be handled? Today, when JavaScript code modifies a 2D canvas, the results are made available (a) when the web page is composited, (b) when a readback API like toDataURL / getImageData is called, or (c) when WebGL uploads the canvas via texImage2D / texSubImage2D. It's a "pull" model. When a stream is connected, and the canvas is modified in the current JavaScript callback, should acquireImage() cause a flush of the upstream canvas and an update of the producerFrame at the same time? I think it should. Is there any issue with spec'ing this behavior?
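
    To make the question concrete, here is a sketch of the canvas-producer case I have in mind, using the OPTION 2 names from your sample. None of this is settled API, and the element id and drawing calls are just placeholders:

      // Hypothetical: a 2D canvas as the producer for an external texture.
      var canvas2d = document.getElementById("overlay"); // placeholder id
      var c2d = canvas2d.getContext("2d");
      var stream = gl.dte.createStream(gl);
      stream.connectProducer(canvas2d);

      function drawOneFrame() {
        // Producer side: modify the 2D canvas in this very callback.
        c2d.fillStyle = "#f00";
        c2d.fillRect(0, 0, 64, 64);

        // Consumer side: should this acquire flush the drawing above and
        // advance producerFrame, or only observe whatever was available
        // the last time the page was composited?
        if (stream.acquireImage()) {
          // ... draw with the external texture bound ...
          stream.releaseImage();
        }
      }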

  - You point out that cpc won't update when the tab is backgrounded, but there are other issues: (a) the WebGL app might decide to not produce new frames sometimes (if the scene isn't updating); (b) msc won't update if the browser decides not to repaint because the page didn't update at all; (c) for backgrounded tabs it's unlikely that msc will update, and requestAnimationFrame will stop, but setTimeout based timers will still probably fire. Are there issues with any of these behaviors? I think probably not; applications will use the page visibility API to know when they've been backgrounded and suspend any measurements of frame rate.
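
    For example, something along these lines with the Page Visibility API, reusing the latency bookkeeping names from your drawFrame sketch:

      // Suspend frame-rate / latency measurement while the tab is hidden,
      // so a stalled msc/cpc is not misread as "not keeping up".
      var measuring = !document.hidden;
      document.addEventListener("visibilitychange", function () {
        measuring = !document.hidden;
        if (!measuring) {
          latency.frameCount = 0; // discard partial measurements
          latency.accumValue = 0;
        }
      });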

  - Should this spec reference http://dvcs.w3.org/hg/webperf/raw-file/tip/specs/HighResolutionTime/Overview.html for the high-resolution timestamps instead of defining another concept?
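
    That spec's timestamp is monotonic and sub-millisecond, so a UST-style value could simply be a DOMHighResTimeStamp (milliseconds rather than nanoseconds), e.g.:

      // performance.now(): monotonic, sub-millisecond resolution, not
      // subject to NTP adjustments; could stand in for gl.dte.ustnow().
      var t0 = performance.now();
      // ... render a frame ...
      var elapsedMs = performance.now() - t0;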

-Ken


On Wed, Oct 24, 2012 at 12:20 AM, Mark Callow <callow.mark@artspark.co.jp> wrote:

Hi,

I am updating the WEBGL_dynamic_texture proposal to (a) provide a better interface to the stream producer (HTMLVideoElement, etc.) and (b) provide tools for handling timing and synchronization issues. Rather than writing spec. text I have been playing with sample code to see how various ideas feel. The entire sample program is attached. Please review it and send your feedback. Hopefully the embedded comments and IDL interface definitions give sufficient background for understanding.

(a) stemmed from David Sheets' comments to this list requesting that the stream interface be added to the producer HTML elements. The sample code offers two alternatives, shown in the extract below: augmenting the producer element with the stream interface, or keeping it as a separate object.

For (b), I've added query functions, based on a monotonically increasing counter, to retrieve the current value and to retrieve the value at the time the canvas was last presented (updated to the screen).
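
In sample-code terms (the names used in drawFrame below), that is roughly:

  var sync = gl.dte.getSyncValues(); // ust/msc/cpc at the last canvas present
  var now  = gl.dte.ustnow();        // current UST (nanoseconds, monotonic)
  var sincePresent = now - sync.ust; // nanoseconds since the last present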

The first part of the extract shows how the video producer and texture consumer are connected via a new wdtStream interface. The second part, the drawFrame function, shows acquisition and release of the frames, and also how to determine how long it is taking to display the frames, whether any are being missed, etc.

Once we're all happy with this, I'll update the spec. text and then I think we'll be able to move it from proposal to draft.

  //
  // connectVideo
  //
  // Connect video from the passed HTMLVideoElement to the texture
  // currently bound to TEXTURE_EXTERNAL_OES on the active texture
  // unit.
  //
  // First a wdtStream object is created with its consumer set to
  // the texture. Once the video is loaded, it is set as the
  // producer. This could potentially fail, depending on the
  // video format.
  //
  // interface wdtStream {
  //   enum state {
  //     // Consumer connected; waiting for producer to connect
  //     wdtStreamConnecting,
  //     // Producer & consumer connected. No frames yet.
  //     wdtStreamEmpty,
  //     wdtStreamNewFrameAvailable,
  //     wdtStreamOldFrameAvailable,
  //     wdtStreamDisconnected
  //   };
  //   // Time taken from acquireImage to posting drawing buffer; default 0?
  //   readonly int consumerLatency;
  //   // Frame # (aka Media Stream Count) of most recently inserted frame
  //   // Value is 1 at first frame.
  //   readonly int producerFrame;
  //   // MSC of most recently acquired frame.
  //   readonly int consumerFrame;
  //   // timeout for acquireImage; default 0
  //   int acquireTimeout;
  //
  //   void setConsumerLatency(int);
  // };
  //
 
  //
  function connectVideo(ctx, video)
  {
    g.loadingFiles.push(video);
    g.videoReady = false;

    //-----------------------------
    // Options for connecting to video
    //-----------------------------
    // OPTION 1: method on WDT extension augments video element
    // with a wdtStream object.
    ctx.dte.createStream(video);
    assert(video.wdtStream.state == wdtStreamConnecting);
    //-----------------------------
    // OPTION 2: method returns a stream object.
    g.vstream = ctx.dte.createStream(ctx);
    assert(g.vstream.state == wdtStreamConnecting);
    //-----------------------------

    // Set the producer once the video has loaded ('loadeddata' event
    // assumed here).
    video.onloadeddata = function() {
        g.loadingFiles.splice(g.loadingFiles.indexOf(video), 1);
        try {
          // OPTION 1: video object augmented with stream
          video.wdtStream.connect();
          assert(video.wdtStream.state == wdtStreamEmpty);
          //-----------------------------
          // OPTION 2: separate stream object
          g.vstream.connectProducer(video);
          assert(g.vstream.state == wdtStreamEmpty);
          //------------------------------
          if (!video.autoplay) {
            video.play(); // Play video
          }
          g.videoReady = true;
        } catch (e) {
          window.alert("Video texture setup failed: " + e.name);
        }
      };
  }
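
  For completeness, an application could also use the state enum to skip drawing until frames are actually flowing; a possible helper (not exercised in drawFrame below):

  function streamHasFrame(stream) {
    return stream.state == wdtStreamNewFrameAvailable ||
           stream.state == wdtStreamOldFrameAvailable;
  }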

  function drawFrame(gl)
  {
    // NOTE: lastFrame, latency & graphicsMSCBase must persist across calls
    // to drawFrame (e.g. live on g); they are declared here only to keep
    // the extract self-contained.
    var lastFrame = {};
    var latency = {};
    var syncValues;
    var graphicsMSCBase;

    // Make sure the canvas is sized correctly.
    reshape(gl);

    // Clear the canvas
    gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

    // Matrix set-up deleted ...

    // To avoid duplicating everything below for each option, use a
    // temporary variable. This will not be necessary in the final
    // code.
    // OPTION 1: augmented video object
    var vstream = g.video.wdtStream;
    // OPTION 2: separate stream object
    var vstream = g.vstream;

    // In the following
    //   UST is a monotonically increasing counter never adjusted by NTP etc.
    //   The unit is nanoseconds but the frequency of update will vary from
    //   system to system. The average frequency at which the counter is
    //   updated should be 5x the highest MSC frequency supported. For
    //   example if highest MSC is 48kHz (audio) the update frequency
    //   should be 240kHz. Most OSes have this kind of counter available.
    //
    //   MSC is the media stream count. It is incremented once/sample; for
    //   video that means once/frame, for audio once/sample. For graphics,
    //   it is incremented once/screen refresh.
    //
    //   CPC is the canvas presentation count. It is incremented once
    //   each time the canvas is presented.
    //
    
    if (graphicsMSCBase == undefined) {
        graphicsMSCBase = gl.dte.getSyncValues().msc;
    }

    if (lastFrame.msc && vstream.producerFrame > lastFrame.msc + 1) {
      // Missed a frame! Simplify rendering?
    }

    if (!latency.frameCount) {
      // Initialize
      latency.frameCount = 0;
      latency.accumValue = 0;
    }

    if (lastFrame.ust) {
      syncValues = gl.dte.getSyncValues();
      // interface syncValues {
      //     // UST of last present
      //     readonly attribute long long ust;
      //     // Screen refresh count (aka MSC) at last present
      //     // Initialized to 0 on browser start
      //     readonly attribute long msc;
      //     // Canvas presentation count at last present
      //     // Initialized to 0 at canvas creation.
      //     readonly attribute long cpc;
      // };
      // XXX What happens to cpc when the user switches to another tab?
      if (syncValues.msc - graphicsMSCBase != syncValues.cpc) {
        // We are not keeping up with screen refresh!
        // Or are we? If cpc increment stops when canvas hidden,
        // will need some way to know the canvas was hidden so the app
        // won't just assume it's not keeping up and therefore
        // adjust its rendering.
        graphicsMSCBase = syncValues.msc; // reset base.
      }
      latency.accumValue += syncValues.ust - lastFrame.ust;
      latency.frameCount++;
      if (latency.frameCount == 30) {
        vstream.setConsumerLatency(latency.accumValue / 30);
        latency.frameCount = 0;
        latency.accumValue = 0;    
      }
    }  

    if (g.videoReady) {
      // OPTION 1: augmented video object
      if (g.video.wdtStream.acquireImage()) {
        // Record UST of frame acquisition.
        // No such system function in JS so it is added to the extension.
        lastFrame.ust = gl.dte.ustnow();
        lastFrame.msc = vstream.consumerFrame;
      }
      // OPTION 2: separate stream object
      if (g.vstream.acquireImage()) {
        lastFrame.ust = gl.dte.ustnow();
        lastFrame.msc = g.vstream.consumerFrame;
      }
    }

    // Draw the cube
    gl.drawElements(gl.TRIANGLES, g.box.numIndices, gl.UNSIGNED_BYTE, 0);

    if (g.videoReady)
      vstream.releaseImage();

    // Show the framerate
    framerate.snapshot();

    currentAngle += incAngle;
    if (currentAngle > 360)
        currentAngle -= 360;
  }
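
  A minimal loop to drive drawFrame might look like the following sketch; it assumes g.gl holds the WebGL context, and the real sample may do this differently:

  function tick() {
    drawFrame(g.gl);
    window.requestAnimationFrame(tick);
  }
  window.requestAnimationFrame(tick);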

 

Regards

    -Mark

Please note that, due to the integration of management operations following establishment of our new holding company, my e-mail address has changed to callow.mark<@>artspark<.>co<.>jp. I can receive messages at the old address for the rest of this year but please update your address book as soon as possible.
--
NOTE: This e-mail may contain confidential information of HI Corporation. If you are not the intended recipient, copying, redistribution or use of the information is strictly prohibited. If you have received this e-mail in error, please delete it and notify the sender.

NOTE: This electronic mail message may contain confidential and privileged information from HI Corporation. If you are not the intended recipient, any disclosure, photocopying, distribution or use of the contents of the received information is prohibited. If you have received this e-mail in error, please notify the sender immediately and permanently delete this message and all related copies.