Thread: WebGL readPixels failure on Ubuntu

  1. #1

    WebGL readPixels failure on Ubuntu

On Ubuntu 11.10, on both a desktop and a laptop machine with NVIDIA cards using the recommended NVIDIA proprietary driver, WebGL readPixels seems to fail in both Firefox and Chrome. I've implemented a standard mouse-picking algorithm: render to a hidden canvas with no lighting, with each object given a false color equal to its ID. The canvas is scissored (clipped) to the one pixel under the mouse, and readPixels reads the color of that pixel, which yields the ID of the object. This works fine in all WebGL-enabled browsers on Windows and Mac, but on Ubuntu I always get the background color to which the canvas was cleared at the start of the special render.

I've tried many different tests, including setting preserveDrawingBuffer to true (which should be irrelevant, since readPixels is executed before the render routine returns) and reading the color of the pixel under the mouse from the regular canvas (in which case there is no scissoring). In no case do I ever see a color other than the background color. I'd appreciate hearing from a developer who has made WebGL readPixels work properly on Ubuntu. Thanks.

    P.S. I'm developing a WebGL-based programming environment (glowscript.org) suitable for novice programmers, inspired by the similarly novice-friendly OpenGL-based VPython (vpython.org).
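For reference, the picking scheme described above can be sketched roughly as follows. This is a hedged illustration, not GlowScript's actual code: the `gl` context, the `drawScene` callback, and the helper names `idToColor`/`colorToId`/`pick` are all placeholders I've invented for the example.

```javascript
// Encode a numeric object ID as an RGB triple (supports up to 2^24 IDs).
function idToColor(id) {
  return [(id >> 16) & 0xff, (id >> 8) & 0xff, id & 0xff];
}

// Decode the RGB triple read back from the framebuffer into an ID.
function colorToId(rgb) {
  return (rgb[0] << 16) | (rgb[1] << 8) | rgb[2];
}

// Render the false-color scene clipped to one pixel and read it back.
// readPixels must run in the same callback as the draw: by default the
// drawing buffer may be cleared after compositing unless the context was
// created with preserveDrawingBuffer: true.
function pick(gl, drawScene, x, y) {
  // Note: readPixels coordinates have their origin at the bottom-left,
  // so a mouse y must be flipped: y = canvas.height - mouseY - 1.
  gl.clearColor(0, 0, 0, 1);                // background decodes to ID 0 ("nothing")
  gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
  gl.enable(gl.SCISSOR_TEST);
  gl.scissor(x, y, 1, 1);                   // clip rendering to the pixel under the mouse
  drawScene();                              // draw each object in its false color, no lighting
  gl.disable(gl.SCISSOR_TEST);
  const px = new Uint8Array(4);
  gl.readPixels(x, y, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, px);
  return colorToId([px[0], px[1], px[2]]);
}
```

The ID/color round trip is exact as long as IDs stay below 2^24 and the canvas has no antialiasing or alpha blending enabled during the pick pass, which could otherwise perturb the false colors.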

  2. #2

    Re: WebGL readPixels failure on Ubuntu

I made an extremely minimal program and readPixels works fine on Ubuntu. I still don't know why my full code works on Windows and Mac but not on Ubuntu, but I guess that's my problem. Perhaps it's some kind of timing issue.
