[Date Prev][Date Next][Thread Prev][Thread Next][Date Index][Thread Index]
Re: [Public WebGL] Lifetime of WebGL objects in Firefox and Webkit
- To: Glenn Maynard <firstname.lastname@example.org>
- Subject: Re: [Public WebGL] Lifetime of WebGL objects in Firefox and Webkit
- From: "Gregg Tavares (wrk)" <email@example.com>
- Date: Tue, 28 Jun 2011 15:30:46 -0700
- Cc: Benoit Jacob <firstname.lastname@example.org>, public webgl <email@example.com>
- In-reply-to: <BANLkTin4fUqgvyVg4fq198fwb0o2K5t65A@mail.gmail.com>
- List-id: Public WebGL Mailing List <public_webgl.khronos.org>
- References: <1076858097.337214.1309022066203.JavaMail.firstname.lastname@example.org> <788576513.337434.1309027073734.JavaMail.email@example.com> <BANLkTin4fUqgvyVg4fq198fwb0o2K5t65A@mail.gmail.com>
- Sender: firstname.lastname@example.org
On Tue, Jun 28, 2011 at 11:47 AM, Glenn Maynard <email@example.com> wrote:
As far as I can see, in both Firefox and Webkit, the WebGLRenderingContext stores a hash table of strong references to all the WebGL objects associated with this context, such as textures.
This means that unreferenced objects such as textures are kept alive until the WebGLRenderingContext itself gets destroyed. The only way to destroy a WebGL texture earlier than the context is to call deleteTexture explicitly.
It's easy to imagine (admittedly bad) JS code that would inadvertently cause a leak because of that. Imagine a video player that would create a new texture object for each frame, let it go out of scope and naively expect that to destroy it.
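A minimal sketch of that leaky pattern. Since a real WebGLRenderingContext needs a browser, a hypothetical stub context stands in for it here; the stub's strong-reference set mimics the hash table described above, so the effect is visible outside a browser:

```javascript
// Hypothetical stub standing in for a WebGLRenderingContext: it keeps a
// strong reference to every created texture, like the table described above.
const gl = {
  liveTextures: new Set(),
  createTexture() {
    const tex = {};
    this.liveTextures.add(tex); // context holds a strong reference
    return tex;
  },
  deleteTexture(tex) {
    this.liveTextures.delete(tex);
  },
};

// Leaky video-player pattern: a fresh texture per frame, never deleted.
function uploadFrame() {
  const tex = gl.createTexture();
  // ...gl.texImage2D(...) would upload the frame's pixels here...
  // tex goes out of scope here; the author naively expects GC to free it.
}

for (let i = 0; i < 100; i++) {
  uploadFrame();
}
console.log(gl.liveTextures.size); // 100 -- every texture is still alive
```

Even though no script-visible reference to any of those textures remains, the context's table keeps all 100 alive until the context itself is destroyed.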
It's hard to imagine a WebGL app that would run for long if it did anything like this. A 512 MB card would fill up after 500 images or fewer in an image viewer. A video player would run out of memory in probably 5-15 seconds.
On top of that, unless the implementation forces a GC when OpenGL runs out of memory, cleaning up on GC won't help.
This is definitely a bug. Objects should not hold implicit references to other objects.
A straightforward analogy is HTMLDocument (WebGLRenderingContext) to HTMLImageElement (WebGLTexture). If you remove an image from a document and discard the reference, the document must not keep a reference to the image around; the image must eventually be collected. You don't have a guarantee of when, but it's *not* tied to the lifetime of the document.
The existence of glDeleteTexture doesn't change this. Developers *should* delete textures explicitly, in order to reclaim texture memory deterministically, but if they don't do so the resource should still always be GC'd eventually, not kept in memory by the context.
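The explicit-deletion discipline recommended above can be sketched like this (again with a hypothetical stub context, since a real WebGLRenderingContext needs a browser; the stub only counts live textures):

```javascript
// Hypothetical stub context that counts live textures, mimicking the
// strong-reference table a real WebGLRenderingContext keeps.
const gl = {
  liveTextures: new Set(),
  createTexture() { const t = {}; this.liveTextures.add(t); return t; },
  deleteTexture(t) { this.liveTextures.delete(t); },
};

// Recycle one texture slot per frame: delete the old one before replacing it.
let frameTexture = null;
function uploadFrame() {
  if (frameTexture) {
    gl.deleteTexture(frameTexture); // reclaim GPU memory deterministically
  }
  frameTexture = gl.createTexture();
  // ...gl.texImage2D(...) upload would go here...
}

for (let i = 0; i < 100; i++) {
  uploadFrame();
}
console.log(gl.liveTextures.size); // 1 -- only the current frame's texture
```

In a real app, reusing one texture and re-uploading into it with texImage2D would be cheaper still, but the point here is that explicit deleteTexture keeps memory use bounded regardless of GC timing.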
The issue here seems straightforward: the WebGLRenderingContext's table should use weak references, keeping whatever extra information about the objects it needs without preventing the resources from being collected.
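In today's JavaScript, the weak-table bookkeeping described above could be sketched with WeakRef. (WeakRef is an ES2021 feature that did not exist when this was written, and a browser engine would implement this in C++, not script; this is purely illustrative of the idea.)

```javascript
// Illustrative weak table: the context records per-object metadata without
// keeping the object itself alive. All names here are hypothetical.
class TextureTable {
  constructor() {
    this.entries = new Map(); // id -> { ref: WeakRef, info }
    this.nextId = 1;
  }
  track(texture, info) {
    const id = this.nextId++;
    this.entries.set(id, { ref: new WeakRef(texture), info });
    return id;
  }
  // Look up a tracked texture; returns undefined once it has been collected,
  // because the WeakRef does not prevent collection.
  get(id) {
    const entry = this.entries.get(id);
    return entry && entry.ref.deref();
  }
}

const table = new TextureTable();
const tex = { name: "frame0" };
const id = table.track(tex, { width: 256, height: 256 });
console.log(table.get(id) === tex); // true while the texture is reachable
```

Once the script drops its last reference to `tex`, the table no longer keeps it alive, which is exactly the behavior being asked for here.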
Sure, an implementation should probably do this, but a WebGL app can't count on GC to save it from sloppy WebGL resource management.