On Wed, Feb 4, 2015 at 4:01 PM, Roger Fong <email@example.com> wrote:
> Hello again folks,
> I have (another) potentially naive question, which may therefore be easy
> to answer.
> I'm working on some code to cover texture format validation in WebGL2 and
> I'm focusing right now on CopyTexImage2D.
> It seems easy to validate the format, internal format, and type combination
> if I know all 3. However, for CopyTexImage2D I need to get the format and
> type from table 3.15 on page 152 of the GLES3 spec.
> My question is, how do I determine my read buffer color format?
I think the paragraph beginning with "Otherwise the effective internal
format is determined by the row in table 3.17 or table 3.18..."
defines the rules for the "default" framebuffer -- the one allocated by
the WebGL implementation itself. Typically this will just be a texture
attached to a framebuffer that is inaccessible to the WebGL
application. Either way, it ought to be possible to query the sizes of
the RGBA channels and infer the effective internal format from them.
Does that answer your question?
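To make that concrete, here is a minimal sketch (Python, purely illustrative; the function name is made up and the table entries are only a small subset of table 3.17 in the GLES3 spec, covering unsigned-normalized formats) of how queried channel sizes could be mapped back to an effective internal format. In a real implementation the bit sizes would come from queries like getParameter(RED_BITS) or FRAMEBUFFER_ATTACHMENT_RED_SIZE on the read buffer:

```python
# Illustrative subset of GLES3 table 3.17: (red, green, blue, alpha)
# channel bit sizes -> effective internal format name. A real
# implementation would cover every row of the table, plus the
# float/integer cases in table 3.18.
EFFECTIVE_FORMATS = {
    (8, 0, 0, 0): "R8",
    (8, 8, 0, 0): "RG8",
    (8, 8, 8, 0): "RGB8",
    (5, 6, 5, 0): "RGB565",
    (4, 4, 4, 4): "RGBA4",
    (5, 5, 5, 1): "RGB5_A1",
    (8, 8, 8, 8): "RGBA8",
}

def effective_internal_format(red, green, blue, alpha):
    """Return the effective internal format for the given channel bit
    sizes, or None if they match no known row of the table (hypothetical
    helper name, not an actual WebGL API)."""
    return EFFECTIVE_FORMATS.get((red, green, blue, alpha))
```

With that in hand, validating a CopyTexImage2D call reduces to looking up the inferred format in table 3.15 to get the allowed format/type combinations.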
> I was expecting there to be enums associated with the 4 items in that column
> of the table and a matching query but that doesn't seem to be the case. What
> am I missing here?
> Also is this the right place to ask questions about implementation?