Textures - more

From OpenGL Wiki
Revision as of 16:00, 21 August 2009 by V-man (talk | contribs)

Internal and External

One long-standing source of confusion for newcomers is the way texture formats are handled by functions such as glTexImage2D and glTexImage3D.
There is the internal format and also the external format.
What makes it more confusing is that glTexImage2D and glTexImage3D take two parameters to define the external format.

  glTexImage2D(GL_TEXTURE_2D, 0, internalFormat, width, height, border, format, type, ptexels);

With internalFormat, you tell the GL driver how you want the texture to be stored on the GPU; for example, internalFormat might be GL_RGBA8.
If you look at the valid values for internalFormat, you will never find GL_BGRA8.
GL_RGBA8 doesn't truly mean that the GPU will store the texture in RGBA byte order. It might actually store it in BGRA order. This is typically done on all GPUs on the PC (Windows); we are pretty sure that none of those GPUs truly store it as GL_RGBA8.

So what happens if, instead of GL_RGBA8, you use GL_RGBA or perhaps 4? This is explained in the GL specification: the driver is free to select a storage format such as GL_RGBA8. The driver makes the decision for you, and it can even choose a higher precision if it wants, such as GL_RGBA16.

What about the external format?
The external format is defined by the format and type parameters.
For format, you could use GL_BGRA or GL_RGBA. Here, you cannot use GL_RGBA8.
For type, you can use GL_UNSIGNED_BYTE.
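Putting internal and external format together, a typical upload on Windows might look like the following sketch (the texture size and pixel buffer are made up for illustration):

```c
/* Sketch: upload a 256x256 image, asking the driver to store it with
   8 bits per channel (GL_RGBA8), while describing our source data as
   BGRA bytes, which matches what most PC GPUs prefer internally. */
static GLubyte pixels[256 * 256 * 4];  /* assumed filled elsewhere */
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D,
             0,                 /* mipmap level */
             GL_RGBA8,          /* internalFormat: how the GPU stores it */
             256, 256,          /* width, height */
             0,                 /* border */
             GL_BGRA,           /* external format of our data */
             GL_UNSIGNED_BYTE,  /* external type of our data */
             pixels);
```

This snippet assumes a valid GL context is current; it is an illustration of the parameter roles, not a complete program.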

The fun thing is that you aren't forced to give GL_BGRA and GL_UNSIGNED_BYTE to glTexImage2D.
You can try GL_RED and GL_FLOAT, or perhaps GL_RGBA and GL_UNSIGNED_INT, and the driver has no choice but to convert the data and then upload it to the GPU.
This makes GL drivers complicated, at least in the glTexImage2D part of the driver code.
This is one reason that in OpenGL ES (khronos.org), data conversion is not done at all; the driver simply gives you an error (via glGetError).
You must give the driver a format that is truly supported.
That keeps drivers for embedded systems, such as cellphones and PDAs, lightweight.
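To get a feel for the kind of work a desktop driver may do behind the scenes, here is a minimal CPU-side sketch (not actual driver code, and the function name is our own) of reordering RGBA pixels into BGRA:

```c
#include <stddef.h>

/* Sketch: convert tightly packed 8-bit RGBA pixels to BGRA in place.
   This is the sort of reordering a desktop GL driver may perform when
   the external format doesn't match the GPU's preferred byte order. */
static void rgba_to_bgra(unsigned char *pixels, size_t pixel_count)
{
    for (size_t i = 0; i < pixel_count; ++i) {
        unsigned char *p = pixels + i * 4;
        unsigned char r = p[0];
        p[0] = p[2];   /* B moves to byte 0 */
        p[2] = r;      /* R moves to byte 2 */
        /* G (byte 1) and A (byte 3) stay where they are */
    }
}
```

On ES, no such pass exists in the driver; if you need the conversion, you do it yourself before the upload.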

Max Texture Size

To know the max width and height of a 1D or 2D texture that your GPU supports:

 int value;
 glGetIntegerv(GL_MAX_TEXTURE_SIZE, &value);   //Returns 1 value

To know the max width, height, and depth of a 3D texture that your GPU supports:

 int value;
 glGetIntegerv(GL_MAX_3D_TEXTURE_SIZE, &value);

To know the max width and height of a cubemap texture that your GPU supports:

 int value;
 glGetIntegerv(GL_MAX_CUBE_MAP_TEXTURE_SIZE, &value);
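These limits are useful for validating texture sizes before an upload. A small helper along these lines (a sketch; the function name is our own) can clamp a requested dimension to the queried limit:

```c
/* Sketch: clamp a requested texture dimension to the GPU limit
   obtained from glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxSize).
   maxSize is assumed to have been queried as shown above. */
static int clamp_texture_size(int requested, int maxSize)
{
    if (requested < 1)
        return 1;          /* a texture dimension must be at least 1 */
    if (requested > maxSize)
        return maxSize;    /* too big for this GPU; shrink it */
    return requested;
}
```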

Max Texture Units

Never call

 glGetIntegerv(GL_MAX_TEXTURE_UNITS, &MaxTextureUnits);

because this is for the fixed-function pipeline, which is now deprecated. It would return a low value such as 4.
For GL 2.0 and onwards, use the following:

 glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS, &MaxTextureImageUnits);

The above would return a value such as 16 or 32 or above. That is the number of image samplers that your GPU supports in the fragment shader.

The following is for the vertex shader (available since GL 2.0). This might return 0 for certain GPUs.

 glGetIntegerv(GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS, &MaxVertexTextureImageUnits);

The following is for the geometry shader (available since GL 3.2)

 glGetIntegerv(GL_MAX_GEOMETRY_TEXTURE_IMAGE_UNITS, &MaxGeometryTextureImageUnits);

The following is the combined limit across the VS, GS, and FS (available since GL 2.0):

 glGetIntegerv(GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS, &MaxCombinedTextureImageUnits);

and the following is the number of texture coordinate sets available, which is usually 8:

 glGetIntegerv(GL_MAX_TEXTURE_COORDS, &MaxTextureCoords);
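As a sketch of how these limits come into play, binding one texture per unit might look like this (numTextures and the textures array are assumed to exist already, and a valid GL context must be current):

```c
/* Sketch: bind numTextures textures to consecutive texture units,
   after making sure we don't exceed the fragment-shader limit.
   'textures' is assumed to hold valid texture object names. */
GLint maxUnits;
glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS, &maxUnits);
if (numTextures > maxUnits)
    numTextures = maxUnits;            /* don't exceed the GPU's limit */
for (GLint i = 0; i < numTextures; ++i) {
    glActiveTexture(GL_TEXTURE0 + i);  /* select texture unit i */
    glBindTexture(GL_TEXTURE_2D, textures[i]);
}
```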