Texture Sampling

After Compiling Your Shader

To set up texture sampling, you need to get the locations of the samplers after your shader has been compiled and linked.
The program does not need to be bound when you call glGetUniformLocation.
Let's assume you have samplers called MyDiffuseTexture, MyEnvironmentMap and MyGlossMap in your fragment shader:

 uniform sampler2D MyDiffuseTexture;
 uniform samplerCube MyEnvironmentMap;
 uniform sampler2D MyGlossMap;

In your code, get the locations

 GLint the_location0 = glGetUniformLocation(ProgramObject, "MyDiffuseTexture");
 GLint the_location1 = glGetUniformLocation(ProgramObject, "MyEnvironmentMap");
 GLint the_location2 = glGetUniformLocation(ProgramObject, "MyGlossMap");

By default, each of those sampler uniforms is 0, so the sampler2D and samplerCube both reference texture unit 0. If you use the program as is, an error will be raised at draw time, because samplers of different types may not point to the same texture unit. You would need to call glGetError() to detect it. The program is also considered invalid:

 GLint isValid;
 glValidateProgram(ProgramObject);
 glGetProgramiv(ProgramObject, GL_VALIDATE_STATUS, &isValid);
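
If validation fails, the driver usually explains why in the program's info log. A minimal sketch of reading it, continuing from the block above (the buffer size is an arbitrary choice):

 if (isValid == GL_FALSE)
 {
     char infoLog[1024];   //arbitrary size; long enough for typical driver messages
     glGetProgramInfoLog(ProgramObject, sizeof(infoLog), NULL, infoLog);
     fprintf(stderr, "validation failed: %s\n", infoLog);
 }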

So, to set up those uniforms, bind the program and call glUniform1i, since samplers are set as integers:

 glUseProgram(ProgramObject);
 //MyDiffuseTexture samples from texture unit 0
 glUniform1i(the_location0, 0);
 //MyEnvironmentMap samples from texture unit 1
 glUniform1i(the_location1, 1);
 //MyGlossMap samples from texture unit 2
 glUniform1i(the_location2, 2);
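
Keep in mind that glGetUniformLocation returns -1 for a uniform that doesn't exist or was optimized away, and glUniform1i silently ignores a location of -1. A defensive sketch (SetSamplerUnit is a hypothetical helper, not a GL function; call it with the program bound):

 void SetSamplerUnit(GLuint program, const char *name, GLint unit)
 {
     GLint loc = glGetUniformLocation(program, name);
     if (loc == -1)
         fprintf(stderr, "warning: sampler %s is not active\n", name);
     glUniform1i(loc, unit);   //silently ignored when loc is -1
 }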

To bind your textures, select each texture unit with glActiveTexture and bind with glBindTexture:

 glActiveTexture(GL_TEXTURE0);
 glBindTexture(GL_TEXTURE_2D, tex[0]);
 glActiveTexture(GL_TEXTURE1);
 glBindTexture(GL_TEXTURE_CUBE_MAP, tex[1]);
 glActiveTexture(GL_TEXTURE2);
 glBindTexture(GL_TEXTURE_2D, tex[2]);
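
Note that the value passed to glUniform1i is a plain unit index, not an enum: unit i corresponds to glActiveTexture(GL_TEXTURE0 + i). A sketch of the same bindings done in a loop (the targets array is an assumption that mirrors the sampler types above):

 GLenum targets[3] = { GL_TEXTURE_2D, GL_TEXTURE_CUBE_MAP, GL_TEXTURE_2D };
 int i;
 for (i = 0; i < 3; i++)
 {
     glActiveTexture(GL_TEXTURE0 + i);   //GL_TEXTURE0, GL_TEXTURE1, GL_TEXTURE2
     glBindTexture(targets[i], tex[i]);
 }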

In The GLSL Shader

So what happens when you sample the texture?
When you sample it, you get a normalized value if the texture is some fixed-point format. By fixed point, we mean normalized integer component types: GL_BYTE, GL_UNSIGNED_BYTE, GL_SHORT, GL_UNSIGNED_SHORT, GL_INT and GL_UNSIGNED_INT.
These would be internal formats like GL_RGBA8, GL_LUMINANCE8, GL_ALPHA8, GL_LUMINANCE16 and the many other formats listed in the GL spec.

 vec4 texel0 = texture2D(MyDiffuseTexture, TexCoord0);
 vec4 texel1 = textureCube(MyEnvironmentMap, TexCoord1);
 vec4 texel2 = texture2D(MyGlossMap, TexCoord2);
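
How you combine those samples is up to your shading model. As one illustrative example (this particular weighting is an assumption, not part of the setup above), a gloss map often masks the environment reflection:

 gl_FragColor = texel0 + texel2 * texel1;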

The values returned by those texture2D and textureCube lookups will be in the range 0.0 to 1.0 for each RGBA component.
The GPU reads the integer components and converts them to floats automatically.
If, for some reason, you don't want this conversion to happen, see http://www.opengl.org/wiki/index.php/GL_EXT_texture_integer
That extension is available on GeForce 8 and up.
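
The automatic conversion for an unsigned format simply divides the stored value by the maximum the component type can hold. A quick sketch of what the hardware computes for an 8-bit component:

 unsigned char c = 128;               //stored texel component
 float f = (float)c / 255.0f;         //sampled value: 128/255 = 0.50196...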

If the texture is a floating-point format such as GL_RGBA32F, then since the components are already 32-bit floats, no conversion takes place.
If the texel has a value like -489.5, then that's what you get.
If you are using GL_RGBA16F, the 16-bit floats get upconverted (cast) to 32-bit floats.
If you don't want that conversion, do the following:

 half4 texel0 = texture2D(My16BitFloatTexture, TexCoord0);

That works on NVIDIA, but you must make sure not to declare any version number in your shader, or else the NVIDIA driver will consider half4 undefined.
half, half2, half3 and half4 are not defined by the GLSL standard as of GL 2.1.

 #version 110

Don't put the above at the top of your shader if your shader doesn't follow the standard!
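
If you want to stay within the standard, just use vec4 and accept the conversion to 32-bit float. A minimal conformant fragment shader, assuming the same sampler and texture coordinate names as above:

 #version 110
 uniform sampler2D My16BitFloatTexture;
 varying vec2 TexCoord0;
 void main()
 {
     vec4 texel0 = texture2D(My16BitFloatTexture, TexCoord0);
     gl_FragColor = texel0;
 }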