Hardware specifics: ATI
ATI is a major contributor to OpenGL's design.
Their GL support started around the time the Rage 128 was released.
Rage 128 = GL version 1.2, 2 tex units
The Radeon series came along, and it was clear ATI wanted to dominate the market; in other words, to take the crown from nVidia (TNT, TNT2, GeForce 256).
Radeon 7000 = GL version 1.3, 3 tex units
More improvements came with the 8500. It was probably limited to GL 1.3 because it could not support all of the core texture environment modes.
Radeon 8500 = GL version 1.3, 6 tex units, DXT compression
Vertex shaders (ARB_vertex_program), anisotropy (16x), and PN triangles make an appearance; PN triangle support was later removed during the Radeon 9700 generation.
Radeon 9000, 9100, 9200. I don't know how these differ from the 8500; perhaps only in GPU and memory clock rates.
Like the 8500, they could not support GL 1.4 or 1.5 as far as the specification goes, but came close.
There was a lot of buzz when the Radeon 9700 appeared. It was a DX9 part with very powerful shaders.
shader model 2.0 support (SM2)
Radeon 9700 = GL version 2.0, 8 tex image units, 16 texcoord interpolators, DXT compression,
anisotropy (16x), pn triangles killed off (bad performance), MRT support (4), FBO support added (no stencil support)
Radeon 9500, 9600, 9800 were essentially the same as the 9700
The 9500 has half the fragment pipes (4) compared to the 9700.
The 9500 was expensive to produce, so another 9500 series was released with a tuned-up GPU: fewer transistors and lower power consumption. The 9600 was tuned further but performed the same as or slower than the 9500. Some details are left out here.
X300, X600, X700.
These seem to be the same as the Radeon 9700 but with support for longer shaders.
I think this is called shader model 2.x support (SM2x).
X1300, X1600, X1800
These are supposed to have shader model 3.0 support (SM3).
All of the HD models, in other words everything starting with the HD 2xxx series, are SM 4.0 GPUs.
If your card doesn't support a certain GL version or a certain GL extension, the solution is to use a software renderer such as Mesa3D, or to buy a new card.
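To find out at runtime what a card supports, you can query the version and extension strings. Below is a minimal C sketch, assuming the pre-GL-3.0 interface where glGetString(GL_EXTENSIONS) returns one space-separated string; the helper has_extension is not a GL function, just an illustration of how to search that string correctly:

```c
#include <string.h>

/* Check whether `name` appears as a complete token in the
 * space-separated extension list returned by
 * glGetString(GL_EXTENSIONS). A plain strstr() is not enough:
 * "GL_EXT_texture" would also match inside
 * "GL_EXT_texture_compression_s3tc". */
int has_extension(const char *extlist, const char *name)
{
    size_t len = strlen(name);
    const char *p = extlist;
    while ((p = strstr(p, name)) != NULL) {
        /* token must start at the beginning or after a space */
        int starts = (p == extlist) || (p[-1] == ' ');
        /* token must end at a space or at the end of the string */
        int ends = (p[len] == ' ') || (p[len] == '\0');
        if (starts && ends)
            return 1;
        p += len;
    }
    return 0;
}
```

With a current GL context you would call it as, for example, has_extension((const char *)glGetString(GL_EXTENSIONS), "GL_EXT_framebuffer_object"); the GL version itself comes from glGetString(GL_VERSION).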
On nVidia, when you create a standard texture and you don't include

glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE);
glTexImage2D(...);

and instead you call

glTexImage2D(...);
glGenerateMipmapEXT(GL_TEXTURE_2D);

it works, because glGenerateMipmapEXT allocates the mipmap levels and generates them. On ATI/AMD, it doesn't work for some reason.
Someone has discovered that if you call glEnable(GL_TEXTURE_2D) just before glGenerateMipmapEXT, it works:

glTexImage2D(...);
glEnable(GL_TEXTURE_2D);
glGenerateMipmapEXT(GL_TEXTURE_2D);