The <type> parameter is not superfluous. It is the only way the OpenGL implementation knows what type of data the application is providing. For the integer case you cited, there are several integer formats that could be used to supply that data. They are not disambiguated by <format>.
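To make that concrete, here is a sketch (assuming an ES 2.0 context with a texture bound to GL_TEXTURE_2D; the pixel pointers are placeholders). With <format> = GL_RGB, only <type> tells the implementation whether each texel arrives as three separate bytes or as one packed 16-bit value:

```c
/* Same <format> (GL_RGB), two different client-memory layouts.
 * Only <type> tells the implementation which one it is reading. */

/* 3 bytes per texel: R, G, B as separate unsigned bytes */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 64, 64, 0,
             GL_RGB, GL_UNSIGNED_BYTE, pixels_888);

/* 2 bytes per texel: R, G, B packed into one unsigned short (5-6-5) */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 64, 64, 0,
             GL_RGB, GL_UNSIGNED_SHORT_5_6_5, pixels_565);
```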
For ES 2.0, unfortunately, <format> and <type> do not specify the internal format completely. The implementation has the freedom to choose the number of bits it uses to store each component.
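So a call like the following (again a sketch; pixels_8888 is a placeholder) pins down the layout of the client data but not the stored precision. Note that in ES 2.0 <internalformat> must equal <format>, so there is no way to request, say, 8 bits per component:

```c
/* The driver may store this upload as RGBA8888, RGBA4444, or some
 * other precision of its choosing; the API gives no control here. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 64, 64, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels_8888);
```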
The data absolutely has to match the type. If it does not, the application is likely to crash, and even if it does not crash, the resulting texture will be nonsense.
As I said when I started this thread (and again above), if you try that in native ES the app will most likely crash. The app has to first convert the data from an unsigned byte array to float.
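Something along these lines (a minimal sketch; upload_as_float and byte_data are illustrative names, and floating-point textures in ES 2.0 additionally require the OES_texture_float extension):

```c
#include <stdlib.h>
#include <GLES2/gl2.h>

/* byte_data holds width*height luminance values in 0..255. */
static void upload_as_float(const GLubyte *byte_data, int width, int height)
{
    GLfloat *converted = malloc((size_t)width * height * sizeof *converted);
    for (int i = 0; i < width * height; ++i)
        converted[i] = byte_data[i] / 255.0f;  /* normalize to [0, 1] */

    /* GL_FLOAT here assumes OES_texture_float is available. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0,
                 GL_LUMINANCE, GL_FLOAT, converted);
    free(converted);
}
```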