In several applications, ranging from medical imaging to terrain modeling, the original data to be visualized consist of single-channel (grayscale) maps that require more than 8 bits for usable precision. These cases call for LUMINANCE16 as the texture format and UNSIGNED_SHORT as the datatype.
Since WebGL targets span from mobile devices to powerful graphics workstations, support for LUMINANCE16 should be offered as an extension, initially available only on desktops and high-end laptops.
The workaround is to "encode" 16-bit values into multiple 4- or 8-bit channels, but this incurs significant performance overhead, since the values need to be "decoded" in shaders. Additionally, texture filtering also has to be implemented in shader code.
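As a concrete illustration of this workaround, here is a minimal sketch (not from the original discussion; the function names are hypothetical) of packing a 16-bit sample into two 8-bit channels on the CPU, and the normalization arithmetic a fragment shader would have to perform to decode it from the sampled 0..1 channel values:

```javascript
// Pack one 16-bit value into two 8-bit channels: high byte in one
// channel (e.g. R of a LUMINANCE_ALPHA or RG pair), low byte in the other.
function packL16(value) {
  return [value >> 8, value & 0xff]; // each component in 0..255
}

// Decode as a shader would from normalized channel values, i.e. the GLSL
//   float v = (hi * 255.0 * 256.0 + lo * 255.0) / 65535.0;
// expressed in JavaScript for illustration.
function decodeL16(hi, lo) {
  return (hi * 255 * 256 + lo * 255) / 65535; // normalized 0..1
}

const [hi, lo] = packL16(51400);
const v = decodeL16(hi / 255, lo / 255); // recovers 51400 / 65535
```

Note that this only recovers point samples; bilinear filtering cannot be applied by the hardware to the packed representation (it would interpolate the high and low bytes independently), which is why filtering must also be reimplemented in the shader.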
Does the OES_texture_float extension not address this use case (potentially at the expense of more GPU memory)? This extension is portable to mobile hardware, where support for the GL_LUMINANCE16 internal format would not be. WebGL follows the OpenGL ES 2.0 texture specification rules, where the format and internal format must currently be identical.
Chromium and Safari already offer support for the OES_texture_float extension in WebGL.
I think the problem is that the original source data is already in L16 format. To use texture_float, it would have to be converted to floating point first, before being uploaded to GL. However, the browser would take care of that in the case of texImage2D with an image element, so it probably wouldn't be that big of an issue, other than the added memory usage.
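For the case where the source arrives as a raw array rather than an image element, the conversion step being discussed might look like the following sketch (the function name is hypothetical, and the normalization to 0..1 assumes the application wants the same range a normalized LUMINANCE16 texture would yield):

```javascript
// Expand raw L16 samples to 32-bit floats prior to upload.
function l16ToFloat(u16) {
  const out = new Float32Array(u16.length);
  for (let i = 0; i < u16.length; i++) {
    out[i] = u16[i] / 65535; // normalize 16-bit sample to 0..1
  }
  return out;
}

const floats = l16ToFloat(new Uint16Array([0, 32768, 65535]));
```

With OES_texture_float enabled, the result could then be uploaded via texImage2D with format LUMINANCE and type FLOAT, at a cost of four bytes per texel instead of two.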