OpenGL Interop for 16-bit Unsigned Integer Textures (GL_LUMINANCE16UI)
12-13-2011, 07:10 AM
Surely I am not the only person with 16-bit integer textures, so I am confused why this wasn't planned, or at least made an optional feature. If you only have a single channel, there is no reason to have to pad it out to an RGBA texture in OpenGL just to be able to do interop. NVidia's OpenCL driver actually supports this currently, but AMD's doesn't (most likely because it is not mentioned in the spec). Also, for those who say the vendors are just padding it anyway: I have tested this on NVidia's implementation and they clearly are not, as performance is considerably better with the single-channel version even when I only do per-pixel calculations on the first channel.
12-13-2011, 04:30 PM
Surely I am not the only person with 16-bit integer textures, so I am confused why this wasn't planned, or at least made an optional feature.
Vendors are free to support that texture format as an extension. I think you would have more success asking AMD to implement this extension, as it is probably rather trivial for them to add. It would be best if you could name the NVidia extension that you would like them to support.
12-14-2011, 06:34 AM
It would be best if you could name the NVidia extension that you would like them to support.
It isn't an extension; it simply happens to work on NVidia hardware ("supports" was probably not the right word in my previous post). While I am sure someone at NVidia thought of this and wrote the code to support it, they don't seem to have documented it, at least not publicly.
The OpenCL specification does not mention anything about GL_LUMINANCE16UI, nor do any of its extensions.
While I realise they could write their own extension, it seems ridiculous for a language aimed at the scientific community to be optimized only for 8-bit values, or for 16-bit values only when they come in four channels.
12-14-2011, 03:48 PM
True, an extension may not even be necessary: according to the OpenCL 1.1 spec, implementations can choose which particular OpenGL texture formats they wish to support in OpenCL.
I'm confident that if you politely ask your hardware vendor's developer relations group to add support for your preferred pixel formats, they will be happy to oblige.