
Thread: OpenGL-Interop for 16bit Unsigned Integer (GL_LUMINANCE16UI)

  1. #1
    Junior Member
    Join Date
    Dec 2009
    Posts
    17

    OpenGL-Interop for 16bit Unsigned Integer (GL_LUMINANCE16UI)

    Surely I am not the only person with 16-bit integer textures, so I am confused why this wasn't planned or at least made an optional feature. If you only have a single channel, there is no reason to have to pad it out to a four-channel RGBA texture in OpenGL just to be able to do interop. NVidia's OpenCL driver actually supports this currently, but AMD's doesn't (most likely because it is not mentioned in the spec). And for those who say the vendors are just padding it internally anyway: I have tested this on NVidia's implementation and they clearly are not, as performance is considerably better with the single-channel version even when I only do pixel calculations on the first channel.
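
    For reference, a minimal sketch of the interop being described, assuming a current GL context, a cl_context created with cl_khr_gl_sharing, and GLEW for the GL_EXT_texture_integer enums; the function names are illustrative, and the single-channel mapping is exactly the part the spec does not guarantee:

    /* Sketch: share a single-channel 16-bit unsigned integer GL texture with
     * OpenCL.  The spec does not guarantee this format mapping, so the interop
     * call may fail on implementations other than the NVidia one described above. */
    #include <GL/glew.h>
    #include <CL/cl.h>
    #include <CL/cl_gl.h>

    GLuint create_lum16ui_texture(GLsizei w, GLsizei h, const GLushort *pixels)
    {
        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        /* Integer textures must use non-filtering samplers. */
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        /* One 16-bit unsigned integer channel, no RGBA padding. */
        glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE16UI_EXT, w, h, 0,
                     GL_LUMINANCE_INTEGER_EXT, GL_UNSIGNED_SHORT, pixels);
        return tex;
    }

    cl_mem share_with_opencl(cl_context ctx, GLuint tex, cl_int *err)
    {
        /* OpenCL 1.1 entry point.  Implementations that do not map
         * GL_LUMINANCE16UI_EXT to a CL image format typically return
         * CL_INVALID_IMAGE_FORMAT_DESCRIPTOR or CL_INVALID_GL_OBJECT here. */
        return clCreateFromGLTexture2D(ctx, CL_MEM_READ_ONLY,
                                       GL_TEXTURE_2D, 0, tex, err);
    }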

  2. #2
    Senior Member
    Join Date
    May 2010
    Location
    Toronto, Canada
    Posts
    845

    Re: OpenGL-Interop for 16bit Unsigned Integer (GL_LUMINANCE16UI)

    Surely I am not the only person with 16-bit integer textures, so I am confused why this wasn't planned or at least made an optional feature.
    Vendors are free to support that texture format as an extension. I think you would have more success asking AMD to implement this extension, as it is probably rather trivial for them to add. It would be best if you could name the NVidia extension that you would like them to support.
    Disclaimer: Employee of Qualcomm Canada. Any opinions expressed here are personal and do not necessarily reflect the views of my employer. LinkedIn profile.

  3. #3
    Junior Member
    Join Date
    Dec 2009
    Posts
    17

    Re: OpenGL-Interop for 16bit Unsigned Integer (GL_LUMINANCE16UI)

    Quote Originally Posted by david.garcia
    It would be best if you could name the NVidia extension that you would like them to support.
    It isn't an extension; it simply happens to work on NVidia hardware ("supports" was probably not the correct word choice in my previous post). While I am sure someone at NVidia thought of this and wrote the code to support it, they don't seem to have documented it, at least not anywhere public.

    The OpenCL specification does not mention anything about GL_LUMINANCE16UI, nor do any of its extensions.

    While I realise they could write their own extension, it seems ridiculous for a language used by the scientific community to be optimized only for 8-bit values, or for 16-bit values padded out to four channels.
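
    For concreteness, a rough sketch of the padding workaround being complained about, under the same assumptions as the earlier sketch: the 16-bit samples are repacked into a GL_RGBA16UI texture, one of the internal formats listed in the OpenCL 1.1 interop table, at the cost of four times the memory and upload bandwidth.

    /* Sketch: the 4-channel padding workaround - the same 16-bit samples
     * repacked into a GL_RGBA16UI texture so that every conforming interop
     * implementation will accept it. */
    #include <stdlib.h>
    #include <GL/glew.h>

    GLuint create_padded_rgba16ui_texture(GLsizei w, GLsizei h, const GLushort *lum)
    {
        GLushort *rgba = calloc((size_t)w * h * 4, sizeof *rgba);
        if (!rgba)
            return 0;
        for (GLsizei i = 0; i < w * h; ++i)
            rgba[i * 4] = lum[i];        /* data in .x; the other channels are wasted */

        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16UI, w, h, 0,
                     GL_RGBA_INTEGER, GL_UNSIGNED_SHORT, rgba);
        free(rgba);
        return tex;
    }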

  4. #4
    Senior Member
    Join Date
    May 2010
    Location
    Toronto, Canada
    Posts
    845

    Re: OpenGL-Interop for 16bit Unsigned Integer (GL_LUMINANCE16UI)

    True, an extension may not even be necessary, since according to section 9.8.3.1 of the OpenCL 1.1 spec, implementations can choose which particular OpenGL texture formats they wish to support in OpenCL.
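
    Since that choice is implementation-defined, about the only portable approach is to probe at runtime. A minimal sketch of such a check (illustrative code, not from the spec):

    /* Sketch: probe whether the implementation maps a given GL texture to a CL
     * image format.  Returns NULL if the format was declined, in which case the
     * caller can repack into GL_RGBA16UI and retry. */
    #include <GL/glew.h>
    #include <CL/cl.h>
    #include <CL/cl_gl.h>

    cl_mem try_share_gl_texture(cl_context ctx, GLuint tex, cl_int *err_out)
    {
        cl_int err = CL_SUCCESS;
        cl_mem img = clCreateFromGLTexture2D(ctx, CL_MEM_READ_ONLY,
                                             GL_TEXTURE_2D, 0, tex, &err);
        if (err_out)
            *err_out = err;
        /* CL_INVALID_IMAGE_FORMAT_DESCRIPTOR here means the implementation chose
         * not to map this GL internal format, which the spec permits. */
        return (err == CL_SUCCESS) ? img : NULL;
    }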

    I'm confident that if you politely ask your hardware vendor's developer relations group to add support for your preferred pixel formats, they will be happy to oblige.
    Disclaimer: Employee of Qualcomm Canada. Any opinions expressed here are personal and do not necessarily reflect the views of my employer. LinkedIn profile.

