
Thread: Wrong OpenGL Texture Format Type for CL/GL-Interop?

  1. #1

    Wrong OpenGL Texture Format Type for CL/GL-Interop?

    I'm trying the OpenCL-OpenGL interop for textures on my GeForce 330M with CUDA Toolkit 4.0.

    I want to capture a frame and use that data as an input image (Image2D) for an OpenCL kernel. The kernel should manipulate the data and write it to an Image2DGL, which is an image object with an attached OpenGL texture. Basically it looks like this:

    Code :
     _______________      RGB        _______________
    |               |    uint8*     |               |   CL_RGBA / CL_UNORM_INT8
    |   Grabber     | ------------> |   Image2D     | -------------------------.
    |   avcodec     |               |   [input]     |                          |
    |_______________|               |_______________|                          |
                                                                               |    
                                                                               V
     _______________                 _______________                       _______________
    |               |               |               |                     |               |
    |   Texture     | ------------> |   Image2DGL   | <-----------------> |    Kernel     |
    |_______________|               |   [output]    |                     |_______________|
                                    |_______________|
    Internal Format: GL_RGBA
    Format: GL_RGBA
    Type: ?
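
    For completeness: the interop only works if the CL context is created with GL sharing enabled. My clw wrapper does that internally; a minimal sketch of what such a setup looks like on Linux/GLX (the property list differs for WGL/CGL, so treat this as an assumption about my wrapper, not its exact code):

    Code :
    #include <CL/cl.hpp>
    #include <GL/glx.h>

    cl::Context createSharedContext(const cl::Platform& platform)
    {
        // Tie the CL context to the current GL context and display.
        cl_context_properties props[] = {
            CL_GL_CONTEXT_KHR,   (cl_context_properties) glXGetCurrentContext(),
            CL_GLX_DISPLAY_KHR,  (cl_context_properties) glXGetCurrentDisplay(),
            CL_CONTEXT_PLATFORM, (cl_context_properties) (platform)(),
            0
        };
        return cl::Context(CL_DEVICE_TYPE_GPU, props);
    }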

    I'm initializing the texture like this:

    Code :
    GLuint tex = 0;
     
    void initTexture( int width, int height )
    {
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_RECTANGLE, tex);
        // now here is where I need assistance: the type parameter of the texture (GL_UNSIGNED_BYTE?)
        glTexImage2D(GL_TEXTURE_RECTANGLE, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    }

    Then I create the shared image (Image2DGL):

    Code :
    texMems.push_back(Image2DGL(clw->context, CL_MEM_READ_WRITE, GL_TEXTURE_RECTANGLE, 0, tex, &err));

    Then I create the source image (input image):

    Code :
    ImageFormat format;
    format.image_channel_data_type = CL_UNORM_INT8;
    format.image_channel_order = CL_RGBA;
    srcImgBuffer = Image2D(clw->context, CL_MEM_READ_WRITE, format, width, height, 0, NULL, &err);

    In every render loop I'm writing the data into the srcImgBuffer:

    Code :
    // write the frame to the image buffer
    clw->queue.enqueueWriteImage(srcImgBuffer, CL_TRUE, origin, region, 0, 0, (void*)data, NULL, NULL);
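
    (origin and region are set up the usual way for the C++ wrapper; a minimal sketch, assuming cl.hpp's size_t<3>:)

    Code :
    // Full-frame origin/region for the 2D image write.
    cl::size_t<3> origin;
    origin[0] = 0; origin[1] = 0; origin[2] = 0;

    cl::size_t<3> region;
    region[0] = width;   // image width in pixels
    region[1] = height;  // image height in pixels
    region[2] = 1;       // depth must be 1 for a 2D image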

    I'm also setting the arguments for the kernel:

    Code :
    tex_kernel.setArg(0, texMems[0]);
    tex_kernel.setArg(1, srcImgBuffer);
    tex_kernel.setArg(2, width);
    tex_kernel.setArg(3, height);

    Before and after the kernel launch I acquire and release the GL objects (see the sketch after the kernel listing). The test kernel looks like this:

    Code :
    __kernel void init_texture_kernel(__write_only image2d_t out, __read_only image2d_t in, int w, int h)
    {
        const sampler_t smp = CLK_NORMALIZED_COORDS_FALSE | CLK_ADDRESS_CLAMP | CLK_FILTER_NEAREST;
     
        int2 coords = { get_global_id(0), get_global_id(1) };
        float4 pixel = read_imagef(in, smp, coords);
        float4 test = { (float)coords.x/(float)w , 0, 0, 1};
        write_imagef( out, coords, pixel );
    }
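
    The per-frame enqueue sequence around that kernel looks roughly like this (a sketch using my wrapper names from above; glFinish/finish are the plain synchronization calls, since I'm not using any sync extension):

    Code :
    glFinish();  // make sure GL is done with the texture before CL touches it

    std::vector<cl::Memory> glObjects(texMems.begin(), texMems.end());
    clw->queue.enqueueAcquireGLObjects(&glObjects);

    clw->queue.enqueueNDRangeKernel(tex_kernel, cl::NullRange,
                                    cl::NDRange(width, height), cl::NullRange);

    clw->queue.enqueueReleaseGLObjects(&glObjects);
    clw->queue.finish();  // make sure CL is done before GL samples the texture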

    With CL_UNORM_INT8 the image data can be read as floats in the kernel and is interpreted as normalized values. However, the output image does not look right: I get a sliced picture (line-wise), obviously because of a wrong data interpretation. As I mentioned, I assume the error is in the initialization of the texture's type. I also tried GL_FLOAT (since I'm writing floats to the image in the kernel).



    [Screenshot: the left image is a PPM out of the decoder, the right one is what I get back on my output texture.]

    If I bind the captured frames to the texture directly, the video plays fine, so the problem must be somewhere in the CL-GL interop.
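
    For comparison, the direct path (no CL involved) uploads the grabbed RGB frame straight into the texture, roughly like this (a sketch; the exact target/format in my code may differ):

    Code :
    // Upload the decoded RGB frame directly (this path works).
    glBindTexture(GL_TEXTURE_RECTANGLE, tex);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);  // tightly packed RGB rows
    glTexSubImage2D(GL_TEXTURE_RECTANGLE, 0, 0, 0, width, height,
                    GL_RGB, GL_UNSIGNED_BYTE, data);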

    The Image2DGL init did not throw any errors.

    If someone actually read up to here: do you have any suggestions regarding the texture type to resolve the issue? Or can anybody point me to a more in-depth explanation of the interop?

  2. #2

    Re: Wrong OpenGL Texture Format Type for CL/GL-Interop?

    Never mind, I've found the solution.

    The trouble was with the grabbed frames (as probably everybody except me could guess from the screenshot): I grab RGB (3 bytes per pixel), but the input Image2D is declared as CL_RGBA and the kernel reads the pixels as float4 (4 channels). Of course that messes up the picture.
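
    One way to fix it is to expand the RGB frame to RGBA on the host before enqueueWriteImage (a sketch, with alpha hard-coded to 255; alternatively, have the decoder output RGBA directly):

    Code :
    #include <vector>
    #include <cstdint>

    // Expand a tightly packed RGB frame to RGBA so it matches the
    // CL_RGBA / CL_UNORM_INT8 Image2D.
    std::vector<uint8_t> rgbToRgba(const uint8_t* rgb, int width, int height)
    {
        std::vector<uint8_t> rgba((size_t)width * height * 4);
        for (size_t px = 0, n = (size_t)width * height; px < n; ++px)
        {
            rgba[px * 4 + 0] = rgb[px * 3 + 0];  // R
            rgba[px * 4 + 1] = rgb[px * 3 + 1];  // G
            rgba[px * 4 + 2] = rgb[px * 3 + 2];  // B
            rgba[px * 4 + 3] = 255;              // A (opaque)
        }
        return rgba;
    }

    That RGBA buffer then goes into enqueueWriteImage instead of the raw RGB pointer.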

    Plus, I probably posted in the wrong forum anyway, as this forum is about the interoperability of OpenCL code across different devices. Sorry.
