
Thread: NVIDIA's GeForce 9400 GT: CL_DEVICE_PREFERRED_VECTOR_WIDTH_FLOAT = 1

  1. #1
    Junior Member
    Join Date
    Oct 2013
    Posts
    22

    NVIDIA's GeForce 9400 GT: CL_DEVICE_PREFERRED_VECTOR_WIDTH_FLOAT = 1

    Hello,

    On my device (NVIDIA's GeForce 9400 GT) I ran the following command:

    Code :
    cl_uint vec_width;   /* receives the preferred float vector width */
    clGetDeviceInfo(devices[i], CL_DEVICE_PREFERRED_VECTOR_WIDTH_FLOAT,
                    sizeof(vec_width), &vec_width, NULL);

    I got vec_width=1.
    Does this make sense?

    I ran the following kernel and got correct results:

    Code :
    __kernel void matvec_mult(__global float4* matrix,
                              __global float4* vector,
                              __global float* result) {
     
       int i = get_global_id(0);
       result[i] = dot(matrix[i], vector[0]);
    }

    So it seems the NVIDIA device supports at least a vector of 4 floats.
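
    By the way, here is a minimal host-side sketch of how a kernel like this can be driven end to end. The matrix size, buffer names and the lack of error checking are illustrative only, not taken from my real code:

    Code :
    /* Minimal host-side sketch for matvec_mult (OpenCL 1.x API).
     * Sizes and names are illustrative; error checks omitted for brevity. */
    #include <stdio.h>
    #include <CL/cl.h>

    #define ROWS 4   /* 4x4 matrix stored as 4 float4 rows */

    static const char *src =
    "__kernel void matvec_mult(__global float4* matrix,\n"
    "                          __global float4* vector,\n"
    "                          __global float* result) {\n"
    "   int i = get_global_id(0);\n"
    "   result[i] = dot(matrix[i], vector[0]);\n"
    "}\n";

    int main(void) {
        float matrix[ROWS * 4], vector[4], result[ROWS];
        for (int i = 0; i < ROWS * 4; i++) matrix[i] = (float)i;
        for (int i = 0; i < 4; i++)        vector[i] = 1.0f;

        cl_platform_id platform;
        cl_device_id device;
        clGetPlatformIDs(1, &platform, NULL);
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

        cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, NULL);
        cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, NULL);

        cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
        clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
        cl_kernel kernel = clCreateKernel(prog, "matvec_mult", NULL);

        cl_mem mat_buf = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                                        sizeof(matrix), matrix, NULL);
        cl_mem vec_buf = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                                        sizeof(vector), vector, NULL);
        cl_mem res_buf = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY,
                                        sizeof(result), NULL, NULL);

        clSetKernelArg(kernel, 0, sizeof(cl_mem), &mat_buf);
        clSetKernelArg(kernel, 1, sizeof(cl_mem), &vec_buf);
        clSetKernelArg(kernel, 2, sizeof(cl_mem), &res_buf);

        size_t global = ROWS;   /* one work-item per matrix row */
        clEnqueueNDRangeKernel(queue, kernel, 1, NULL, &global, NULL, 0, NULL, NULL);
        clEnqueueReadBuffer(queue, res_buf, CL_TRUE, 0, sizeof(result), result,
                            0, NULL, NULL);

        for (int i = 0; i < ROWS; i++)
            printf("result[%d] = %f\n", i, result[i]);

        clReleaseMemObject(mat_buf); clReleaseMemObject(vec_buf); clReleaseMemObject(res_buf);
        clReleaseKernel(kernel); clReleaseProgram(prog);
        clReleaseCommandQueue(queue); clReleaseContext(ctx);
        return 0;
    }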

    I'm using AMD's SDK. Could this be causing the problem?

    Thanks,
    Zvika

  2. #2
    Junior Member
    Join Date
    Jul 2011
    Location
    Bristol, UK
    Posts
    19
    Hi Zvika,

    The preferred vector width is just a recommendation for improving performance. In this case, NVIDIA's OpenCL implementation is telling you that it would prefer vectors of size 1 (i.e. scalar code), because the hardware has no native vector support. It will still work perfectly well with vectors of any other size (as will every implementation that conforms to the standard), so there are no restrictions on which vector sizes you can actually use in your kernels.
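
    If it helps, here's a quick sketch (assuming you already have a valid cl_device_id in hand) that queries both the preferred and the native float vector width. The native query, CL_DEVICE_NATIVE_VECTOR_WIDTH_FLOAT, needs OpenCL 1.1 or later:

    Code :
    #include <stdio.h>
    #include <CL/cl.h>

    void print_float_widths(cl_device_id device) {
        cl_uint preferred = 0, native = 0;

        clGetDeviceInfo(device, CL_DEVICE_PREFERRED_VECTOR_WIDTH_FLOAT,
                        sizeof(preferred), &preferred, NULL);
        clGetDeviceInfo(device, CL_DEVICE_NATIVE_VECTOR_WIDTH_FLOAT,
                        sizeof(native), &native, NULL);

        /* A value of 1 just means "scalar is fine": float4 kernels still
         * compile and run, the compiler simply scalarises them if needed. */
        printf("preferred float width: %u, native float width: %u\n",
               preferred, native);
    }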

    Best wishes,

    James

  3. #3
    Newbie
    Join Date
    Nov 2013
    Posts
    1
    Good advice. Thanks for sharing this.
