Khronos Public Bugzilla
Bug 202 - Cannot compile GL_TIMEOUT_IGNORED on Visual C++ 6.0
Summary: Cannot compile GL_TIMEOUT_IGNORED on Visual C++ 6.0
Alias: None
Product: OpenGL
Classification: Unclassified
Component: Registry (show other bugs)
Version: unspecified
Hardware: PC Windows
Importance: P3 normal
Target Milestone: ---
Assignee: Jon Leech
QA Contact:
Depends on:
Reported: 2009-09-07 06:14 PDT by Marek Fort
Modified: 2018-01-23 00:27 PST (History)
0 users

See Also:

Draft glext.h with 64-bit literal support for MSVC 6.0 (492.77 KB, text/plain)
2009-09-24 16:58 PDT, Jon Leech

Description Marek Fort 2009-09-07 06:14:38 PDT
Cannot compile use of GL_TIMEOUT_IGNORED on Visual Studio 6.0.

It says:
 error C2059: syntax error : 'bad suffix on number'

The problem is in the uint64_t literal: the only 64-bit literal suffix
Visual C++ 6.0 supports is ui64.

Please add some ifdefs like:

#if defined(_MSC_VER) && (_MSC_VER <= 1200)
#define GL_TIMEOUT_IGNORED                0xFFFFFFFFFFFFFFFFui64
#else
#define GL_TIMEOUT_IGNORED                0xFFFFFFFFFFFFFFFFull
#endif
Comment 1 Jon Leech 2009-09-24 16:58:13 PDT
Created attachment 37 [details]
Draft glext.h with 64-bit literal support for MSVC 6.0
Comment 2 Jon Leech 2009-09-24 17:00:11 PDT
Please comment in the bug if this works. It isn't straightforward to do
this due to the way glext.h is generated but it should be OK.
Comment 3 Marek Fort 2009-09-25 03:22:14 PDT
The definition of __GL_DECLARE_UINT64 comes too late.

The compiler says:
  error C2065: '__GL_DECLARE_UINT64' : undeclared identifier

The new macro is used at line 1652, but defined only at line 4468.
Comment 4 Jon Leech 2010-03-04 23:41:10 PST
Sorry I missed replying to this at the time. I'm mystified as to why this
is a compiler error since all it's doing is using a preprocessor macro
in the definition of GL_TIMEOUT_IGNORED, then defining the macro afterwards.
AFAIK preprocessor behavior is deferred until macros are expanded so this
should work - it certainly does with gcc. Unfortunately I don't have
access to VC6 so I'm unable to test this, but I'm wondering what your
compiler does with the following code?

#include <stdio.h>

/* VALUE is defined in terms of MACRO before MACRO exists; this is
   fine because expansion happens only where VALUE is used. */
#define VALUE MACRO
#define MACRO 3

int main(void) {
    printf("VALUE = %d\n", VALUE);
    return 0;
}
If this generates a VC6 compiler error too then maybe there is some sort
of preprocessor control option that's affecting its behavior?
Comment 5 Marek Fort 2010-03-05 03:27:05 PST
You are right; I was caught in that trap. Macro expansion works this way even in VC 6.0, and your attached draft works - on a clean MSVS 6.0 system.
Consider it done.

But ...

The reason your new glext.h was not working in my environment is a bit harder.
There is a collision in the way we both create the int64_t and uint64_t typedefs:
my code does almost the same thing, so the compiler complained about a double definition.
That's why I used a trick and defined GLEXT_64_TYPES_DEFINED in my code.
This avoided the redefinition of uint64_t when glext.h was included after my header.

This worked flawlessly until new material was added inside the GLEXT_64_TYPES_DEFINED conditional section: the new macro __GL_DECLARE_UINT64 is defined inside it.

The question is whether it is a good idea to make the int64_t/uint64_t typedefs in an OpenGL header at all. I suspect it is not, since an OpenGL header should only define GL-prefixed macros and symbols.
On the other hand, if you make some provision to coexist with systems that also define int64_t, then it is OK.

Options to resolve this:
1- close this bug as it is (I can modify my copy of glext.h )

2- move the __GL_DECLARE_UINT64 definition outside the GLEXT_64_TYPES_DEFINED section, and guard it with ifndef __GL_DECLARE_UINT64 instead:

#ifndef __GL_DECLARE_UINT64
#if defined(_MSC_VER) && (_MSC_VER <= 1200)
#define __GL_DECLARE_UINT64(arg) arg ## ui64
#else
#define __GL_DECLARE_UINT64(arg) arg ## ull
#endif
#endif

3- do not make the [u]int64_t typedefs in glext.h at all. This would require quite a lot of work.

Please do option 2. It is nice and tidy. ;-)
Comment 6 Jon Leech 2018-01-23 00:27:02 PST
We never did anything about this, and the problem is nearly 8 years old, so closing.