glGetStringi pointer error on 64bits (v3.0)

4 comments, last by Codeka 14 years, 7 months ago
Hi, I'm testing some OpenGL 3.0 examples on my 64-bit machine with Ubuntu, and I have a problem with glGetStringi. I'm compiling this code:

printf("Version: %s\n", glGetString(GL_VERSION));
GLubyte *ptr = glGetStringi(GL_EXTENSIONS, (GLuint) i);
printf("%s\n", ptr);

The first line (glGetString) works fine, but the glGetStringi line produces a warning with gcc:

main.c:28: warning: initialization makes pointer from integer without a cast

When executed, the program prints the Version line and then I get a segmentation fault. I think it could be because I'm on 64 bits and glGetStringi is returning a 32-bit pointer or something, but I'm really lost at this point.
What type is i? Why are you casting it to a GLuint?
The var "i" is an int I've tried to cast it to GLuint but the warning remains.

glGetStringi is inside a loop, something like this:

for (i = 0; i < n; i++) {
    printf("%s\n", glGetStringi(GL_EXTENSIONS, i));
}
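(For reference, n here would normally come from GL_NUM_EXTENSIONS. A minimal sketch of the standard pattern; the glGetIntegerv query is assumed, since it isn't shown in the post:

int i, n = 0;
glGetIntegerv(GL_NUM_EXTENSIONS, &n);    /* number of extension strings in a 3.0 context */
for (i = 0; i < n; i++) {
    printf("%s\n", glGetStringi(GL_EXTENSIONS, (GLuint) i));
}
)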
Mmmm, OK. What declaration of glGetStringi do your GL headers have?

Mine is:

extern const GLubyte * APIENTRY glGetStringi (GLenum, GLuint);

So, no pointer involved except for the return value, which is a plain pointer.
Mine is the same:

GLAPI const GLubyte * APIENTRY glGetStringi (GLenum, GLuint);

The warning is about the return value from that function.

Today is my first time using 3.0 functions; maybe the Linux libs are so new that they're not 64-bit friendly. I don't know.

Thank you.
Quote: Original post by granerer
GLAPI const GLubyte * APIENTRY glGetStringi (GLenum, GLuint);
If that really is the definition, then I don't see how that could possibly produce that warning.

Is this straight C? If so, my guess is that the proper declaration isn't being included (or you're not using the correct #define or something to get the version 3.x stuff; I'm not really sure).

Do you get any other warnings? Try the highest warning level.
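(For what it's worth, that guess fits the symptoms exactly: in C, calling a function with no prototype in scope makes the compiler assume it returns int, so the 64-bit pointer returned by glGetStringi gets truncated to 32 bits. That is precisely this warning, and dereferencing the mangled pointer in printf explains the segfault. A hedged sketch of the usual fix on Linux, assuming Mesa-style headers and that your libGL actually exports the 3.0 symbols; otherwise fetch the entry point at run time:

/* Option 1: ask the headers for real prototypes.
   The define must come before any GL include. */
#define GL_GLEXT_PROTOTYPES
#include <GL/gl.h>
#include <GL/glext.h>    /* declares glGetStringi for GL 3.0 */

/* Option 2: load the entry point at run time via GLX,
   using the PFNGLGETSTRINGIPROC typedef from glext.h. */
#include <GL/glx.h>

static PFNGLGETSTRINGIPROC pglGetStringi;

static void load_gl3_entry_points(void)
{
    pglGetStringi = (PFNGLGETSTRINGIPROC)
        glXGetProcAddress((const GLubyte *) "glGetStringi");
}

Either way, compiling with -Wall would also show an "implicit declaration of function 'glGetStringi'" warning, which pinpoints the missing prototype.)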

This topic is closed to new replies.
