
Athe

Member

  • Content Count: 5
  • Community Reputation: 107 Neutral

About Athe

  • Rank: Newbie
  1. OK, so now I just tried creating a small test program in C like this:

     #include <GLFW/glfw3.h>
     #include <stdio.h>

     int main(void)
     {
             GLFWwindow *window;
             const GLubyte *ext;

             if(!glfwInit()) {
                     fprintf(stderr, "Failed to initialize GLFW\n");
                     return -1;
             }

             glfwWindowHint(GLFW_SAMPLES, 4); // 4x antialiasing
             glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3); // OpenGL 3.3
             glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
             glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE); // Apparently needed "to make OS X happy"
             glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

             window = glfwCreateWindow(1024, 768, "Tutorial 01", NULL, NULL);
             if(window == NULL) {
                     fprintf(stderr, "Failed to open GLFW window.");
                     glfwTerminate();
                     return -1;
             }
             glfwMakeContextCurrent(window);

             ext = glGetStringi(GL_EXTENSIONS, 0);
             printf("%s\n", (const char*)ext);

             return 0;
     }

     And it failed just like before. The pointer returned from glGetStringi is not correct: the upper 4 bytes are all 0xFF, as if a 32-bit pointer were being returned to my 64-bit program and the 0xFF bytes were just filler for the missing upper half.

     However, if I do what you said and define GL_GLEXT_PROTOTYPES, it works (a sketch of that working variant appears after the post list below). That seems really weird to me, but I'm glad it solves the problem!
  2. I am not defining GL_GLEXT_PROTOTYPES myself, but I assume that GLFW/GLEW or some other header file that I'm using defines it, because I do not get that compilation error.
  3. Thanks for your reply. I'm using GCC 4.8.2 (my code is compiled as C++, and SOIL is compiled as C).

     SOIL.c includes GL/gl.h, which in turn includes GL/glext.h, where the declaration for glGetStringi is found.
  4. A small update: if I call glGetStringi(GL_EXTENSIONS, 0) from my project and then call glGetStringi(GL_EXTENSIONS, 0) from within SOIL (by modifying the SOIL source), I find that when calling from MY code it returns a pointer to a correct string ("GL_AMD_multi_draw_indirect") at 0x00007ffff473d244, while when called from SOIL it returns a pointer to 0xfffffffff473d244.

     So it seems that the second time the pointer is returned, its upper 4 bytes have all been set to 0xFF. This leads me to believe that somewhere the pointer returned from glGetStringi is handled as a 32-bit value and then cast back to a pointer type (a short sketch after the post list below reproduces exactly this sign extension). But why would that happen?

     Update #2: I had missed a compilation warning that somewhat confirms my suspicion:

     /home/emil/projects/gltest/external/soil/SOIL.c: In function ‘query_gl_extension’:
     /home/emil/projects/gltest/external/soil/SOIL.c:1888:24: warning: initialization makes pointer from integer without a cast [enabled by default]
        const GLubyte *ext = glGetStringi(GL_EXTENSIONS, 0);

     However, the call to glGetStringi that I make from MY code does not generate this warning. I fail to see why this warning would occur, and Google gives me nothing.
  5. Hey all, I hope I'm posting this in the right place.

     I'm "refreshing" my OpenGL knowledge by following a series of tutorials and have gotten to the texturing part. For loading textures I wanted to try the SOIL library, which seemed pretty good. However, when trying to load a texture like so:

     GLuint tex_2d = SOIL_load_OGL_texture("img_test.png",
                     SOIL_LOAD_AUTO,
                     SOIL_CREATE_NEW_ID,
                     SOIL_FLAG_MIPMAPS | SOIL_FLAG_INVERT_Y | SOIL_FLAG_NTSC_SAFE_RGB | SOIL_FLAG_COMPRESS_TO_DXT);

     I get a segmentation fault, because SOIL is doing this:

     if((NULL == strstr( (char const*)glGetString( GL_EXTENSIONS ),
         "GL_ARB_texture_non_power_of_two" ) ))
     {
         /* not there, flag the failure */
         has_NPOT_capability = SOIL_CAPABILITY_NONE;
     } else
     {
         /* it's there! */
         has_NPOT_capability = SOIL_CAPABILITY_PRESENT;
     }

     and it appears that glGetString(GL_EXTENSIONS) is returning NULL. Searching for help on this, I found that calling glGetString with GL_EXTENSIONS has been deprecated and one should now use glGetStringi instead.

     So I updated the SOIL code to use this new way of querying for extensions by implementing this function:

     int query_gl_extension( const char *extension)
     {
             GLuint num_extensions, i;

             glGetIntegerv(GL_NUM_EXTENSIONS, &num_extensions);
             for(i = 0; i < num_extensions; ++i)
             {
                     const GLubyte *ext = glGetStringi(GL_EXTENSIONS, i);
                     if(strcmp((const char*)ext, extension) == 0)
                             return 1;
             }
             return 0;
     }

     But this changed nothing. If I step through this code in GDB and print the value of the variable ext, I get the following:

     $2 = (const GLubyte *) 0xfffffffff473d244 <error: Cannot access memory at address 0xfffffffff473d244>

     The strange part is: if I call glGetStringi(GL_EXTENSIONS, 0) from MY code right before loading the texture with SOIL, it works. So that should rule out my GL context not being active, no?

     What could cause glGetStringi(GL_EXTENSIONS, index) to return what seems to be a pointer to random memory when invoked from SOIL, but not when invoked from my code? (A loader-based variant of this extension check is sketched after the post list below.)

     My OpenGL version is 3.3 and my OS is Ubuntu 64-bit.

     Thanks!
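Below is a minimal sketch of the working variant mentioned in post 1, assuming the poster's Ubuntu/Mesa setup where GL/gl.h pulls in GL/glext.h (as stated in post 3). Defining GL_GLEXT_PROTOTYPES before any GL header makes glext.h emit a real prototype for glGetStringi, so the compiler no longer falls back to an implicit int-returning declaration that truncates the pointer. The window size, title, and build flags here are placeholders, not taken from the original code.

     #define GL_GLEXT_PROTOTYPES  /* must come before any GL header */
     #include <GLFW/glfw3.h>      /* pulls in GL/gl.h, which pulls in GL/glext.h on this setup */
     #include <stdio.h>

     int main(void)
     {
             GLFWwindow *window;
             const GLubyte *ext;

             if(!glfwInit())
                     return -1;

             glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
             glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
             glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

             window = glfwCreateWindow(640, 480, "extension test", NULL, NULL);
             if(window == NULL) {
                     glfwTerminate();
                     return -1;
             }
             glfwMakeContextCurrent(window);

             /* With a real prototype in scope, the full 64-bit pointer comes back intact. */
             ext = glGetStringi(GL_EXTENSIONS, 0);
             printf("%s\n", (const char*)ext);

             glfwDestroyWindow(window);
             glfwTerminate();
             return 0;
     }

Built with something like gcc test.c -lglfw -lGL (library names are an assumption about the poster's packages). This links only because Mesa's libGL exports glGetStringi as a symbol, which matches the poster's report that defining the macro was enough; strictly portable code would load the function at run time instead, as in the last sketch below.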
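The two addresses reported in post 4 line up exactly with an implicit-int round trip. The snippet below is not SOIL or driver code, just plain C arithmetic on the poster's own values, showing what GCC on x86-64 does in practice when a 64-bit pointer return value is read through a 32-bit int and converted back: the high bit of the truncated address is sign-extended, turning 0x00007ffff473d244 into 0xfffffffff473d244.

     #include <stdio.h>
     #include <stdint.h>
     #include <inttypes.h>

     int main(void)
     {
             /* The address the driver actually returned (from post 4). */
             uintptr_t real = (uintptr_t)0x00007ffff473d244u;

             /* What a caller sees when the call goes through an implicit
              * "int glGetStringi()" declaration: the return value is read
              * as a 32-bit int (0xf473d244, which is negative) ... */
             int32_t as_int = (int32_t)real;

             /* ... and converting that int back to a pointer-sized integer
              * sign-extends the high bit, filling the upper 4 bytes with 0xFF. */
             uintptr_t bogus = (uintptr_t)(intptr_t)as_int;

             printf("real  = 0x%016" PRIxPTR "\n", real);   /* 0x00007ffff473d244 */
             printf("bogus = 0x%016" PRIxPTR "\n", bogus);  /* 0xfffffffff473d244 */
             return 0;
     }

So the 0xFF "filler" the poster noticed is the sign extension of bit 31 of the truncated address, which is exactly what the "initialization makes pointer from integer" warning in SOIL.c is pointing at.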
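For completeness, here is a sketch of the extension check from post 5 that does not depend on a prototype being visible in the headers at all: it fetches glGetStringi through glfwGetProcAddress once a context is current, storing it in a file-local function pointer (my_glGetStringi is a name chosen here, not from the thread) typed with the PFNGLGETSTRINGIPROC typedef from GL/glext.h. This is an illustration under those assumptions, not the actual patch applied to SOIL.

     #include <GLFW/glfw3.h>   /* includes GL/gl.h */
     #include <GL/glext.h>     /* PFNGLGETSTRINGIPROC, GL_NUM_EXTENSIONS */
     #include <string.h>

     static PFNGLGETSTRINGIPROC my_glGetStringi;

     int query_gl_extension(const char *extension)
     {
             GLint num_extensions = 0;
             GLint i;

             /* Resolve the 3.0+ entry point at run time; needs a current context. */
             if(!my_glGetStringi)
                     my_glGetStringi = (PFNGLGETSTRINGIPROC)glfwGetProcAddress("glGetStringi");
             if(!my_glGetStringi)
                     return 0;  /* no 3.0 entry point available */

             glGetIntegerv(GL_NUM_EXTENSIONS, &num_extensions);
             for(i = 0; i < num_extensions; ++i)
             {
                     const GLubyte *ext = my_glGetStringi(GL_EXTENSIONS, i);
                     if(ext && strcmp((const char*)ext, extension) == 0)
                             return 1;
             }
             return 0;
     }

This has to run after glfwMakeContextCurrent, since glfwGetProcAddress needs a current context; a GLEW-based build could instead call glewInit() at that point and keep using glGetStringi directly.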