Segmentation fault when loading texture using SOIL



#1 Athe   Members   -  Reputation: 107


Posted 28 July 2014 - 12:17 PM

Hey all, I hope I'm posting this in the right place.

 

So I'm "refreshing" my OpenGL knowledge by following a series of tutorials, and have gotten to the texturing part.

For loading textures I wanted to try the SOIL library, which seemed pretty good. However, when trying to load a texture like so:

GLuint tex_2d = SOIL_load_OGL_texture("img_test.png",
                SOIL_LOAD_AUTO,
                SOIL_CREATE_NEW_ID,
                SOIL_FLAG_MIPMAPS | SOIL_FLAG_INVERT_Y | SOIL_FLAG_NTSC_SAFE_RGB | SOIL_FLAG_COMPRESS_TO_DXT);

I am getting a segmentation fault because SOIL is doing this:

if((NULL == strstr( (char const*)glGetString( GL_EXTENSIONS ),
    "GL_ARB_texture_non_power_of_two" ) ))
{
    /*      not there, flag the failure     */
    has_NPOT_capability = SOIL_CAPABILITY_NONE;
} else
{
    /*      it's there!     */
    has_NPOT_capability = SOIL_CAPABILITY_PRESENT;
}

And it appears as if glGetString(GL_EXTENSIONS) is returning NULL.
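Just to show where it blows up: a guarded sketch of that check (my illustration, not the actual SOIL code) would avoid the crash when the extension string isn't available, e.g. in a core profile context:

/* sketch only: glGetString(GL_EXTENSIONS) returns NULL in a core profile,
   so guard it before handing the result to strstr() */
const char *exts = (const char *)glGetString( GL_EXTENSIONS );
if( (exts == NULL) || (NULL == strstr( exts, "GL_ARB_texture_non_power_of_two" )) )
{
    has_NPOT_capability = SOIL_CAPABILITY_NONE;
} else
{
    has_NPOT_capability = SOIL_CAPABILITY_PRESENT;
}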

Upon searching for help on this, I found that calling glGetString with GL_EXTENSIONS has been deprecated, and that one should now use glGetStringi instead.

 

So I updated the SOIL code to use this new way of querying for extensions by implementing this function:


int query_gl_extension( const char *extension)
{
        GLint num_extensions, i;    /* glGetIntegerv expects a GLint* */

        glGetIntegerv(GL_NUM_EXTENSIONS, &num_extensions);
        for(i = 0; i < num_extensions; ++i)
        {
                /* each index yields exactly one extension name */
                const GLubyte *ext = glGetStringi(GL_EXTENSIONS, i);
                if(strcmp((const char*)ext, extension) == 0)
                        return 1;
        }

        return 0;
}

But this changed nothing. If I use GDB to step through this code and print the value of the variable ext, I get the following:

 

$2 = (const GLubyte *) 0xfffffffff473d244 <error: Cannot access memory at address 0xfffffffff473d244>
 

The strange part is: if I try to call glGetStringi(GL_EXTENSIONS, 0) from MY code right before loading the texture with SOIL, it works. So that should rule out my GL context not being active, no?

 

What could cause glGetStringi(GL_EXTENSIONS, index) to return what seems to be a pointer to random memory when invoked from SOIL, but not when invoked from my code?

 

My OpenGL version is 3.3 and my OS is Ubuntu 64-bit.

 

Thanks!




#2 Athe   Members   -  Reputation: 107


Posted 28 July 2014 - 03:06 PM

A small update:

If I call glGetStringi(GL_EXTENSIONS, 0) from my project and then call glGetStringi(GL_EXTENSIONS, 0) from within SOIL (by modifying the SOIL source), I find that the call from MY code returns a pointer to a correct string ("GL_AMD_multi_draw_indirect") at 0x00007ffff473d244, while the call from SOIL returns a pointer to 0xfffffffff473d244.

 

So it would seem that the second time the pointer is returned, its upper 4 bytes have all been set to 0xFF.

This leads me to believe that somewhere, the pointer returned from glGetStringi is handled as a 32-bit value and then cast back to a pointer type. But why would that happen?
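To illustrate that suspicion, here is a tiny standalone sketch (my illustration, not SOIL or GL code) of what happens when a 64-bit pointer value is squeezed through a 32-bit int and converted back, using the address I saw in GDB:

#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

int main(void)
{
        /* the address observed in GDB for the working call */
        uintptr_t real = (uintptr_t)0x00007ffff473d244u;

        /* an implicitly declared C function is assumed to return int,
           so only the low 32 bits of the return value survive
           (implementation-defined, but this is what GCC on x86-64 does) */
        int truncated = (int)real;                /* 0xf473d244, a negative int */

        /* converting that negative int back to a pointer-sized integer
           sign-extends it, filling the upper 4 bytes with 0xFF */
        uintptr_t back = (uintptr_t)truncated;

        printf("%#" PRIxPTR "\n", back);          /* prints 0xfffffffff473d244 */
        return 0;
}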

 

Update #2:

I had missed a compilation warning that seems to confirm my suspicion:

/home/emil/projects/gltest/external/soil/SOIL.c: In function ‘query_gl_extension’:
/home/emil/projects/gltest/external/soil/SOIL.c:1888:24: warning: initialization makes pointer from integer without a cast [enabled by default]
   const GLubyte *ext = glGetStringi(GL_EXTENSIONS, 0);

However, the call to glGetStringi that I make from MY code does not generate this warning. I fail to see why this warning would occur, and Google gives me nothing.


Edited by Athe, 28 July 2014 - 03:25 PM.


#3 Sponji   Members   -  Reputation: 1655


Posted 28 July 2014 - 05:37 PM

Sounds weird. What compiler and which version of it are you using? And just a random guess, but does SOIL.c have the needed includes for glGetStringi?


Derp

#4 Athe   Members   -  Reputation: 107


Posted 29 July 2014 - 01:50 AM

Thanks for your reply,

I'm using GCC 4.8.2 (my code is compiled as C++, and SOIL is compiled as C).

 

SOIL.c includes GL/gl.h, which in turn includes GL/glext.h, where the declaration for glGetStringi is found.
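For reference, that declaration in GL/glext.h is only compiled in when GL_GLEXT_PROTOTYPES is defined; the relevant part looks roughly like this (paraphrased, not a verbatim quote of the header):

/* paraphrased structure of GL/glext.h */
typedef const GLubyte *(APIENTRYP PFNGLGETSTRINGIPROC) (GLenum name, GLuint index);
#ifdef GL_GLEXT_PROTOTYPES
GLAPI const GLubyte *APIENTRY glGetStringi (GLenum name, GLuint index);
#endif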



#5 Sponji   Members   -  Reputation: 1655

Like
1Likes
Like

Posted 29 July 2014 - 03:10 AM

And are you actually defining GL_GLEXT_PROTOTYPES so that glGetStringi gets declared? I just made a simple test, and compiling it as C++ produces an error.

 

gcc -Wall -g test.c $(pkg-config --libs --cflags sdl2) -lSOIL -lGL -lGLEW && ./a.out:

test.c:10:3: warning: implicit declaration of function ‘glGetStringi’ [-Wimplicit-function-declaration]

g++ -Wall -g test.cpp $(pkg-config --libs --cflags sdl2) -lSOIL -lGL -lGLEW && ./a.out:

test.cpp:10:53: error: ‘glGetStringi’ was not declared in this scope

Edited by Sponji, 29 July 2014 - 03:10 AM.

Derp

#6 Athe   Members   -  Reputation: 107


Posted 29 July 2014 - 03:49 AM

I am not defining GL_GLEXT_PROTOTYPES myself, but I assume that GLFW/GLEW or some other header file that I'm using defines it, because I do not get that compilation error.



#7 Athe   Members   -  Reputation: 107


Posted 29 July 2014 - 03:55 AM

OK, so now I just tried creating a small test program in C like this:

#include <GLFW/glfw3.h>
#include <stdio.h>

int main(void)
{
        GLFWwindow *window;
        const GLubyte *ext;

        if(!glfwInit()) {
                fprintf(stderr, "Failed to initialize GLFW\n");
                return -1;
        }

        glfwWindowHint(GLFW_SAMPLES, 4); // 4x antialiasing
        glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3); // OpenGL 3.3
        glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
        glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE); // Apparently needed "to make OS X happy"
        glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

        window = glfwCreateWindow(1024, 768, "Tutorial 01", NULL, NULL);
        if(window == NULL) {
                fprintf(stderr, "Failed to open GLFW window.");
                glfwTerminate();
                return -1;
        }

        glfwMakeContextCurrent(window);

        ext = glGetStringi(GL_EXTENSIONS, 0);
        printf("%s\n", (const char*)ext);
        return 0;
}

And it failed just like before. The pointer returned from glGetStringi is not correct: the upper 4 bytes are all 0xFF, just as if a 32-bit return value were being handed to my 64-bit program and sign-extended into the upper bytes.

 

However, if I do what you said and define GL_GLEXT_PROTOTYPES, it works! That seems really weird to me, but I'm glad it solves the problem!
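For anyone who runs into the same thing, the fix boils down to defining the macro before any GL header is included (a sketch of the top of the test program above, not the exact file), or passing -DGL_GLEXT_PROTOTYPES to the compiler:

/* must come before any GL headers so GL/glext.h emits the real prototype
   for glGetStringi, instead of C falling back to an implicit int-returning
   declaration that truncates the 64-bit pointer */
#define GL_GLEXT_PROTOTYPES
#include <GLFW/glfw3.h>
#include <stdio.h>

That also explains the difference between the two builds in Sponji's test: C++ refuses to call an undeclared function outright (hence the error), while C falls back to an implicit declaration and only warns, which is why the bug showed up at runtime instead of compile time.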





