The_Nerd

OpenGL glTexImage2D causing a SEGFAULT


Hello all. I have been working with OpenGL for many years, but I have never had a problem this strange. Can anyone spot what is causing the SEGFAULT in my code below? Note: if I scale my malloc to allocate twice as much memory, the SEGFAULT goes away... Note also that I am wrapping OpenGL because my program can use a number of different "contexts". One more note: none of the textures in my font are power-of-two textures (they all have non-power-of-two dimensions), but I checked and my video card does report GL_ARB_texture_non_power_of_two.



nstd::sint32 font_face::load_glyph(font_glyph &glyph)
{
    nstd::uint32 iTexI;
    printf("Gen textures\n");
    context::gen_textures(1, &iTexI);
    printf("Bind texture\n");
    glyph.user_index = iTexI;
    context::bind_texture(NGUI_TEXTURE_2D, iTexI);

    context::tex_parameterf(NGUI_TEXTURE_2D, NGUI_TEXTURE_WRAP_S, NGUI_CLAMP);
    context::tex_parameterf(NGUI_TEXTURE_2D, NGUI_TEXTURE_WRAP_T, NGUI_CLAMP);

    context::tex_parameteri(NGUI_TEXTURE_2D, NGUI_TEXTURE_MIN_FILTER, NGUI_LINEAR_MIPMAP_NEAREST);
    context::tex_parameteri(NGUI_TEXTURE_2D, NGUI_TEXTURE_MAG_FILTER, NGUI_LINEAR);

    nstd::uint8 *pixel_data;
    printf("Malloc glyph: %d\n", glyph.image->format.pitch);
    pixel_data = (nstd::uint8 *)malloc((glyph.image->w * glyph.image->h * glyph.image->format.pitch) * 2);
    if (pixel_data == NULL)
        context::tex_image(NGUI_TEXTURE_2D, 0, NGUI_RGBA, (unsigned int)glyph.image->w, (unsigned int)glyph.image->h, 0, NGUI_LUMINANCE, NGUI_UNSIGNED_BYTE, glyph.image->pixels);
    else
    {
        printf("Assign\n");
        for (size_t l = 0; l < (size_t)(glyph.image->w * glyph.image->h * glyph.image->format.pitch); l++)
            memset(&pixel_data[l * 2], glyph.image->pixels[l * glyph.image->format.pitch], 2);

        printf("Bind %d, %d, %p\n", (unsigned int)glyph.image->w, (unsigned int)glyph.image->h, (void *)pixel_data);
        context::tex_image(NGUI_TEXTURE_2D, 0, NGUI_RGBA, (unsigned int)glyph.image->w, (unsigned int)glyph.image->h, 0, NGUI_LUMINANCE_ALPHA, NGUI_UNSIGNED_BYTE, pixel_data);

        printf("Free\n");
        ::free(pixel_data);
    }

    return 0;
}



I am dying here! Thanks for the help!

What is glyph.image->format.pitch?
Since the segfault disappears when you allocate more memory, the cause must be that you're not allocating enough memory in the first place.
Since you are using RGBA as the internal format, you are telling OpenGL to use 4 bytes per pixel. So unless that pitch is 4, it'll crash. (I'm assuming that "* 2" is your allocate-twice-as-much-memory test.)
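One thing worth spelling out here: according to the GL specification, the number of bytes glTexImage2D reads from the *client* buffer is determined by the format and type arguments, not by the internal format. A quick helper to sanity-check the allocation against what the upload will actually read (a sketch; the enum names here are illustrative, not from the poster's code):

```cpp
#include <cassert>
#include <cstddef>

// Components per pixel for a few common client-side formats.
// With GL_UNSIGNED_BYTE, each component occupies one byte.
enum pixel_format { FMT_LUMINANCE = 1, FMT_LUMINANCE_ALPHA = 2,
                    FMT_RGB = 3, FMT_RGBA = 4 };

// Bytes glTexImage2D will read from the client buffer for a tightly
// packed image (ignoring GL_UNPACK_ALIGNMENT row padding).
std::size_t required_bytes(std::size_t w, std::size_t h, pixel_format fmt)
{
    return w * h * static_cast<std::size_t>(fmt);
}
```

So a LUMINANCE_ALPHA upload of a W x H glyph needs W * H * 2 bytes in the client buffer, regardless of whether the internal format is RGBA.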


Can you tell us which of the two TexImage2D calls you're getting the segfault from? Also inspect the contents of variables and members in the debugger just before the TexImage2D call and make sure there aren't crazy values in there - the kind of thing that can happen if a bad pointer elsewhere is clobbering data on you. If everything else seems OK, maybe step through font_face::load_glyph in the debugger line by line, examining local and member variables at each step, and making sure that everything is as you expect it to be.

The most common causes of crashes during texture uploads are either not enough memory allocated or a pointer that is somehow invalid, so that's something else to check.

Sorry, I forgot to mention the pitch. The pitch is the image's number of bytes per pixel; in this case the font glyph is a 1-byte-per-pixel image (just gray). I thought that OpenGL did its own internal allocation, and that it would allocate memory based on the internal format but expand or compress the data as needed to fit that format. You see, I believe GL_LUMINANCE_ALPHA is a two-component format, and I am using unsigned bytes for the data, so that makes the needed space (W * H * 2), correct? (You are right about the pitch; I shouldn't have it in there, but it is 1.) I have already run the code through the debugger and the memory pointer is fine. The stack fails somewhere deep inside OpenGL.

So you are saying that OpenGL will run through MY buffer based on the internal RGBA format, and not my format of GL_LUMINANCE_ALPHA @ GL_UNSIGNED_BYTE?
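For what it's worth, the luminance-to-luminance-alpha expansion itself (W * H source bytes in, W * H * 2 bytes out) can be done without memset tricks. A minimal sketch, assuming the pitch really is 1 and that, as in the original loop, the glyph's coverage value is duplicated into the alpha channel:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Expand a 1-byte-per-pixel luminance image into an interleaved
// 2-byte luminance+alpha buffer suitable for a GL_LUMINANCE_ALPHA /
// GL_UNSIGNED_BYTE upload.
std::vector<std::uint8_t> expand_la(const std::uint8_t *src,
                                    std::size_t w, std::size_t h)
{
    std::vector<std::uint8_t> dst(w * h * 2);
    for (std::size_t i = 0; i < w * h; ++i) {
        dst[i * 2 + 0] = src[i]; // luminance
        dst[i * 2 + 1] = src[i]; // alpha = glyph coverage
    }
    return dst;
}
```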

Okay, I got it working... kinda...

I tried scaling my image data so that both internal format and input format are RGBA. This resolved my SEGFAULT, but now I have an even stranger issue. All my textures are white. But here is the funny thing:

1. All of them are powers of two size-wise (256x128, 512x512, etc...)
2. My program works fine if I use gluBuild2DMipmaps
3. Mipmapping IS DISABLED!

What am I doing wrong?


nstd::sint32 font_face::load_glyph(font_glyph &glyph)
{
    context::enable(NGUI_TEXTURE_2D);
    context::gen_textures(1, &glyph.user_index);
    context::bind_texture(NGUI_TEXTURE_2D, glyph.user_index);

    context::tex_parameteri(NGUI_TEXTURE_2D, NGUI_TEXTURE_WRAP_S, NGUI_CLAMP);
    context::tex_parameteri(NGUI_TEXTURE_2D, NGUI_TEXTURE_WRAP_T, NGUI_CLAMP);

    /* Disable mipmapping */
    context::tex_parameteri(NGUI_TEXTURE_2D, NGUI_TEXTURE_MIN_FILTER, NGUI_NEAREST);
    context::tex_parameteri(NGUI_TEXTURE_2D, NGUI_TEXTURE_MAG_FILTER, NGUI_NEAREST);

    nstd::uint16 new_width = power_of_two(glyph.image->w), new_height = power_of_two(glyph.image->h);
    nstd::uint8 *pixel_data = glyph.scale_glyph(new_width, new_height, 4);

    if (pixel_data == NULL)
    {
        context::tex_image(NGUI_TEXTURE_2D, 0, NGUI_RGBA, (unsigned int)glyph.image->w, (unsigned int)glyph.image->h, 0, NGUI_LUMINANCE, NGUI_UNSIGNED_BYTE, glyph.image->pixels);
    }
    else
    {
        /* This call produces the white texture */
        glTexImage2D(NGUI_TEXTURE_2D, 0, NGUI_RGBA, (unsigned int)new_width, (unsigned int)new_height, 0, NGUI_RGBA, NGUI_UNSIGNED_BYTE, pixel_data);
        /* Works if I use this instead */
        gluBuild2DMipmaps(NGUI_TEXTURE_2D, NGUI_RGBA, (unsigned int)new_width, (unsigned int)new_height, NGUI_RGBA, NGUI_UNSIGNED_BYTE, pixel_data);
        ::free(pixel_data);
    }

    return 0;
}


By the way, I also checked glGetError() after calling glTexImage2D and it is returning GL_NO_ERROR. What am I doing wrong?
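As an aside, the power_of_two() helper the listing above calls isn't shown; presumably it rounds a glyph dimension up to the next power of two. A minimal sketch of such a helper, for reference (an assumption about its behavior, not the poster's actual implementation):

```cpp
#include <cstdint>

// Round v up to the next power of two (17 -> 32, 256 -> 256).
// Computed in 32 bits so widths up to 65535 don't overflow the loop.
std::uint16_t next_pow2(std::uint16_t v)
{
    std::uint32_t p = 1;
    while (p < v)
        p <<= 1;
    return static_cast<std::uint16_t>(p);
}
```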

A white texture normally indicates an invalid texture object. gluBuild2DMipmaps does a LOT of things internally besides just building a mipmap chain; it checks against the max texture size allowed by your hardware, resizes to powers of 2, sets pixel store and transfer parameters, and so on. These are all potential items for you to check, especially that you're not exceeding GL_MAX_TEXTURE_SIZE.

I read somewhere that glGetError is not totally reliable for texture creation. Maybe try creating a proxy texture instead?

I have already checked all the things you mentioned:

1) Mipmapping is off.
2) The texture size is well within the card's limits.
3) All textures are power-of-two textures.
4) I even tried power-of-two square textures... no effect.
5) I have visually inspected my data... it is good.

One thing I haven't checked is SDL functions. Aren't there SDL functions to modify the OpenGL context? Maybe one or more of those need calling...

Please help me... I have tried everything I know of, including the pixel unpack alignment, texture modulate, the calling order of functions, etc. What am I doing wrong? Why is my texture all white when I call glTexImage2D, but it works fine when I call gluBuild2DMipmaps? Please help!
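On the unpack alignment point specifically: glTexImage2D assumes each source row starts on a GL_UNPACK_ALIGNMENT boundary (default 4), so odd widths with narrow pixel formats get implicit row padding, a classic cause of skewed or garbage glyph uploads. The stride GL expects can be checked numerically (an illustrative helper, not from the thread's code):

```cpp
#include <cstddef>

// Row stride, in bytes, that glTexImage2D assumes for the client
// buffer under a given GL_UNPACK_ALIGNMENT: the tightly packed row
// size rounded up to the next multiple of the alignment.
std::size_t unpack_row_stride(std::size_t width, std::size_t bytes_per_pixel,
                              std::size_t alignment)
{
    std::size_t row = width * bytes_per_pixel;
    return (row + alignment - 1) / alignment * alignment;
}
```

For example, a 17-pixel-wide luminance-alpha row is 34 bytes tightly packed, but GL reads 36 bytes per row under the default alignment of 4; calling glPixelStorei(GL_UNPACK_ALIGNMENT, 1) makes the two match.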

Okay... I figured it out, and it is rather embarrassing... My wrapper functions for glTexParameter* were never calling the corresponding GL functions... works great now! :D
