Bozebo

OpenGL app crash while loading textures


I have a texture class and a method to load raw image files into it. It works completely fine if I leave size at its default, but whenever I pass anything else my program just crashes. I don't think the array can be going out of range, but that seems to be what is happening; no matter how much I look at it I can't find any problem. It even crashes if I pass a size smaller than 128 (so no value should be too big to store). If I use a raw image of dimensions 256x256 and leave the size parameter empty it works correctly (it loads half the bytes as a 128x128 texture, which displays oddly of course, but the function is doing its job). The variables size and path are not used or set anywhere else, so there can't be some strange global issue going on. Can anybody see the problem that I cannot?
class texture{
  public:
  GLuint texture; //texture resource
  
  //24bpp raw image file
  bool loadRawRGB(char* path,int size = 128){
    int bytes = size * size * 3; //3 bytes per pixel
    BYTE data[bytes]; //buffer to hold raw image data
    FILE * file; //file handle
    
    //open and read texture data
    file = fopen(path,"rb"); //attempt to open the file
    if(!file) //don't continue if it couldn't be opened
      return false;
    
    fread(&data,bytes,1,file); //copy file contents into the buffer
    fclose(file); //close the file
    
    //allocate a texture resource
    glGenTextures(1,&texture);
    //set as target
    glBindTexture(GL_TEXTURE_2D,texture);
    //tell OpenGL to build the texture mipmap pyramid
    gluBuild2DMipmaps(GL_TEXTURE_2D,3,size,size,GL_RGB,GL_UNSIGNED_BYTE,&data);
       
    //free buffer
    free(&data);
    
    return true;
  }
};

//example use
texture brickWall;
brickWall.loadRawRGB("resources/textures/brickWall.raw");


Thanks for your time. [Edited by - Bozebo on March 18, 2010 1:47:46 PM]

Quote:
Original post by HuntsMan
BYTE data[bytes];

Use new to create a dynamic array of bytes, not a stack one.


No... I couldn't see why that would help, but I tried it anyway:


int bytes = size * size * 3; //3 bytes per pixel
BYTE * data;
data = new BYTE[bytes]; //buffer to hold raw image data
FILE * file; //file handle

Exact same issue. The problem has nothing to do with dynamic arrays. This is really frustrating; it is holding me back, and there doesn't seem to be anything wrong with it :S

edit:
It seems I was wrong about values smaller than 128 not working; I must have been doing something wrong, because when I tested again (both with a dynamic array and without) it works. But I still cannot get 512 working. Surely an array can have 512x512x3 = 786432 elements? If not, how do I load the texture? Load it in 2 parts?

[Edited by - Bozebo on March 18, 2010 8:19:09 PM]

I'm actually surprised the compiler lets you allocate an array on the stack when it doesn't know the size at compile time. You would normally have to do what HuntsMan suggested and use a dynamic array, since it is dynamically sized.

Also, why are you calling free? If it is stack allocated then there is nothing to free, and if it is created with new[] then it should be destroyed with delete [].
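For illustration, a minimal sketch of how the allocation and cleanup could be paired inside a loader like yours (same names as your code; note that a 512x512x3 buffer is roughly 768 KB, which is also why putting it on the stack is risky):

int bytes = size * size * 3;
BYTE * data = new BYTE[bytes]; //heap allocation, size known only at runtime

FILE * file = fopen(path,"rb");
if(!file){
  delete [] data; //release the buffer on the early-out too
  return false;
}

fread(data,bytes,1,file);
fclose(file);

//... upload to OpenGL here ...

delete [] data; //new[] pairs with delete[], never free()
return true;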

Do you know which line it crashes on? That would be a major help obviously. Put a breakpoint in and step through to find which line causes it.

OK, right, I've edited this: I think I have figured it out.

There were a few different issues; the one I ended up with here was that I forgot to change:
gluBuild2DMipmaps(GL_TEXTURE_2D,3,size,size,GL_RGB,GL_UNSIGNED_BYTE,&data);
to:
gluBuild2DMipmaps(GL_TEXTURE_2D,3,size,size,GL_RGB,GL_UNSIGNED_BYTE,data);

Silly really, data was a pointer already.
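
For the record, a quick sketch of the difference:

BYTE * data = new BYTE[bytes];
//data  -> address of the first byte of the buffer (what GL expects)
//&data -> address of the pointer variable itself, so GL reads a few
//         bytes of pointer and then wanders off into invalid memory
//(with the old stack array, &data and data happened to be the same
// address, which is why that version got away with it)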

It is behaving oddly, though; I need to work with it a bit and check I am loading the texture correctly. No crashes now, but the texture appears mostly black with some small areas of dull brown.

[Edited by - Bozebo on March 18, 2010 9:45:34 PM]

I wasn't expecting that one either. I had my money on it being the free (although I'd have also thought that would've crashed no matter the size).

The most likely cause is incorrect format/internalFormat values, although that should just raise an error, not crash. I haven't used any of the GLU stuff for a while, so that's a shot in the dark.

You should really be using either the GL_GENERATE_MIPMAP texture parameter or (even better) the glGenerateMipmap function; they're both full of hardware accelerated goodness. They should both be available on most machines.
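
Roughly, the two routes look like this (a sketch assuming your data/size variables; GL_GENERATE_MIPMAP needs GL 1.4+, glGenerateMipmap needs GL 3.0 or the framebuffer object extension):

//legacy route: ask the driver to (re)build mipmaps whenever the texture is uploaded
glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, size, size, 0, GL_RGB, GL_UNSIGNED_BYTE, data);

//modern route: upload first, then generate the whole pyramid explicitly
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, size, size, 0, GL_RGB, GL_UNSIGNED_BYTE, data);
glGenerateMipmap(GL_TEXTURE_2D);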

Does it crash when just using glTexImage2D instead?

glTexImage2D(GL_TEXTURE_2D,0,3,size,size,0,GL_RGB,GL_UNSIGNED_BYTE,data);
makes the texture completely white :S

GL_GENERATE_MIPMAP and glGenerateMipmap are both unrecognised, so where do I get a new OpenGL version from? Might it be that I have some old version which doesn't like textures of 512x512?

edit:
FAQ
Oh. "Although the current OpenGL version is 1.5 (as of 2004/03/29)"
Still outdated.

edit:
(ok why doesn't that link work?)

Yip, the bane of everyone who uses OpenGL: the extensions =P
GLEW and GLee are your options on this one. I use GLEW myself; it pretty much reduces the whole thing down to one function call which gives you access to everything.
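
A sketch of the usual GLEW setup, assuming the headers and library are installed:

#include <GL/glew.h> //must be included before gl.h

//after the rendering context has been created:
GLenum err = glewInit();
if(err != GLEW_OK){
  //glewGetErrorString(err) describes what went wrong
}
//extension entry points such as glGenerateMipmap can now be called
//(check e.g. GLEW_VERSION_3_0 or the relevant extension flag first)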

However, back to your white texture. Chances are all you'll have to do now is turn off mipmapping for that texture (just as a test, obviously).
Just put this before your call to glTexImage2D and you should be able to see if that works.

glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );

I seem to remember that being the cause last time I got a full white texture, I'm still not sure why it was because of the mipmapping.

Edit: To make links work you just use the html tags for it. < a href="target"></a > (without the spaces)

Oops, I forgot the gamedev.net forums use some html >_< I got confused because I was using bbcode tags for code blocks.

Anyway:
No, I wasn't actually using mipmapping before, oddly. I recycled that function from somewhere else to start with and didn't realise I would be better off using glTexImage2D.

This is where I load some textures to test the simple scene I am making:

//choose filtering options
if(linear){
  glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR);
  glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR);
} else {
  //nearest-neighbour filtering
  glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_NEAREST);
  glTexParameterf(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_NEAREST);
}

//colours of this texture chosen to test blending of the particles etc
//also good for seeing world scale
texture texChecks;
texChecks.loadRawRGB("resources/textures/checkered2.raw");

//brick wall texture
texture texBricks;
texBricks.loadRawRGB("resources/textures/brick080.raw",512);

Annoyingly, none of the opaque textures are working; they are all white now.

I have a loadRawRGBA method in the texture class, which is used to load a pane of glass and some particle textures. That method basically gives each pixel an alpha value equal to its red value (yes, cheap, but all I wanted was some basic "greyscale" alpha, and it let me play about with the texture formats a bit).
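(Roughly, the expansion in loadRawRGBA does something like this; rgb and rgba are hypothetical buffer names:)

for(int i = 0; i < size * size; ++i){
  rgba[i*4+0] = rgb[i*3+0]; //red
  rgba[i*4+1] = rgb[i*3+1]; //green
  rgba[i*4+2] = rgb[i*3+2]; //blue
  rgba[i*4+3] = rgb[i*3+0]; //alpha copied from red
}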

I am still really confused :S

Just to make sure, here is my current loadRawRGB method:

bool loadRawRGB(char* path,int size = 128){
  int bytes = size * size * 3;
  BYTE * data;
  data = new BYTE[bytes];
  //BYTE * data[bytes];
  FILE * file;

  file = fopen(path,"rb");
  if(!file)
    return false;

  fread(data,bytes,1,file);
  fclose(file);

  glGenTextures(1,&texture);
  glBindTexture(GL_TEXTURE_2D,texture);
  glTexImage2D(GL_TEXTURE_2D,0,3,size,size,0,GL_RGB,GL_UNSIGNED_BYTE,data);

  return true;
}

I am so puzzled. If I go back to gluBuild2DMipmaps(GL_TEXTURE_2D,3,size,size,GL_RGB,GL_UNSIGNED_BYTE,data); and just avoid any textures larger than 128x128 it works fine; with glTexImage2D none of them work. I don't understand the purpose of border, but I assume it should be 0? Setting it to 1 doesn't help.

Texture parameters can only be applied to the currently bound texture; they can't be applied globally. In the code you posted, there is no texture bound when you set the minification filter. Also, the min/magnification filters are set via glTexParameteri, not glTexParameterf as your code states; they're integer flags, not floating point values =].

So when you load the texture, it takes the default per-texture values. According to the docs the default value for the minification filter is GL_NEAREST_MIPMAP_LINEAR, meaning that mipmaps are enabled. =P

Try pasting the code I posted between your calls to glBindTexture and glTexImage2D.
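
Something like this (a sketch based on your posted loader; I've also freed the buffer after the upload, since OpenGL takes its own copy of the pixels):

glGenTextures(1,&texture);
glBindTexture(GL_TEXTURE_2D,texture); //the parameters below affect this texture
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR); //no mipmap levels needed
glTexImage2D(GL_TEXTURE_2D,0,GL_RGB,size,size,0,GL_RGB,GL_UNSIGNED_BYTE,data);
delete [] data; //the driver has its own copy now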
