
smart_idiot

Where, oh where, has the texture data gone?


smart_idiot    1298
How do I use glGetTexImage(GLenum target, GLint level, GLenum format, GLenum type, GLvoid *pixels)? Every time I try to use it, my program crashes.

I use glGetTexLevelParameteriv(GLenum target, GLint level, GLenum pname, GLint *params) to get the dimensions of my texture. It says 256x256x4, so I know that it's bound and that everything is A-okay. I can even see it on the screen. I make a buffer, unsigned char buf[256*256*4];, and everything is happy. But I type:

glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, buf);

and my program crashes. I type:

unsigned char *buf = NULL;
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, &buf);

thinking that maybe it allocates the right amount of memory on its own. But again it crashes. I type:

unsigned char buf[256*256*256];
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, buf);

thinking that maybe the data was aligned on some boundary and needed more memory, but again it crashed. Finally, getting really angry, I type:

glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);

just to see if it wouldn't simply do nothing. It still crashed. What am I doing wrong?

Sure, I wouldn't need to do this if I kept the original copy, but I don't want to have several copies of the texture in memory if I don't need to. Or maybe that is the problem, and I shouldn't free the texture data after I call glTexImage2D? It never caused a problem before, though; I assumed OpenGL made its own copy. If I'm supposed to be in control of the data, though, I wouldn't need glGetTexImage in the first place, so I don't think that's the problem.

Help, please?
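For reference, here is a stripped-down version of the call sequence (texture_id is just a placeholder for whatever handle the texture was created with, and this assumes gl.h is included and a rendering context is current):

glBindTexture(GL_TEXTURE_2D, texture_id);                                // the texture I can see on screen
GLint width = 0, height = 0;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_WIDTH,  &width);   // reports 256
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_HEIGHT, &height);  // reports 256
unsigned char buf[256*256*4];                                            // 4 bytes per pixel for GL_RGBA / GL_UNSIGNED_BYTE
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, buf);         // crashes here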

vincoof    514
First things first, what do you mean by 'it crashes'?
According to the OpenGL specification, misuse of GL commands only generates errors, not crashes.
It can crash if not enough memory is allocated for pixels, though (it's up to you to allocate the memory; the GL assumes the memory has already been allocated when you pass the argument).
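If the call itself is being rejected rather than writing past your buffer, glGetError will tell you. A quick check, just a sketch:

glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, buf);
GLenum err = glGetError();        // GL_NO_ERROR means the GL accepted the call
if (err != GL_NO_ERROR)
{
    // GL_INVALID_ENUM / GL_INVALID_VALUE / GL_INVALID_OPERATION point at which argument was rejected
}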

btw, yes, the texture is stored in server memory. You don't have to keep a duplicate on the client side, though you do have to be sure the texture has been processed before deleting that client data. (Just in case you don't know: 'server' roughly means 'OpenGL' and 'client' roughly means 'your application'.)
Be careful to call glFlush or glFinish before erasing your (client) texture pointer.

You should also check GL_PACK_ALIGNMENT, even though it would not be a problem if you allocate 256*256*256 bytes.
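Setting it explicitly before the readback costs nothing; something like this (again just a sketch, using the same buf as above):

glPixelStorei(GL_PACK_ALIGNMENT, 1);                               // pack rows tightly when reading pixels back
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, buf);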

Do you always check the value of 'pixels' before you pass it as an argument?
Maybe your 256*256*256 array is not allocated, because it's a big hit on the stack (that's 16MB as a local array, far more than the default stack size on most systems).

And the last thing you should try is comparing with another GL implementation:
1- download the latest drivers (btw, what's your card?)
2- check your exe on another computer.

smart_idiot    1298
It crashes inside 'gdi???.dll'. I don't know the exact name, because my program is at home, I don't have internet at home, and I'm at school right now.

I'll try unsigned char *buf = malloc(256*256*256); instead, check for NULL, and see if that works.

I'll tell you if it works tomorrow.

_DarkWIng_    602
Your buffer should be something like this:

unsigned char* buff = new unsigned char[ image_width * image_height * image_depth ];
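(And remember a matching delete [] buff; once you're done with the pixels.) Using new also keeps the buffer off the stack; the 256*256*256 local array from the first post is 16MB, which is more than the default stack size on most systems, so that particular test was probably overflowing the stack all by itself.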



There are more worlds than the one that you hold in your hand...
