Trouble using glGetTexImage

I'm trying to create a function that takes an array of textures and compiles them all into one big texture.

I'll be doing this using glTexSubImage2D. However, to do that I need to get the pixel data from each texture, which I'm doing with glGetTexImage. Here's my code so far:


GLuint CreateMegaTexture(vector<GLuint> texies){
    glBindTexture(GL_TEXTURE_2D, texies.at(0));

    GLint textureWidth, textureHeight;
    int bytes;
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &textureWidth);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_HEIGHT, &textureHeight);
    bytes = textureWidth*textureHeight;

    float Pixels[bytes];

    glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, &Pixels);
    //GLenum error = glGetError();//Get error
}


For now, I'm just testing by trying to get the pixel data of the first element in the "texies" vector. First I query the size of the texture so that I know how large the array holding the pixel data needs to be. And here's where the problem comes in.

The documentation said the "Pixels" array would need to be of the same type that I chose, which is "GL_UNSIGNED_BYTE" in my case. However, if I do:

GL_UNSIGNED_BYTE Pixels[bytes];

I get an error:

"error: expected ';' before 'Pixels'

If I try GLfloat instead:

GLfloat Pixels[bytes];
glGetTexImage(GL_TEXTURE_2D,0,GL_RGBA,GLfloat,&Pixels);


I get an even stranger error:

expected primary-expression before ',' token

I don't think it will work if I leave the array as simply "float" (although the compiler doesn't throw any errors), so I thought I should fix this issue before moving on to glTexSubImage2D.

Any help would be appreciated, thanks!

// GLubyte is the GL defined type for unsigned bytes
// make sure that "bytes" is large enough; since each texel has 4 components
// (r, g, b and a) "bytes = textureWidth * textureHeight;" is not sufficient.
// use "bytes = textureWidth * textureHeight * 4;" instead
// you probably don't want to declare this as a local either...
GLubyte Pixels[bytes];

// don't use &Pixels as that will get the address of the array. Just Pixels on
// its own is what you need
glGetTexImage (GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, Pixels);

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.
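Putting that advice together, a minimal sketch of the read-back might look like the following. This is only illustrative: the helper name ReadTexturePixels and the use of std::vector are assumptions, not code from anyone's post, and the GL header/loader setup will vary by platform.

#include <GL/gl.h>   // or whatever extension loader / platform header you use
#include <vector>

std::vector<GLubyte> ReadTexturePixels(GLuint texture)
{
    glBindTexture(GL_TEXTURE_2D, texture);

    GLint width = 0, height = 0;
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &width);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_HEIGHT, &height);

    // 4 bytes per texel for GL_RGBA / GL_UNSIGNED_BYTE
    std::vector<GLubyte> pixels(static_cast<size_t>(width) * height * 4);

    // pixels.data() is already a GLubyte*, which is what glGetTexImage expects
    glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels.data());
    return pixels;
}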

Extra help. Example "Getting image data from texture":

GLint width, height, internalFormat;
glBindTexture(GL_TEXTURE_2D, your_texture_id);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_COMPONENTS, &internalFormat); // get internal format of the GL texture
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &width);               // get width of the GL texture
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_HEIGHT, &height);             // get height of the GL texture
// GL_TEXTURE_COMPONENTS and GL_TEXTURE_INTERNAL_FORMAT are the same thing.
// just work with RGB8 and RGBA8
GLint numBytes = 0;
switch(internalFormat) // determine what format the GL texture has...
{
case GL_RGB:
    numBytes = width * height * 3;
    break;
case GL_RGBA:
    numBytes = width * height * 4;
    break;
default: // unsupported format (or you can add code here to support more formats if you need them)
    break;
}

if(numBytes)
{
    unsigned char *pixels = (unsigned char*)malloc(numBytes); // allocate image data in RAM
    glGetTexImage(GL_TEXTURE_2D, 0, internalFormat, GL_UNSIGNED_BYTE, pixels);
    {
        // TODO with pixels
    }
    free(pixels); // free the memory when you no longer need 'pixels' to avoid a leak
}



Best wishes, FXACE.
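Since the stated goal was to pack each texture's pixels into one big texture with glTexSubImage2D, here is a rough sketch of that next step. Everything in it is an assumption for illustration (the function name, and that the destination texture was already created large enough with glTexImage2D); it is not code from the thread.

// Copies an RGBA pixel block (as read back above) into an already-created
// destination texture at the given offset. Assumes a GL header is included.
void BlitIntoMegaTexture(GLuint megaTexture,
                         GLint xOffset, GLint yOffset,
                         GLsizei width, GLsizei height,
                         const GLubyte *pixels)
{
    glBindTexture(GL_TEXTURE_2D, megaTexture);
    glTexSubImage2D(GL_TEXTURE_2D,    // target
                    0,                // mipmap level
                    xOffset, yOffset, // where the sub-image goes
                    width, height,    // size of the sub-image
                    GL_RGBA,          // format of the data in 'pixels'
                    GL_UNSIGNED_BYTE, // type of the data in 'pixels'
                    pixels);          // data read back with glGetTexImage
}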
A final note here is that this isn't actually OpenGL-specific stuff; it's basic C/C++: allocating sufficient memory, getting data types right, knowing how to use arrays and pointers, etc. I'd advise focusing some effort on this area before diving into OpenGL itself.

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.

Thanks for the help everybody! I got it to work.

To mhagain,
I'm pretty sure I need to use the address of the Pixels array because, according to the documentation, this function returns the pixel data *in* the Pixels array, so if I don't pass a reference it won't be able to update the values in the Pixels array.

And the problem was me not knowing what the data type for an unsigned byte is called in OpenGL, not simply a general C++ problem.

To FXACE, thanks again! I really, really appreciate the detailed explanation and all your help so far with all my topics :P
Identifier for an array IS a reference to the array data.

Identifier for an array IS a reference to the array data.


Wait, so if Points is an array, what's the difference between Points and &Points?
The difference is the type of the expression: Points refers to the array itself (type is GLubyte [N], where N is the number of elements in the array), while &Points is a pointer to the array (type is GLubyte (*)[N]). They are different types, and they have different properties when it comes to which other pointer types they decay to or can be implicitly converted into. But as long as they decay to or can be converted into another pointer type, they will have the same actual value: the address of the first element of the array.
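As a tiny illustrative snippet of that distinction (not from any post in the thread): both expressions evaluate to the same address, but their types differ.

GLubyte Pixels[16];

GLubyte *p1 = Pixels;        // array decays to a pointer to its first element
GLubyte (*p2)[16] = &Pixels; // pointer to the whole array -- a different type

// GLubyte *p3 = &Pixels;    // error: cannot convert GLubyte (*)[16] to GLubyte*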
Look at another example - different API, same principle. Good old-fashioned memcpy.

If you want to memcpy to an array you use:

type myarray[size];
memcpy (myarray, stuff, length);

If you want to memcpy to a pointer you use:

type *mypointer; // make sure this points somewhere valid!!!
memcpy (mypointer, stuff, length);

Again, this is basic C/C++ stuff and nothing to do with OpenGL.

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.

IMO, use a third-party texture loader like DevIL or SOIL.
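For reference, a minimal sketch of what loading through SOIL looks like, assuming SOIL is installed and the image path is valid; the wrapper function name is just for illustration, and the header path may differ depending on how SOIL was set up.

#include <SOIL.h>
#include <cstdio>

GLuint LoadTextureWithSOIL(const char *path)
{
    GLuint tex = SOIL_load_OGL_texture(
        path,
        SOIL_LOAD_AUTO,      // keep the file's channel count
        SOIL_CREATE_NEW_ID,  // let SOIL generate a texture id
        SOIL_FLAG_MIPMAPS);  // build mipmaps on upload

    if (tex == 0)
        printf("SOIL loading error: %s\n", SOIL_last_result());

    return tex;
}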

This topic is closed to new replies.
