

Trouble using glGetTexImage


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

9 replies to this topic

#1 OmarShehata   Members   -  Reputation: 205


Posted 28 March 2012 - 11:16 PM

I'm trying to create a function that takes an array of textures and compiles them all into one big texture.

I'll be doing this using glTexSubImage2D. However, to do that, I need the pixel data from each texture, so I'm using glGetTexImage. Here's my code so far:

GLuint CreateMegaTexture(vector<GLuint> texies){
    glBindTexture(GL_TEXTURE_2D, texies.at(0));

    GLint textureWidth, textureHeight;
    int bytes;
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &textureWidth);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_HEIGHT, &textureHeight);
    bytes = textureWidth * textureHeight;

    float Pixels[bytes];

    glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, &Pixels);
    //GLenum error = glGetError();//Get error
}

For now, I'm just testing by trying to get the pixel data of the first element in the "texies" array. So first I get the size of the texture so that I know how large I need to make the array to hold the pixel data. Now there is the problem.

The documentation said the "Pixels" array would need to be of the same type that I chose, which is "GL_UNSIGNED_BYTE" in my case. However, if I do:

GL_UNSIGNED_BYTE Pixels[bytes];

I get an error:

"error: expected ';' before 'Pixels'"

If I try GLfloat instead:

GLfloat Pixels[bytes];
glGetTexImage(GL_TEXTURE_2D,0,GL_RGBA,GLfloat,&Pixels);


I get an even stranger error:

expected primary-expression before ',' token

I don't think it will work if I just declare the array as "float" (although the compiler doesn't throw any errors), so I thought I should fix this issue before moving on to glTexSubImage2D.

Any help would be appreciated, thanks!


#2 mhagain   Crossbones+   -  Reputation: 8275


Posted 29 March 2012 - 03:51 AM


// GLubyte is the GL-defined type for unsigned bytes.
// Make sure that "bytes" is large enough; since each texel has 4 components
// (r, g, b and a), "bytes = textureWidth * textureHeight;" is not sufficient.
// Use "bytes = textureWidth * textureHeight * 4;" instead.
// You probably don't want to declare this as a local either...
GLubyte Pixels[bytes];

// Don't use &Pixels, as that will get the address of the array. Just Pixels on
// its own is what you need.
glGetTexImage (GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, Pixels);
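The sizing advice in the comments above is plain arithmetic and can be checked without a GL context. A minimal sketch (the helper name is mine, not from the thread):

```c
#include <stddef.h>

/* Bytes needed to read back a texture as GL_RGBA / GL_UNSIGNED_BYTE:
   four components (r, g, b, a) of one byte each per texel. */
static size_t rgba8_buffer_size(size_t width, size_t height)
{
    return width * height * 4;
}
```

For a 256x256 texture this gives 262,144 bytes; width * height alone would under-allocate by a factor of four.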

It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.


#3 FXACE   Members   -  Reputation: 182


Posted 29 March 2012 - 05:22 AM

Extra help. Example "Getting image data from texture":
GLint width, height, internalFormat;
glBindTexture(GL_TEXTURE_2D, your_texture_id);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_COMPONENTS, &internalFormat); // internal format of the texture
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &width);   // width of the texture
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_HEIGHT, &height); // height of the texture
// GL_TEXTURE_COMPONENTS and GL_TEXTURE_INTERNAL_FORMAT are the same;
// this example just handles RGB and RGBA.
GLint numBytes = 0;
switch (internalFormat) // determine what type the texture has...
{
case GL_RGB:
    numBytes = width * height * 3;
    break;
case GL_RGBA:
    numBytes = width * height * 4;
    break;
default: // unsupported type (add more cases here if you need other formats)
    break;
}

if (numBytes)
{
    unsigned char *pixels = (unsigned char *)malloc(numBytes); // allocate a buffer for the image data
    glGetTexImage(GL_TEXTURE_2D, 0, internalFormat, GL_UNSIGNED_BYTE, pixels);

    // TODO: work with pixels

    free(pixels); // free the buffer once you no longer need 'pixels', to avoid a memory leak
}


Best wishes, FXACE.
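The format-to-size switch in the snippet above can likewise be tested without a GL context. A sketch, with the two enum values hard-coded (0x1907 and 0x1908, per the GL spec) so it stays self-contained; real code would take them from <GL/gl.h>:

```c
#include <stddef.h>

#define GL_RGB  0x1907  /* values from the GL spec; normally from <GL/gl.h> */
#define GL_RGBA 0x1908

/* Bytes needed to read back width x height texels of the given format
   as GL_UNSIGNED_BYTE data; returns 0 for unsupported formats. */
static size_t bytes_for_format(int format, size_t width, size_t height)
{
    switch (format) {
    case GL_RGB:  return width * height * 3; /* three 1-byte components */
    case GL_RGBA: return width * height * 4; /* four 1-byte components */
    default:      return 0;                  /* unsupported format */
    }
}
```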

#4 mhagain   Crossbones+   -  Reputation: 8275


Posted 29 March 2012 - 09:45 AM

A final note here is that this isn't actually OpenGL-specific stuff; it's basic C/C++ - allocating sufficient memory, getting data types right, how to use arrays and pointers, etc. I'd advise focusing some effort in this area before diving into OpenGL itself.



#5 OmarShehata   Members   -  Reputation: 205


Posted 29 March 2012 - 10:46 PM

Thanks for the help everybody! I got it to work.

To mhagain,
I'm pretty sure I need to use the address of the Pixels array, because according to the documentation this function returns the pixel data *in* the Pixels array, so if I don't pass a reference it won't be able to update the values of the Pixels array.

And the problem was that I didn't know what the unsigned byte data type was called in OpenGL, not simply a general C++ problem.
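For reference, the GL headers define these names as fixed-size C typedefs. A sketch of the usual definitions; real code should include <GL/gl.h> rather than redefine them:

```c
/* The usual definitions from the GL headers; redefined here only to
   keep the snippet self-contained. */
typedef unsigned char GLubyte;   /* data passed as GL_UNSIGNED_BYTE */
typedef signed char   GLbyte;    /* data passed as GL_BYTE */
typedef float         GLfloat;   /* data passed as GL_FLOAT */
```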

To FXACE, thanks again! I really really appreciate the detailed explanation and all your help so far with all my topics :P

#6 tanzanite7   Members   -  Reputation: 1378


Posted 30 March 2012 - 04:15 AM

Identifier for an array IS a reference to the array data.

#7 OmarShehata   Members   -  Reputation: 205


Posted 30 March 2012 - 04:57 AM

Identifier for an array IS a reference to the array data.


Wait, so if Points is an array, what's the difference between Points and &Points?

#8 Brother Bob   Moderators   -  Reputation: 8571


Posted 30 March 2012 - 05:11 AM

The difference is the type of the expression; Points is a reference to an array (type is GLubyte [N], where N is the number of elements in the array), and &Points is a pointer to an array (type is GLubyte (*)[N]). They are different types, and have different properties when it comes to what other pointer types they decay to or can be implicitly cast into. But as long as they decay to or can be cast into another pointer type, they will likely have the same actual value which is a pointer to the first element of the array.
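This can be demonstrated directly in plain C; a small sketch using a 16-byte buffer named Pixels like the one in the thread (the helper names are mine):

```c
#include <stddef.h>

static unsigned char Pixels[16];

/* Pixels and &Pixels compare equal as raw addresses... */
static int same_address(void)
{
    return (void *)Pixels == (void *)&Pixels;
}

/* ...but their types differ, so pointer arithmetic differs:
   &Pixels + 1 steps over the whole 16-byte array... */
static ptrdiff_t array_pointer_step(void)
{
    return (char *)(&Pixels + 1) - (char *)&Pixels;
}

/* ...while Pixels + 1 (the decayed unsigned char *) steps one byte. */
static ptrdiff_t decayed_pointer_step(void)
{
    return (char *)(Pixels + 1) - (char *)Pixels;
}
```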

#9 mhagain   Crossbones+   -  Reputation: 8275


Posted 30 March 2012 - 02:18 PM

Look at another example - different API, same principle. Good old-fashioned memcpy.

If you want to memcpy to an array you use:
type myarray[size];
memcpy (myarray, stuff, length);

If you want to memcpy to a pointer you use:
type *mypointer; // make sure this points somewhere valid!!!
memcpy (mypointer, stuff, length);
Again, this is basic C/C++ stuff and nothing to do with OpenGL.
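A self-contained version of the two cases above, showing that both destinations receive the same bytes (the function name is mine):

```c
#include <stdlib.h>
#include <string.h>

/* Copies the same source into an array destination and a heap
   destination, and reports whether both copies match the source. */
static int copies_match(void)
{
    const unsigned char stuff[4] = { 1, 2, 3, 4 };

    unsigned char myarray[4];
    memcpy(myarray, stuff, sizeof stuff);            /* array name decays to a pointer */

    unsigned char *mypointer = malloc(sizeof stuff); /* must point at valid storage */
    if (mypointer == NULL)
        return 0;
    memcpy(mypointer, stuff, sizeof stuff);

    int ok = memcmp(myarray, stuff, sizeof stuff) == 0
          && memcmp(mypointer, stuff, sizeof stuff) == 0;
    free(mypointer);
    return ok;
}
```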



#10 Adam West   Members   -  Reputation: 219


Posted 04 April 2012 - 01:42 AM

IMO, use a third-party texture loader like DevIL or SOIL.



