
How to squeeze rectangular bitmap into GL_TEXTURE_2D?



#1 Synthetix   Members   -  Reputation: 190


Posted 12 November 2011 - 08:52 PM

I have a program that works with images using GL_TEXTURE_RECTANGLE_ARB and a fragment shader that uses sampler2DRect and texture2DRect to sample the texture. However, I now need to implement a shader that accesses the images using normalized coordinates (0.0-1.0), and sampler2DRect/texture2DRect only work with integers.

I know a big difference between GL_TEXTURE_RECTANGLE_ARB and GL_TEXTURE_2D is the way texture dimensions work. Of course, my images are all over the place size-wise, so I decided I would just take the longest dimension of an image and pass a square texture to glTexImage2D. I'd then work out the cropping, etc. in the shader:

/*
"orig_data" is value returned from image reader, etc. It is the
actual pixel data as a grayscale image (only one color plane)
"orig_length" is the length of the actual data in bytes
*/

int width = 1024;
int height = 768;

glEnable(GL_TEXTURE_2D);

/* make a bigger-than-normal buffer to hold the image plus the padding needed to
   make the texture square; width (1024) is the longest dimension, so the square
   buffer must be width*width bytes */
GLubyte *new_data = (GLubyte*)malloc(width*width);

/* copy existing data into bigger buffer */
memcpy(new_data,orig_data,orig_length);

glTexImage2D(GL_TEXTURE_2D,0,1,width,width,0,GL_LUMINANCE,GL_UNSIGNED_BYTE,(GLubyte*)new_data);

However, this doesn't work even when I use sampler2D and texture2D in my shader. What I usually get is either garbled data or black.

Any idea why this isn't working as expected? Is it something to do with my padding of the image data?


#2 dpadam450   Members   -  Reputation: 842


Posted 12 November 2011 - 09:58 PM

memcpy(new_data,orig_data,orig_length);

This is not going to put a square texture into a rectangle, if that is what you are trying to do. Your new_data is a rectangle and your old data is... what? (I'll assume a square.) A raw copy like this is basically going to create garbage. On nvidia cards (I've never used AMD, but those probably work as well; maybe not Intel, though), you can glTexImage2D a rectangle and not have to do anything different between a rectangular and a square texture. The texture coords are still the same as well (0 to 1). That would help with what you are doing. I see that your destination is a rectangle and your source is square? Why put a square texture into a rectangle?

and pass a square texture to glTexImage2D

It sounds like you want to take a rectangle and pack it into a square texture of the next biggest size. Right? Because that is not what you are doing here: you passed a texture of (1024x768) to glTexImage2D. You could also always open up mspaint/photoshop etc. and just stretch a rectangular texture to the next highest square size; then you don't have to program any of this.
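For what it's worth, here is a minimal sketch of that no-padding approach, assuming the driver accepts non-power-of-two sizes for GL_TEXTURE_2D. The names follow the original snippet; the alignment and filter settings are additions here (without mipmaps, the default minification filter leaves the texture incomplete, which typically samples as black):

/* Hypothetical sketch: upload the rectangular 1024x768 luminance image
   directly, with no padding, assuming NPOT support (GL 2.0+). */
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);

/* byte-sized luminance rows may not be 4-byte aligned for arbitrary widths */
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

/* without mipmaps, the default GL_NEAREST_MIPMAP_LINEAR filter would make
   the texture incomplete, so use a non-mipmapped filter */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0,
             GL_LUMINANCE, GL_UNSIGNED_BYTE, orig_data);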

#3 Synthetix   Members   -  Reputation: 190


Posted 13 November 2011 - 12:09 AM

On nvidia cards (I've never used AMD, but those probably work as well; maybe not Intel, though), you can glTexImage2D a rectangle and not have to do anything different between a rectangular and a square texture. The texture coords are still the same as well (0 to 1). That would help with what you are doing.

Well, the documentation says that the texture size for GL_TEXTURE_2D must be a power of 2. That's why I passed "width" as both the width and height. Is it possible to pass rectangular dimensions when the texture target is GL_TEXTURE_2D on some systems? I'd been using GL_TEXTURE_RECTANGLE_ARB because I thought I had to for non-power-of-2 textures. But then I have the problem that the texture coords aren't normalized.

I see that your destination is a rectangle and your source is square? Why put a square texture into a rectangle?

Other way around. I have an image (1024x768) that I need to put into a square texture. I thought padding the pixel data to make it square would work, but as you pointed out it didn't work.

#4 dpadam450   Members   -  Reputation: 842


Posted 13 November 2011 - 12:24 AM

Other way around. I have an image (1024x768) that I need to put into a square texture.

In your example, though, you made a texture of width x height = 1024x768 with glTexImage2D(), so you made your texture not square. Anyway, on certain cards you don't need to do anything special: just make a non-power-of-two texture and load it in. I've done it before; don't do all that memcpy() stuff, and try it first.

I thought padding the pixel data to make it square would work, but as you pointed out it didn't work.

It didn't work because you didn't actually pad it; you just raw memcpy()'d it, which means you continued writing the first row of the square texture with the second row's data.
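For illustration, a minimal sketch of the row-by-row copy that padding actually requires. The names follow the original snippet; `side` is a helper variable introduced here, and the zero fill is an assumption so the padding reads as black:

/* Hypothetical sketch: copy a width x height image into the top-left corner
   of a square side x side buffer, one row at a time, so every source row
   starts at the beginning of its matching destination row. */
int side = (width > height) ? width : height;   /* longest dimension */
GLubyte *square = (GLubyte*)malloc(side*side);
memset(square, 0, side*side);                   /* padding reads as black */

for (int row = 0; row < height; row++)
    memcpy(square + row*side,       /* destination rows are side bytes apart */
           orig_data + row*width,   /* source rows are width bytes apart     */
           width);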

Anyway, all you need to do is fix your texture as an artist would: resize it in photoshop, which would stretch your rectangle to a square.
or
Make a square image and copy/paste your rectangle into it (obviously it won't fit exactly and will leave padding). You will then need to recompute your UV coords to display it without the blank padding.
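For illustration, a minimal sketch of that UV adjustment, assuming the image was pasted into the top-left corner of a square `side` x `side` texture (the variable names are hypothetical):

/* Only part of the square texture holds real pixels; scale the texture
   coordinates so the quad samples just that region. */
float u_max = (float)width  / (float)side;   /* e.g. 1024/1024 = 1.0  */
float v_max = (float)height / (float)side;   /* e.g.  768/1024 = 0.75 */

glBegin(GL_QUADS);
    glTexCoord2f(0.0f,  0.0f);  glVertex2f(-1.0f,  1.0f);
    glTexCoord2f(u_max, 0.0f);  glVertex2f( 1.0f,  1.0f);
    glTexCoord2f(u_max, v_max); glVertex2f( 1.0f, -1.0f);
    glTexCoord2f(0.0f,  v_max); glVertex2f(-1.0f, -1.0f);
glEnd();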

#5 Synthetix   Members   -  Reputation: 190


Posted 13 November 2011 - 01:06 AM

In your example though you made a texture of widthxheight = 1024x768 to glTexImage2D(). So you made your texture not square.

Have another look. I actually passed "width" twice (not w x h), so I could keep all the resolution of the input bitmap and satisfy the power of 2 requirement.

Anyway, on certain cards you don't need to do anything special: just make a non-power-of-two texture and load it in. I've done it before; don't do all that memcpy() stuff, and try it first.

Ok, I'll give it a shot!

Anyway, all you need to do is fix your texture as an artist would: resize it in photoshop, which would stretch your rectangle to a square.
or
Make a square image and copy/paste your rectangle into it (obviously it won't fit exactly and will leave padding). You will then need to recompute your UV coords to display it without the blank padding.

These are actually dynamically loaded images, so I won't know the dimensions beforehand. I guess this will be OK if the GL implementation accepts non-power-of-2 dimensions.

#6 mhagain   Crossbones+   -  Reputation: 7422


Posted 13 November 2011 - 08:55 AM

Well, the documentation says that the texture size for GL_TEXTURE_2D must be a power of 2. That's why I passed "width" as both the width and height. Is it possible to pass rectangular dimensions when the texture target is GL_TEXTURE_2D on some systems? I'd been using GL_TEXTURE_RECTANGLE_ARB because I thought I had to for non-power-of-2 textures. But then I have the problem that the texture coords aren't normalized.


You've a misunderstanding there. The power of two restriction applies to each dimension separately, and textures are not required to be square. You can have a width of 1024 and a height of 64 if you wish, and the texture will work (with GL_TEXTURE_2D) on any hardware (provided the hardware supports a texture size of 1024 or higher, of course).
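For example, a non-square power-of-two allocation like this is valid everywhere (a sketch; passing NULL for the pixels just reserves storage without uploading anything):

/* 1024x64: each dimension is a power of two, but the texture is not square;
   this is legal GL_TEXTURE_2D usage on any hardware whose maximum texture
   size is 1024 or higher */
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, 1024, 64, 0,
             GL_LUMINANCE, GL_UNSIGNED_BYTE, NULL);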

Only older hardware has the power-of-two restriction, so if you're happy to drop support for it you can quite happily use any size you want. In general, graphics hardware from the last 7 years or so no longer has this restriction, though it persists in some slightly more recent Intel chips (so if you need to run on Intel graphics, it's more like the last 4 or 5 years). This is where you need to research your target market and decide on the minimum hardware spec you'll require. You risk putting a lot of work into supporting old hardware that none of your end users actually has, and while supporting that kind of hardware can seem like the "right thing" to do, in the end you need to decide whether the time and effort would be better invested elsewhere. Only you can make that decision, but make sure it's supported by actual facts.



#7 Synthetix   Members   -  Reputation: 190


Posted 13 November 2011 - 08:38 PM

Well, oh boy. The reason it wasn't working was that I was still using glTexCoord2i instead of glTexCoord2f! Oops!

It seems to be working fine now with GL_TEXTURE_2D. Thanks for your help!
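For anyone hitting the same thing, a quick sketch of the difference (the coordinate values are illustrative):

/* GL_TEXTURE_RECTANGLE_ARB coordinates are texel counts, so integer calls
   made sense there: */
glTexCoord2i(1024, 768);      /* corner of a 1024x768 rectangle texture */

/* GL_TEXTURE_2D coordinates are normalized to 0.0-1.0; glTexCoord2i can
   only express 0 or 1 here, so the float version is needed: */
glTexCoord2f(1.0f, 1.0f);     /* same corner, normalized */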

You've a misunderstanding there. The power of two restriction applies to each dimension separately, and textures are not required to be square.


Ah. That helps. Yes, I didn't get that. Thanks again.

#8 V-man   Members   -  Reputation: 797


Posted 14 November 2011 - 01:28 PM

There are a bunch of things wrong here. You don't need integer coordinates to access GL_TEXTURE_RECTANGLE_ARB; you need non-normalized coordinates (texel counts, which can be floats).

As for GL_TEXTURE_2D and the requirement of being a power of 2, that was for GL 1.1.
With GL 2.0, that restriction was removed. Make sure your video card supports at least GL 2.0.
Since you are using GLSL, I'm assuming that is so.
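For illustration, a minimal sketch of that check (the string parsing is deliberately crude, and the function name is hypothetical; an extension loader would normally do this for you):

#include <string.h>
#include <GL/gl.h>

/* Hypothetical sketch: verify NPOT support before relying on it. Requires a
   current GL context. GL 2.0+ has NPOT textures in core; on older contexts
   the ARB_texture_non_power_of_two extension gives the same guarantee. */
int npot_supported(void)
{
    const char *version = (const char*)glGetString(GL_VERSION);
    const char *exts    = (const char*)glGetString(GL_EXTENSIONS);

    /* crude major-version check, adequate for 1.x-vs-2.x-era strings */
    if (version && version[0] >= '2')
        return 1;
    /* crude substring search of the extension list */
    if (exts && strstr(exts, "GL_ARB_texture_non_power_of_two"))
        return 1;
    return 0;
}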



