Maximum size of texture


I use this piece of code to find the maximum texture size:

int max;
glGetIntegerv(GL_MAX_TEXTURE_SIZE, &max);
cout << max << endl;

But the program prints a different answer each time I run it. Why does this happen? Also, what does GL_MAX_TEXTURE_SIZE mean exactly? Can a texture be max X max, or only sqrt(max) X sqrt(max)?

Most OpenGL routines return valid values only if they are invoked while a valid context is current. So make sure you have a valid context set up; with luck the seemingly random results will go away (otherwise I don't know).

The result will normally be a power of two and gives you the maximum edge length of a texture. However, there is a pitfall with this query, as described in the Red Book (my copy is rather old, so the situation may have changed by now):

Quote:
Red Book
glGetIntegerv(GL_MAX_TEXTURE_SIZE,...) tells you the largest dimension (width or height, without borders) of a texture image, typically the size of the largest square texture supported. However, GL_MAX_TEXTURE_SIZE does not consider the effect of the internal format of a texture. A texture image that stores texels using the GL_RGBA16 internal format may be using 64 bits per texel, so its image may have to be 16 times smaller than an image with the GL_LUMINANCE4 internal format. (Also, images requiring borders or mipmaps may further reduce the amount of available memory.)
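
To illustrate the Red Book's point in code, here is a minimal sketch of a proxy-texture check (the function name is just for illustration, and it must be called with a valid context current). A proxy texture lets you ask whether a particular size plus internal format combination would actually be accepted, which GL_MAX_TEXTURE_SIZE alone cannot tell you:

#include <GL/gl.h>

// Sketch: returns true if a width x height texture with the given internal
// format would be accepted. Requires a current OpenGL context.
bool textureSizeSupported(GLsizei width, GLsizei height, GLint internalFormat)
{
    GLint proxyWidth = 0;

    // The proxy target runs the full consistency check without allocating memory.
    glTexImage2D(GL_PROXY_TEXTURE_2D, 0, internalFormat, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    // If the combination is unsupported, the reported width comes back as 0.
    glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &proxyWidth);

    return proxyWidth != 0;
}

So even if GL_MAX_TEXTURE_SIZE reports 4096, a call like textureSizeSupported(4096, 4096, GL_RGBA16) may still fail on hardware with little texture memory.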

There is a program used to query the graphics card for available extensions and other constants such as maximum lights, texture size, matrix stack size, etc.: http://www.realtech-vr.com/glview/

The source code is freely available. I'm guessing there could be a coding error in your program.

Really interesting. What kind of results are you getting?

I have a few suggestions; I don't know if they will help:

1. Try putting glEnable( GL_TEXTURE_2D ); before calling this command.

2. Check for OpenGL errors with glGetError() after calling glGetIntegerv( GL_MAX_TEXTURE_SIZE, &maxe ); (see the sketch below).
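
In case it helps, a minimal sketch of suggestion 2, assuming the same includes as your program (the variable name maxe is just a placeholder):

GLint maxe = 0;
glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxe);

// glGetError() returns GL_NO_ERROR (0) on success; anything else means the
// previous call went wrong.
GLenum err = glGetError();
if (err != GL_NO_ERROR)
    cout << "glGetIntegerv failed, error code " << err << endl;
else
    cout << "Maximum texture size: " << maxe << endl;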

Guest Anonymous Poster
It also depends on what video card you're running... ATI's maximum supported texture size is 2048x2048 (I don't know if they bumped this up with newer cards), while NVIDIA supports up to 4096x4096. So that may help you narrow the search down...

I have tried http://www.realtech-vr.com/glview/ to check my texture size limit, and the result is 4096x4096. However, when I run my program with a texture of this size, it does not run properly. What could be the cause of this?

Quote:
Original post by rosicky2005
I have tried http://www.realtech-vr.com/glview/ to check my texture size limit, and the result is 4096x4096. However, when I run my program with a texture of this size, it does not run properly. What could be the cause of this?

Perhaps the same thing that prevents you from getting the GL_MAX_TEXTURE_SIZE value in your own app. I suggest you not ignore the problem: try to find out what is happening, or you may have a hidden bug in your code. If you want us to help, please post the related code snippets and describe in more detail what happens.

Here is some of my code:

#include <iostream>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <math.h>
#include <time.h>
#include <GL/glew.h>
#include <GL/gl.h>
#include <GL/glu.h>
#include <GL/glut.h>
using namespace std;

struct struct_textureParameters {
    const char* name;
    GLenum texTarget;
    GLenum texInternalFormat;
    GLenum texFormat;
    const char* shader_source;
} rect_nv_rgba_32,  // texture rectangles, NV_float_buffer, RGBA, 32 bits
  rect_nv_rgba_16,  // texture rectangles, NV_float_buffer, RGBA, 16 bits
  rect_nv_r_32,     // texture rectangles, NV_float_buffer, R, 32 bits
  textureParameters;

void createAllTextureParameters(void);

int main(int argc, char **argv) {

    // create variables for GL
    createAllTextureParameters();
    textureParameters = rect_nv_rgba_32;

    // check the maximum size for the dimensions of a texture
    glEnable(GL_TEXTURE_RECTANGLE_ARB);

    int maxtexsize;
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxtexsize);
    cout << "Maximum texture size is " << maxtexsize << endl;
    .....................
}

void createAllTextureParameters(void) {
    rect_nv_rgba_32.name = "TEXRECT - float_NV - RGBA - 32";
    rect_nv_rgba_32.texTarget = GL_TEXTURE_RECTANGLE_ARB;
    rect_nv_rgba_32.texInternalFormat = GL_FLOAT_RGBA32_NV;
    rect_nv_rgba_32.texFormat = GL_RGBA;
    rect_nv_rgba_32.shader_source = "uniform sampler2DRect textureY;" ................................
}
I really don't know why this problem with printing the maximum texture size occurs.

Quote:
Original post by DMINATOR
Really interesting. What kind of results are you getting?

I have a few suggestions; I don't know if they will help:

1. Try putting glEnable( GL_TEXTURE_2D ); before calling this command.

2. Check for OpenGL errors with glGetError() after calling glGetIntegerv( GL_MAX_TEXTURE_SIZE, &maxe );


So, I used glGetError() and the string "OpenGL Error:illegal operation" is printed.

Quote:
Original post by Anonymous Poster
It also depends on what video card you're running... ATI's maximum supported texture size is 2048x2048 (I don't know if they bumped this up with newer cards), while NVIDIA supports up to 4096x4096. So that may help you narrow the search down...


Wow, that might explain why I had varying results on two computers when I was working on my painting program. On the ATI card I was getting a weird smear effect (very much like a Gaussian blur) when I did multiple render-to-texture passes on 2048x2048 or larger; on my NVIDIA card I did not. It was a neat effect, but I had no idea what precisely caused the behaviour. It would smear a little and then stop smearing, like drawing on wet paper where the edges only bleed so much. It also bled more the further from (0, 0) I got. Very strange.

Anyway, nothing to add, just speculating on past blunders. I ended up subdividing my surface into several smaller textures that together represent one large one, which let me have arbitrarily large surfaces (a rough sketch of the idea is below). This solution may be harder to implement in some cases, but in mine it wasn't.
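
For anyone curious, here is a rough sketch of that tiling approach. The 1024 tile size and the function name are just assumptions for illustration, not taken from my actual program, and the edge tiles are assumed to be allowed non-power-of-two sizes (older hardware would need padding there):

#include <algorithm>
#include <vector>
#include <GL/gl.h>

// Sketch: upload one large, tightly packed RGBA8 image as a grid of smaller
// textures. Drawing then means rendering one quad per tile with the right offsets.
std::vector<GLuint> uploadTiled(const unsigned char* pixels,
                                int imageWidth, int imageHeight,
                                int tileSize = 1024)
{
    std::vector<GLuint> tiles;

    for (int y = 0; y < imageHeight; y += tileSize) {
        for (int x = 0; x < imageWidth; x += tileSize) {
            int w = std::min(tileSize, imageWidth - x);
            int h = std::min(tileSize, imageHeight - y);

            GLuint tex = 0;
            glGenTextures(1, &tex);
            glBindTexture(GL_TEXTURE_2D, tex);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

            // Tell GL how long a source row really is, then point it at the
            // top-left texel of this tile inside the big image.
            glPixelStorei(GL_UNPACK_ROW_LENGTH, imageWidth);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0, GL_RGBA,
                         GL_UNSIGNED_BYTE, pixels + (y * imageWidth + x) * 4);
            glPixelStorei(GL_UNPACK_ROW_LENGTH, 0);

            tiles.push_back(tex);
        }
    }
    return tiles;
}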

Quote:
Original post by Brother Bob
You must have a valid and active rendering context before any OpenGL function will return anything even remotely useful. I don't see where you create it in your code.


But what is a "valid and active rendering context"?

To summarize without giving out too much free bread, you have to initialize OpenGL. Otherwise it has no idea what card you're trying to query specifications on...
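
To make that concrete, here is a minimal sketch of creating a context with GLUT before doing the query (GLUT and GLEW are what your code already includes; the window title is arbitrary, and the context is current as soon as glutCreateWindow returns):

#include <iostream>
#include <GL/glew.h>
#include <GL/glut.h>
using namespace std;

int main(int argc, char **argv)
{
    // Creating a window is what gives you a current rendering context.
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE);
    glutCreateWindow("context for queries");

    // GLEW also needs a current context before it can resolve entry points.
    glewInit();

    // Only now do glGet* queries return meaningful values.
    GLint maxtexsize = 0;
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxtexsize);
    cout << "Maximum texture size is " << maxtexsize << endl;

    return 0;
}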

Quote:
Original post by rosicky2005
Quote:
Original post by Brother Bob
You must have a valid and active rendering context before any OpenGL function will return anything even remotely useful. I don't see where you create it in your code.


But what is a "valid and active rendering context"?

You should always ask Google first.

