Maximum size of texture

12 comments, last by RAZORUNREAL 17 years, 7 months ago
I use this piece of code to find the maximum texture size:

int max;
glGetIntegerv(GL_MAX_TEXTURE_SIZE, &max);
cout << max << endl;

But the program generates a different answer each time I run it. Why does this happen? Also, what does GL_MAX_TEXTURE_SIZE mean? Can a texture be max x max in size, or sqrt(max) x sqrt(max)?
Most OpenGL routines return valid values only if they are invoked while a valid context is current. So make sure you have a valid context set up, and then the seemingly random results should go away (otherwise I don't know).

The result will normally be a power of 2, and gives you the maximum edge length of a texture. However, there is a pitfall with this query, as described in the Red Book (my copy is somewhat old, and perhaps the situation is different nowadays):

Quote:Red Book
glGetIntegerv(GL_MAX_TEXTURE_SIZE,...) tells you the largest dimension (width or height, without borders) of a texture image, typically the size of the largest square texture supported. However, GL_MAX_TEXTURE_SIZE does not consider the effect of the internal format of a texture. A texture image that stores texels using the GL_RGBA16 internal format may be using 64 bits per texel, so its image may have to be 16 times smaller than an image with the GL_LUMINANCE4 internal format. (Also, images requiring borders or mipmaps may further reduce the amount of available memory.)
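One way to take the internal format into account is the proxy texture mechanism: you ask the driver whether a particular size/format combination would actually be accepted, without allocating anything. A rough, untested sketch (GL_RGBA16 and 4096 are just example values, and a current context is assumed):

GLint width = 0;
// A proxy texture performs the full consistency check but allocates no storage.
glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_RGBA16,
             4096, 4096, 0, GL_RGBA, GL_UNSIGNED_SHORT, NULL);
// If the combination is unsupported, the queried width comes back as 0.
glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &width);
if (width == 0) {
    // 4096x4096 with GL_RGBA16 is not supported; try a smaller size or a cheaper format
}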
There is a program you can use to query the graphics card for available extensions and other constants such as maximum lights, texture size, matrix stack size, etc.: http://www.realtech-vr.com/glview/

The source code is freely available. I'm guessing there could be a bug in your program.
Education is the progressive discovery of our own ignorance.
Really interesting, what kind of results are you getting?

I have a few suggestions; I don't know if they will help:

1. Try putting glEnable( GL_TEXTURE_2D ); before calling this command.

2. Check for OpenGL errors after calling glGetIntegerv( GL_MAX_TEXTURE_SIZE, &maxe ); with glGetError().
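For example, something along these lines (rough sketch, untested; assumes iostream and GL/glu.h as in your setup):

GLint maxe = 0;
glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxe);

// Drain the OpenGL error queue; GL_NO_ERROR means nothing is pending.
GLenum err;
while ((err = glGetError()) != GL_NO_ERROR)
    cout << "OpenGL error: " << gluErrorString(err) << endl;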
It also depends on what video card you're running... ATI's maximum supported texture size is 2048x2048 (I don't know if they bumped this up with newer cards) while NVIDIA supports up to 4096x4096. So that may help you narrow the search down...
I have tried http://www.realtech-vr.com/glview/ to check my texture size limit and the result is 4096x4096. However, when I run my program with a texture of this size, it does not run properly. What could be the cause of this?
Quote:Original post by rosicky2005
I have tried http://www.realtech-vr.com/glview/ to check my texture size limit and the result is 4096x4096. However, when I run my program with a texture of this size, it does not run properly. What could be the cause of this?

Perhaps the same thing that prevents you from getting the GL_MAX_TEXTURE_SIZE value in your own app. I suggest you not ignore the problem. Try to find out what is happening, or you may end up with a hidden bug in your code. If you want us to help, please post the related code snippets and describe in more detail what happens.
Here is some of my code:

#include <iostream>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <math.h>
#include <time.h>
#include <GL/glew.h>
#include <GL/gl.h>
#include <GL/glu.h>
#include <GL/glut.h>
using namespace std;

struct struct_textureParameters {
    char*  name;
    GLenum texTarget;
    GLenum texInternalFormat;
    GLenum texFormat;
    char*  shader_source;
} rect_nv_rgba_32,  // texture rectangles, NV_float_buffer, RGBA, 32 bits
  rect_nv_rgba_16,  // texture rectangles, NV_float_buffer, RGBA, 16 bits
  rect_nv_r_32;     // texture rectangles, NV_float_buffer, R, 32 bits

struct_textureParameters textureParameters;

void createAllTextureParameters(void);

int main(int argc, char **argv) {

    // create variables for GL
    createAllTextureParameters();
    textureParameters = rect_nv_rgba_32;

    // check the maximum size for the dimension of a texture
    glEnable(GL_TEXTURE_RECTANGLE_ARB);

    int maxtexsize;
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxtexsize);
    cout<<"Maximum texture size is "<<maxtexsize<<endl;
    .....................
}

void createAllTextureParameters(void) {

    rect_nv_rgba_32.name              = "TEXRECT - float_NV - RGBA - 32";
    rect_nv_rgba_32.texTarget         = GL_TEXTURE_RECTANGLE_ARB;
    rect_nv_rgba_32.texInternalFormat = GL_FLOAT_RGBA32_NV;
    rect_nv_rgba_32.texFormat         = GL_RGBA;
    rect_nv_rgba_32.shader_source     = "uniform sampler2DRect textureY;" ................................
}
I really don't know why this problem with printing the maximum texture size occurs.
Quote:Original post by DMINATOR
Really interesting, what kind of results are you getting?

I have a few suggestions; I don't know if they will help:

1. Try putting glEnable( GL_TEXTURE_2D ); before calling this command.

2. Check for OpenGL errors after calling glGetIntegerv( GL_MAX_TEXTURE_SIZE, &maxe ); with glGetError().


So I used glGetError(), and the string "OpenGL Error: illegal operation" was printed.
You must have a valid and active rendering context before any OpenGL function will return anything even remotely useful. I don't see where you create one in your code.
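For instance, with the GLUT and GLEW headers you are already including, a minimal sketch (untested; the window title and display mode are just placeholders) would be:

#include <iostream>
#include <GL/glew.h>
#include <GL/glut.h>
using namespace std;

int main(int argc, char **argv)
{
    // glutCreateWindow creates the window and makes an OpenGL context current;
    // only after this point do gl* queries return meaningful values.
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE);
    glutCreateWindow("max texture size query");
    glewInit();

    GLint maxtexsize = 0;
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxtexsize);
    cout << "Maximum texture size is " << maxtexsize << endl;
    return 0;
}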

This topic is closed to new replies.
