glTexture3D requirement

hi,

I'm trying to build a 3D texture. When I browsed around for info, I found three different references that each say something different:

http://www.opengl.org/sdk/docs/man/xhtml/glTexImage3D.xml

http://www.opengl.org/sdk/docs/man3/xhtml/glTexImage3D.xml

http://www.dei.isep.ipp.pt/~matos/cg/docs/manual/glTexImage3D.3G.html

The difference is in the requirements: the first one gives a minimum of 16 texels for width, height, and depth, the second requires 256 for height and depth, and the third 64. I'm confused about which one is correct. I did try to send 250*250*256 data to the GPU, but it always returns an error, even though my data is definitely correct. Which one is right? Does my error have something to do with this?


thanks in advance
Depends on the hardware. As hardware progressed, the minimum resolution required to be supported by drivers increased. On modern PC-based hardware, you can assume that at least 256 is supported (most current GPUs do a lot more).

On embedded devices, the limits may be a little lower. You can query the maximum supported resolution by passing GL_MAX_3D_TEXTURE_SIZE to glGetIntegerv.

You did not specify what hardware you tested this on. In your example you supplied a non-power-of-two size, which may not be supported by older implementations. Also make sure you have enough contiguous VRAM available for larger 3D textures.
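For example, a minimal query sketch (purely illustrative, assuming a current GL context and headers that expose the GL 1.2+ enums; the function name is made up):

#include <GL/gl.h>
#include <cstdio>

// Ask the driver for the largest 3D texture dimension it supports.
// The spec only guarantees a minimum; most desktop GPUs report much more.
void printMax3DTextureSize()
{
    GLint maxSize = 0;
    glGetIntegerv(GL_MAX_3D_TEXTURE_SIZE, &maxSize);
    std::printf("GL_MAX_3D_TEXTURE_SIZE = %d\n", maxSize);
}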
hi,

I'm using a GeForce 9600 GT with 512 MB; I think that's pretty modern, right? Well, the problem is: what is that minimum for? If, for example, it says a minimum of 256, can I put 250x250 pixels in it? I get an error when I try to upload 250*250*256: it returns an error (no explanation of what it is) when calling glTexImage3D(), and I've already checked that the data and the size are correct. But when I change it to 300*300*256 it works fine (so far).

Any idea why? Does it mean I can't use a texture smaller than 256?

thanks in advance
Quote:Original post by svnstrk
Well, the problem is: what is that minimum for? If, for example, it says a minimum of 256, can I put 250x250 pixels in it?
The minimum size means that all implementations must support a texture size at least that large. Smaller textures should be fine.
Quote:I get an error when I try to upload 250*250*256: it returns an error (no explanation of what it is)
Did you check the value of glGetError()?

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

Well, I think I've already dealt with that; I don't need anything smaller than 250 anyway.

But I still have a problem. I'm trying to upload a 300*300*212 integer texture:

glTexImage3D(GL_TEXTURE_3D, 0, GL_R32I, textureWidth, textureHeight, textureDepth, 0, GL_RED_INTEGER, GL_INT, indexFit);

The result is a mess, not what I expected, and I already checked that the data is solid. But when I use a smaller depth, 300*300*65, it works. Is there some kind of limitation on the texture depth that GLSL can index? Because when I upload a different file, 300*300*225, but use GL_RGB8, GL_RGB, and GL_UNSIGNED_BYTE instead of the integer formats, it works. Any idea why?


Well, I did add glGetError(), but all I get is error 1281, with no explanation of what it is, and it always appears even when my program is working.
Quote:Original post by svnstrk
Well, I did add glGetError(), but all I get is error 1281
You need to convert this number to hex (1281 = 0x0501) and look it up among the error codes in the gl.h header file.

The easier way is to feed this into gluErrorString(), which will return a string containing the error name.

In this particular case, the error is INVALID_VALUE, which means that you passed some function a bad argument.
Quote:and it always appears even when my program is working.
This means that you have an error somewhere, though presumably not a fatal one. Investigate with liberal doses of glGetError - the error may occur much earlier in the program, and be obscuring the real error.
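For example, a small helper along these lines (purely illustrative, not from the thread) will drain and name every pending error:

#include <GL/glu.h>   // gluErrorString
#include <cstdio>

// glGetError() returns one error flag at a time, so loop until GL_NO_ERROR.
void checkGLErrors(const char* where)
{
    for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError())
    {
        std::printf("GL error at %s: 0x%04X (%s)\n", where, err,
                    reinterpret_cast<const char*>(gluErrorString(err)));
    }
}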

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

hi,

Yeah, I did notice the invalid value just now, but I'm not quite sure what went wrong. Here is my code snippet:

int* spFit = new int[textureWidth*textureHeight];
for (int i = 0; i < textureWidth*textureHeight; i++) {
    spFit[i] = startingPoint2D[i];
    std::cout << spFit[i] << std::endl;
}

glGenTextures(1, &texIndexStartPoint);
glBindTexture(GL_TEXTURE_2D, texIndexStartPoint);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_R32I, textureWidth, textureHeight, 0, GL_RED_INTEGER, GL_INT, spFit);


I've checked and rechecked that spFit is a (textureWidth*textureHeight)-sized array of integers. The values range from 0 to 5,763,887, which is much less than the integer max value. Do you notice any error in the snippet? Did I miss anything?


thanks in advance
You should try to be more precise with your posts.

Quote:Original post by svnstrk
But I still have a problem. I'm trying to upload a 300*300*212 integer texture:

glTexImage3D(GL_TEXTURE_3D, 0, GL_R32I, textureWidth, textureHeight, textureDepth, 0, GL_RED_INTEGER, GL_INT, indexFit);


I tried this on my GeForce 9800 GT using width = height = 300 and depth = 212. No GL errors generated.

Quote:
The result is a mess, not what I expected,


What is not expected? Do you mean that the glTexImage3D call was successful and that when the texture was rendered on a geometric object, the rendered result did not look like what you thought it should? Can you post screen captures?

Quote:
and I already checked that the data is solid. But when I use a smaller depth, 300*300*65, it works.


What do you mean by "data is solid"? Do you mean that indexFit is an array of 300*300*65 elements, each of type 'int'?

Quote:
Because when I upload a different file, 300*300*225, but use GL_RGB8, GL_RGB, and GL_UNSIGNED_BYTE instead of the integer formats, it works. Any idea why?


If indexFit is an array of integers, changing the internal format to GL_RGB8, the format to GL_RGB, and the type to GL_UNSIGNED_BYTE causes OpenGL to interpret indexFit as an array of RGB values, each channel 8-bits. That is, each texel of indexFit is a 24-bit value. When you say "it works", what do you mean? Your statement leads me to believe that indexFit really is an array of RGB values, so switching to GL_R32I, GL_RED_INTEGER, and GL_INT will cause the texture to look completely wrong.
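To make the reinterpretation concrete (illustrative arithmetic, not from the original post), the two uploads read different amounts of data from the same pointer, and decode each texel differently:

// GL_R32I / GL_INT reads 4 bytes per texel; GL_RGB8 / GL_UNSIGNED_BYTE reads 3.
const size_t texels      = 300u * 300u * 212u;
const size_t bytesAsR32I = texels * sizeof(GLint); // 4 bytes per texel
const size_t bytesAsRGB8 = texels * 3u;            // 3 bytes per texel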

Quote:
well i did put glGetError() but all i got is error 1281, no explanation what is it and it always appear even if my program is working.


I suspect that the GL_INVALID_VALUE error occurred in some OpenGL call made previously, before you called glTexImage3D. What I do in my OpenGL wrapper is to enable the glGetError() call using conditional compilation (and making sure glGetError is not called in several situations when its results are undefined). The first time an OpenGL error is encountered, my code asserts so that I know exactly the OpenGL call that caused the problem.
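A sketch of that idea (not his actual wrapper; the macro name is made up, and the glBegin/glEnd guard he mentions is omitted for brevity): in debug builds every wrapped call is immediately followed by a glGetError check that asserts at the first failure.

#include <GL/gl.h>
#include <cassert>

#ifdef CHECK_GL_ERRORS
    // Debug build: assert at the exact call site that raised the error.
    #define GL_CALL(expr) \
        do { expr; assert(glGetError() == GL_NO_ERROR); } while (0)
#else
    // Release build: no per-call overhead.
    #define GL_CALL(expr) expr
#endif

// Usage:
// GL_CALL(glTexImage3D(GL_TEXTURE_3D, 0, GL_R32I, w, h, d, 0,
//                      GL_RED_INTEGER, GL_INT, data));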

By the way, your last post ("yeah I did notice...") now shows a code example for a 2D texture, not a 3D texture. Did you mean to post a block of code for a 3D texture?
I think that only power-of-two sizes are supported for 3D textures...
Try 256*256*256; it should work provided your graphics card has enough memory.

hope it helps,

