[C++] Simple Texture Mapping won't work, disgusting. [SOLVED]

Started by directNoob. 8 comments, last by directNoob 16 years, 10 months ago.
Hi all. Currently I want to texture some of my geometry. To start very simply, I thought I'd texture a simple quad, but still using texture objects. OK, it is very simple, but I can't find the mistake. Here is how I create the texture: just some stupid colors within an array.

...

BYTE test2[64*64*3] ;

	for( int i=0; i<64;i++ ){
		for( int j=0; j<64; j+=3 ){
			test2[i*64+j]	= 40 ;
			test2[i*64+j+1] = i%255;
			test2[i*64+j+2] = i%255;
		}
	}
	
	// First, bind the texture object.
	//BindBuffer() ;
	glBindTexture(GL_TEXTURE_2D,1);
	//bool b = glIsTexture(1) ;
	// Copy pixel data up to the card.
	glPixelStorei( GL_UNPACK_ALIGNMENT, 1 ) ;

	glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT ) ;
	glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT ) ;
	glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST ) ;

	glTexImage2D( GL_TEXTURE_2D, 0, GL_RGB, 64,64,//bmi.biWidth, bmi.biHeight, 
		0, GL_RGB, GL_UNSIGNED_BYTE, test2 ) ;
		
	GLenum err = 0 ;
	err = glGetError() ;

...


err is always zero. And here is how I render the texture onto the quad.

glEnable( GL_TEXTURE_2D ) ;
	glTexEnvf( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE ) ;

	//texB.BindBuffer() ;
	glBindTexture(GL_TEXTURE_2D,1);
	glBegin( GL_QUADS ) ;

	glTexCoord2f(0.0f,0.0f) ;
	glVertex3f(-20.0f,-10.0f,-100.0f) ;
	glTexCoord2f(1.0f,0.0f) ;
	glVertex3f( 20.0f,-10.0f,-100.0f ) ;
	glTexCoord2f(1.0f,1.0f) ;
	glVertex3f( 20.0f, 10.0f,-100.0f );
	glTexCoord2f(0.0f,1.0f) ;
	glVertex3f( -20.0f, 10.0f,-100.0f );
	glEnd() ;


This is so simple, I can't believe that I can't fix the problem. I need help, please; otherwise I'll go nuts. Grateful, Alex [Edited by - directNoob on June 10, 2007 9:21:56 AM]
You need to allocate a texture before you can bind to it.

Use glGenTextures( 1/* num textures */, &texture /* GLuint identifying texture */);

then glBindTexture(GL_TEXTURE_2D, texture );

In practice you may find that the texture handle you allocate ends up being one, but OpenGL won't recognize it until you have called glGenTextures and received that number as an output.
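For example (a minimal sketch; "texture" is just an illustrative variable name):

GLuint texture = 0;
glGenTextures( 1, &texture );             // ask OpenGL for a fresh texture name
glBindTexture( GL_TEXTURE_2D, texture );  // now the bind refers to a real object
// ... glTexParameteri / glTexImage2D calls go here ...
glBindTexture( GL_TEXTURE_2D, 0 );        // unbind when done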
Well, you never really said what the problem was, but I'm going to guess your texture doesn't quite look like what you want it to.

Quote:
for( int i=0; i<64;i++ ){
for( int j=0; j<64; j+=3 ){
test2[i*64+j] = 40 ;
test2[i*64+j+1] = i%255;
test2[i*64+j+2] = i%255;
}
}


The last element in test2 is at (64*64*3)-1, which is 12287, but your loop only reaches element 4097, i.e. (63*64)+63+2, so there's a large portion of your texture that's empty (or undefined).

try this:

for( int i = 0; i < 64; i++ )
{
	for( int j = 0; j < 64; j++ )
	{
		test2[((i*64)+j)*3]     = 40;
		test2[(((i*64)+j)*3)+1] = i % 255;
		test2[(((i*64)+j)*3)+2] = i % 255;
	}
}


You could also make test2 a three-dimensional array, which would make it a lot cleaner (note the row index i in every subscript):

BYTE test2[64][64][3];

for( int i = 0; i < 64; i++ )
{
	for( int j = 0; j < 64; j++ )
	{
		test2[i][j][0] = 40;
		test2[i][j][1] = i % 255;
		test2[i][j][2] = i % 255;
	}
}
Sorry, but I haven't told you that I call glGenTextures.

This is done before calling the bitmap loading function. But I simplified it and don't load the bitmap. Instead, as you can see above, I load some simple colors into an array.

I can tell you how I do it.

1. I call glGenTextures -> I get "1"! I tested it with the debugger.
2. I call glBindTexture with the name created by glGenTextures.
3. I call glTexImage2D (as you can see above).
4. I test with glIsTexture, using the name created by glGenTextures; it returns true. I also tested with glGetError() -> the result is "0".
5. I unbind the texture object.

When rendering:
I also specify the parameters with glTexParameteri, and I do glEnable(GL_TEXTURE_2D).
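A minimal sketch of the sequence above (not my exact code; it assumes the test2 array from my first post):

GLuint name = 0;
glGenTextures( 1, &name );                        // step 1: the debugger shows 1
glBindTexture( GL_TEXTURE_2D, name );             // step 2
glPixelStorei( GL_UNPACK_ALIGNMENT, 1 );
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGB, 64, 64,   // step 3
	0, GL_RGB, GL_UNSIGNED_BYTE, test2 );
bool isTex = ( glIsTexture( name ) == GL_TRUE );  // step 4: true
bool noErr = ( glGetError() == GL_NO_ERROR );     // step 4: no error
glBindTexture( GL_TEXTURE_2D, 0 );                // step 5: unbind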

Thanks
Alex

@Scet.

You are absolutely right!
I changed it.

But still no picture on the quad.

Alex
I took the screenshots again, so that you can see I'm not lying to you.

Here you can see that the texture with m_iBufferIdentifier is created.

Here you can see the texture name.

Here you can see that no error occurred.

And here is the quad.


Alex
I read in other threads here that your texture should be a power of two in size, with the same width and height. You might want to try that first and see if it works. Also, when hunting for a fault, you should work with small code sections; that makes it easier and less tedious to find.
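For example, a quick dimension check (just a small helper sketch; isPowerOfTwo is an illustrative name):

// A power of two has exactly one bit set, so n & (n - 1) clears it to zero.
bool isPowerOfTwo( int n )
{
	return n > 0 && ( n & ( n - 1 ) ) == 0;
}
// isPowerOfTwo( 64 ) == true, isPowerOfTwo( 60 ) == false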

Edit: Sorry, I just realized that your texture was just that.
Now I did the following.

BYTE test3[64*64*4] ;
glGetTexImage( GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, test3 ) ;


The result was that test3 was filled by the function.
This means the data was uploaded.

What could the problem be?

I bind the texture object before I upload the data and before I draw the texture.
The glGetTexImage call is performed in the render function, after the bind call. The result is, as I said, the pixel data.
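The spot check looks roughly like this (a sketch; it assumes the test2 array from my first post, and that OpenGL fills in alpha = 255 when an RGB texture is read back as RGBA):

BYTE test3[64*64*4] ;
glGetTexImage( GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, test3 ) ;
// Compare the first texel against the uploaded source data.
bool ok = ( test3[0] == test2[0] ) &&
	( test3[1] == test2[1] ) &&
	( test3[2] == test2[2] );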

The immediately following step is the draw call.

But it doesn't work!!!!

Maybe someone has a simple program which uses GL_TEXTURE_2D and glTexImage2D, so I could try that. The nVidia SDK sample works fine, but its author doesn't use GL_TEXTURE_2D.

I don't know what to do; I'm just wasting time! I need this for a course at the university!

I can tell you, I miss D3D!!!!
I had similar problems with VBOs, where memcpy didn't work for uploading the vertex data...

Grateful,
Alex
As I said, missing the obvious!

This little...

I hadn't specified glTexParameteri(.., GL_TEXTURE_MIN_FILTER,...)

In the "red book", the author says that OpenGL uses default values... Obviously the default value was a false default for my case ;)
The author had set this parameter, but I hadn't.
I can't believe it!
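For anyone who finds this thread later, the fix is one line (a sketch; the comment is my understanding of why it failed):

// The default GL_TEXTURE_MIN_FILTER is GL_NEAREST_MIPMAP_LINEAR, which
// expects a full mipmap chain. Since only level 0 was uploaded, the
// texture is "incomplete" and OpenGL silently draws as if texturing
// were disabled. Any non-mipmap MIN filter makes the texture complete:
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST );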


This, again, is a great example of wasted resources, not only mine but also yours.

So thanks to all of you for wasting your time reading and thinking about a stupid mistake I made!
I appreciate it very much!

Anyway, now I can proceed with my work.

Thanks again.
Alex
Just want to conclude: the problem was the missing GL_TEXTURE_MIN_FILTER parameter.

