jimbogd

OpenGL: please help with simple code to display a textured quad?


Recommended Posts

Hello, I'm trying to draw a 2D textured quad (the size of the screen) using OpenGL. I'm filling the texture with RED pixels, but it always draws in BLACK (no matter what value I set the "colour" variable to). If anyone can see the mistake(s) I'm making, please help; I've been pulling my hair out over this! Thanks, jimbogd
#include <iostream>
#ifdef WIN32
#include <windows.h>
#endif
#include <sdl/SDL.h>
#include <GL/gl.h>
#include <GL/glu.h>

#define		GL_CLAMP_TO_EDGE		0x812F
#define		SCREEN_WIDTH			640
#define		SCREEN_HEIGHT			480

unsigned int rgba_to_int(unsigned char r, unsigned char g, unsigned char b, unsigned char a);

int main(int nArgs, char** args)
{
	// init sdl video with opengl
	SDL_Init( SDL_INIT_VIDEO );
	SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, 0 );
	SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );
	SDL_SetVideoMode( SCREEN_WIDTH, SCREEN_HEIGHT, 0, SDL_OPENGL );

	// setup opengl context
	glViewport( 0, 0, SCREEN_WIDTH, SCREEN_HEIGHT );
	glMatrixMode( GL_PROJECTION );
	glLoadIdentity();
	glOrtho( 0, SCREEN_WIDTH, SCREEN_HEIGHT, 0, -1, 1 );
	glMatrixMode( GL_MODELVIEW );
	glLoadIdentity();
	glBlendFunc( GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA );
	glEnable( GL_TEXTURE_2D );
	glEnable( GL_BLEND );
	glDisable( GL_DEPTH_TEST );
	glDisable( GL_CULL_FACE );

	// build the texture data in memory
	unsigned int	colour		= rgba_to_int(255, 0, 0, 0);
	unsigned int	tex_width	= 1024;
	unsigned int	tex_height	= 512;
	unsigned int	data_size	= tex_width * tex_height * 4;
	unsigned char *	data		= new unsigned char[sizeof( unsigned char ) * data_size];
	std::fill( reinterpret_cast<unsigned int *>( data ), reinterpret_cast<unsigned int *>( data + data_size ), colour );

	// generate a texture for it, bind the data and set the texture attributes
	unsigned int texture_id;
	glGenTextures( 1, &texture_id );
	glBindTexture( GL_TEXTURE_2D, texture_id );
	glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
	glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
	glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE );
	glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE );
	glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA, tex_width, tex_height , 0, GL_RGBA, GL_UNSIGNED_BYTE, data );

	// main loop
	SDL_Event e;
	bool is_running = true;
	while(is_running)
	{
		// check for quit event
		SDL_PollEvent(&e);
		if(e.type == SDL_QUIT)
			is_running = false;

		// draw the texture to the back buffer

		// texture coordinates
		double tx1 = 0.0f;
		double tx2 = static_cast<double>( SCREEN_WIDTH ) / static_cast<double>( tex_width );
		double ty1 = 0.0f / static_cast<double>( tex_height );
		double ty2 = static_cast<double>( SCREEN_HEIGHT ) / static_cast<double>( tex_height );
	
		// vertex coordinates
		double x1 = 0.0f;
		double x2 = static_cast<double>( SCREEN_WIDTH );
		double y1 = 0.0f;
		double y2 = static_cast<double>( SCREEN_HEIGHT );
	
		// draw the textured quad (clockwise winding)
		glBegin(GL_QUADS);
			glTexCoord2d( tx1, ty1 ); glVertex2d( x1, y1 );
			glTexCoord2d( tx2, ty1 ); glVertex2d( x2, y1 );
			glTexCoord2d( tx2, ty2 ); glVertex2d( x2, y2 );
			glTexCoord2d( tx1, ty2 ); glVertex2d( x1, y2 );
		glEnd();

		// update the screen
		SDL_GL_SwapBuffers();
	}

	// clean up
	glDeleteTextures( 1, &texture_id );
	delete [] data;
	SDL_Quit();
	return 0;
}

unsigned int rgba_to_int(unsigned char r, unsigned char g, unsigned char b, unsigned char a)
{
	unsigned int col;
	unsigned char *byte_ptr = reinterpret_cast<unsigned char *>( &col );
	byte_ptr[0] = r;
	byte_ptr[1] = g;
	byte_ptr[2] = b;
	byte_ptr[3] = a;
	return col;
}

It's been a while since I've done 3D, but maybe this is some help...

// vertex coordinates
double x1 = 0.0f;
double x2 = static_cast<double>( SCREEN_WIDTH );
double y1 = 0.0f;
double y2 = static_cast<double>( SCREEN_HEIGHT );

You're using SCREEN_WIDTH and SCREEN_HEIGHT as the max dimensions, but with a default view, screen coordinates go from -1.0f to 1.0f.

With texturing, the min and max values go from 0.0f to 1.0f (without wrapping, tiling, etc.).

If you're wanting to use screen coordinates: I know DirectX uses RHW in its vertex definition; not sure how OpenGL rolls.
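For reference, with the default (identity) matrices a fullscreen textured quad would look something like this (just a sketch from memory, untested):

// in default clip space both axes run from -1.0 to 1.0
glBegin( GL_QUADS );
	glTexCoord2d( 0.0, 0.0 ); glVertex2d( -1.0, -1.0 );
	glTexCoord2d( 1.0, 0.0 ); glVertex2d(  1.0, -1.0 );
	glTexCoord2d( 1.0, 1.0 ); glVertex2d(  1.0,  1.0 );
	glTexCoord2d( 0.0, 1.0 ); glVertex2d( -1.0,  1.0 );
glEnd();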

Hope this helps..


Edit: well, after re-re-reading your post (sorry, a bit tired, just got off work), this actually might not be your problem. But I know NeHe has a tutorial on ortho projection...

http://nehe.gamedev.net/data/lessons/lesson.asp?lesson=21

Use of screen coordinates has already been enabled with the glOrtho() call. The problem is that you have colour = rgba_to_int(255, 0, 0, 0). You have blending enabled and you are giving the texels an alpha value of zero, so they are fully transparent. Change the colour variable to rgba_to_int(255, 0, 0, 255). That will make the texture opaque and it'll show up on the screen. :D
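For example (a minimal sketch; either change on its own should fix it, assuming the rest of the code is as posted):

// give the texels full alpha so blending leaves the quad opaque
unsigned int colour = rgba_to_int(255, 0, 0, 255);

// or, since a fully opaque fullscreen quad doesn't need blending at all,
// simply don't enable it in the first place
glDisable( GL_BLEND );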

Some aesthetic pointers:

unsigned char *	data		= new unsigned char[sizeof( unsigned char ) * data_size];


The sizeof operator is redundant here: sizeof(unsigned char) is 1 by definition, and new[] already allocates in units of the element type.
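So the allocation can simply be:

unsigned char * data = new unsigned char[data_size];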

double ty1 = 0.0f / static_cast<double>( tex_height );


will always yield 0.0 (or a division by zero).
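It could just be written as:

double ty1 = 0.0;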

EDIT: what Jakenaattori said :-)
