Create a texture from an [OpenGL-type] bitmap

Started by spiffgq
9 comments, last by spiffgq 21 years, 1 month ago
I'm trying to create a texture object in OpenGL from a bitmap. (And by bitmap, I mean a true one-bit-per-pixel OpenGL-type bitmap, not that Microsoft abomination file type.) My first thought was to just use glTexImage2D as follows:
  
const unsigned char BITMAP_DATA [] = { /* pixels are here */ };
const int BITMAP_PITCH = 1; // Number of bytes per row

const int BITMAP_ROWS = 12; // Number of rows (the height)

const int BITMAP_COLS =  8; // Number of columns (the width in pixels)


// Some code goes here blah blah blah


unsigned int texObj;
glGenTextures (1, &texObj);
	
glBindTexture (GL_TEXTURE_2D, texObj);
glTexImage2D (
   GL_TEXTURE_2D, 
   0, 
   GL_INTENSITY, 
   BITMAP_COLS, 
   BITMAP_ROWS, 
   0, 
   GL_LUMINANCE, 
   GL_BITMAP, 
   BITMAP_DATA
);

// and so on

  
Unfortunately, this doesn't work. The resulting "texture" is just a solid color and does not contain the bitmap's pixels. Here is the code that draws the textured quad:
  
glBindTexture (GL_TEXTURE_2D, texObj);
glBegin (GL_QUADS);
glColor3f (1.0, 0.0, 0.0);
glTexCoord2f (0.0, 1.0); glVertex3f (  0.0f,   0.0f, 0.0f);
glTexCoord2f (1.0, 1.0); glVertex3f (300.0f,   0.0f, 0.0f);
glTexCoord2f (1.0, 0.0); glVertex3f (300.0f, 300.0f, 0.0f);
glTexCoord2f (0.0, 0.0); glVertex3f (  0.0f, 300.0f, 0.0f);
glEnd ();
  
The above code creates a solid red rectangle instead of showing the pixels of the bitmap. Am I on the right track or do I need to go another route completely? If I am on the right track, what am I doing wrong? Thanks for your help.
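Not something tried in the thread, but one quick way to see whether glTexImage2D rejected the call outright is glGetError; a failed upload leaves the texture without a level-0 image, and an incomplete texture typically renders as exactly this kind of solid-colored quad. A minimal sketch, assuming a current GL context and <stdio.h> for fprintf, around the call from the first post:

while (glGetError () != GL_NO_ERROR) { }   // flush any earlier errors

glTexImage2D (GL_TEXTURE_2D, 0, GL_INTENSITY, BITMAP_COLS, BITMAP_ROWS, 0,
              GL_LUMINANCE, GL_BITMAP, BITMAP_DATA);

GLenum err = glGetError ();
if (err != GL_NO_ERROR)
    fprintf (stderr, "glTexImage2D was rejected, error 0x%04X\n", err);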
SpiffGQ
Hi!

Kill this GL_LUMINANCE and put GL_BITMAP in its place.
Then put GL_UNSIGNED_BYTE where GL_BITMAP was!

kr
cNc
Love dem or love dem not!
I believe GL_LUMINANCE is for 8-bit data... (black & white)
So I guess that's the problem indeed
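If GL_LUMINANCE really does expect one byte per pixel, one way to make the data match is to expand each bit of the 1-bpp rows into a full 0x00/0xFF byte and upload that with GL_LUMINANCE / GL_UNSIGNED_BYTE. A rough sketch using the constants from the first post (the expanded buffer and the MSB-first bit order are assumptions, and the usual power-of-two size rules still apply):

unsigned char expanded [BITMAP_ROWS * BITMAP_COLS];

for (int row = 0; row < BITMAP_ROWS; row++) {
    for (int col = 0; col < BITMAP_COLS; col++) {
        // Assume the most significant bit of each byte is the leftmost pixel
        unsigned char byte = BITMAP_DATA [row * BITMAP_PITCH + col / 8];
        int bit = (byte >> (7 - (col % 8))) & 1;
        expanded [row * BITMAP_COLS + col] = bit ? 0xFF : 0x00;
    }
}

glTexImage2D (GL_TEXTURE_2D, 0, GL_INTENSITY, BITMAP_COLS, BITMAP_ROWS, 0,
              GL_LUMINANCE, GL_UNSIGNED_BYTE, expanded);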

| Panorama 3D Engine | Contact me |
| MSDN | Google | SourceForge |
quote:Original post by Subotron
I believe GL_LUMINANCE is for 8-bit data... (black & white)
So I guess that's the problem indeed


No, I changed the glTexImage2D line from
glTexImage2D (GL_TEXTURE_2D, 0, GL_INTENSITY, BITMAP_COLS, BITMAP_ROWS, 0, GL_LUMINANCE, GL_BITMAP, BITMAP_DATA); 


to

glTexImage2D (GL_TEXTURE_2D, 0, GL_INTENSITY, BITMAP_COLS, BITMAP_ROWS, 0, GL_BITMAP, GL_UNSIGNED_BYTE, BITMAP_DATA); 


but ended up with the same blank quad.

Here is a complete listing of the code. Perhaps the problem lies somewhere else (for example, maybe I'm forgetting to initialize something).


#include <stdio.h>

#ifdef WIN32
# include <windows.h>
#endif

#include <GL/gl.h>
#include <SDL/SDL.h>

#if 0
const unsigned char BITMAP_DATA [] = {
    0x41, 0x00,
    0x41, 0x00,
    0x42, 0x00,
    0x22, 0x00,
    0x22, 0x00,
    0x14, 0x00,
    0x14, 0x00,
    0x1C, 0x00,
    0x0C, 0x00,
    0x08, 0x00,
    0x08, 0x00,
    0x30, 0x00,
};
#endif

//#if 0
const unsigned char BITMAP_DATA [] = {
    0x41,
    0x41,
    0x42,
    0x22,
    0x22,
    0x14,
    0x14,
    0x1C,
    0x0C,
    0x08,
    0x08,
    0x30
};
//#endif

const int BITMAP_PITCH = 1;
const int BITMAP_ROWS = 12;
const int BITMAP_COLS = 8;
const int NUM_PIXELS = BITMAP_PITCH * 8 * BITMAP_ROWS;

const int SCREEN_WIDTH = 900;
const int SCREEN_HEIGHT = 750;

int main () {
    // Init SDL
    SDL_Init (SDL_INIT_VIDEO);
    SDL_Surface *screen = SDL_SetVideoMode (SCREEN_WIDTH, SCREEN_HEIGHT, 32, SDL_OPENGL | SDL_HWSURFACE | SDL_DOUBLEBUF);
    if (screen == NULL){
        return 0;
    }
    SDL_WM_SetCaption ("Test Bitmap OpenGL Texture Program", NULL);

    // Init OpenGL
    glViewport (0, 0, SCREEN_WIDTH, SCREEN_HEIGHT);
    glMatrixMode (GL_PROJECTION);
    glLoadIdentity ();
    glOrtho (0, SCREEN_WIDTH, 0, SCREEN_HEIGHT, -1, 1);
    glMatrixMode (GL_MODELVIEW);
    glClearColor (0.0, 0.0, 0.0, 0.0);
    glClearDepth (1.0);
    glEnable (GL_DEPTH_TEST);
    glDepthFunc (GL_LEQUAL);
    glPolygonMode (GL_FRONT, GL_FILL);
    glPolygonMode (GL_BACK, GL_LINE);
    glEnable (GL_TEXTURE_2D);

    // Create texture
    unsigned int texObj;
    glGenTextures (1, &texObj);

    if (texObj == 0){
        fprintf (stderr, "Unable to create an OpenGL texture object\n");
        SDL_Quit();
        return 1;
    }

    glBindTexture (GL_TEXTURE_2D, texObj);
    glTexImage2D (
        /*target*/         GL_TEXTURE_2D,
        /*level*/          0,
        /*internalFormat*/ GL_INTENSITY,
        /*width*/          BITMAP_COLS,
        /*height*/         BITMAP_ROWS,
        /*border*/         0,
        /*format*/         GL_BITMAP,
        /*type*/           GL_UNSIGNED_BYTE,
        /*texels*/         BITMAP_DATA
    );

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // Show a quad with the texture
    glClear (GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity ();
    glTranslatef (200.0f, 150.0f, 0.0f);
    //glBindTexture (GL_TEXTURE_2D, texObj);
    glBegin (GL_QUADS);
        glColor3f (1.0, 0.0, 0.0);
        glTexCoord2f (0.0, 1.0); glVertex3f (  0.0f,   0.0f, 0.0f);
        glTexCoord2f (1.0, 1.0); glVertex3f (100.0f,   0.0f, 0.0f);
        glTexCoord2f (1.0, 0.0); glVertex3f (100.0f, 150.0f, 0.0f);
        glTexCoord2f (0.0, 0.0); glVertex3f (  0.0f, 150.0f, 0.0f);
    glEnd ();

    SDL_GL_SwapBuffers ();

    SDL_Event event;
    bool done = false;

    while (!done){
        SDL_PollEvent (&event);

        if (event.type == SDL_QUIT){
            done = true;
        }else if (event.type == SDL_KEYUP){
            if (event.key.keysym.sym == SDLK_ESCAPE)
                done = true;
        }
    }

    SDL_Quit ();
    glDeleteTextures (1, &texObj);

    return 0;
}


It is a very simple program that uses SDL for window management. It starts, creates the texture, displays the texture, and waits for the user to exit the program. It should compile on Windows and Linux (although it has only been tested on Linux).

Here is a Makefile that will compile the above program (assuming you have the SDL and OpenGL libs installed in the usual places):

CC=g++
LIBS= -lSDL -L/usr/X11R6/lib -lGL

opengl_test_program: main.o
	$(CC) $(LIBS) main.o -o opengl_test_program


Thank you for your help.

SpiffGQ
Shouldn't you bind your texture before your drawing code?
Before the quad drawing you have //glBindTexture (GL_TEXTURE_2D, texObj);
Maybe uncommenting it would help?
Not a pro, just a thought :D
You're using one byte per row in the image, which requires the unpack alignment to be set to 1. The default is 4, which means each new row must start on a 4-byte-aligned offset from the start address. I don't know if this is the problem you are currently experiencing, but with a one-byte-wide image, it will be a problem sooner or later. Put a glPixelStorei(GL_UNPACK_ALIGNMENT, 1) before uploading the texture.

Either that, or pad the image with enough bytes so that each new row starts on a 4-byte-aligned offset from the start address.
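For illustration, a tiny sketch of that rule (my own 2x2 example, not the data from this thread): a luminance image with two-byte rows only unpacks correctly once the alignment is dropped to 1, and the glPixelStorei call has to come before the upload.

const unsigned char TIGHT_ROWS [2 * 2] = {
    0x00, 0xFF,   // row 0: two one-byte pixels, so the next row would not start on a 4-byte boundary
    0xFF, 0x00    // row 1
};

glPixelStorei (GL_UNPACK_ALIGNMENT, 1);   // must happen before the upload
glTexImage2D (GL_TEXTURE_2D, 0, GL_LUMINANCE, 2, 2, 0,
              GL_LUMINANCE, GL_UNSIGNED_BYTE, TIGHT_ROWS);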
quote:Original post by michel
Shouldn't you bind your texture before your drawing code?
Before the quad drawing you have //glBindTexture (GL_TEXTURE_2D, texObj);
Maybe uncommenting it would help?
Not a pro, just a thought :D


I've tried it both ways with similar results.

quote:Original post by Brother Bob
You're using one byte per row in the image, which requires the unpack alignment to be set to 1. The default is 4, which means each new row must start on a 4-byte-aligned offset from the start address. I don't know if this is the problem you are currently experiencing, but with a one-byte-wide image, it will be a problem sooner or later. Put a glPixelStorei(GL_UNPACK_ALIGNMENT, 1) before uploading the texture.

Either that, or pad the image with enough bytes so that each new row starts on a 4-byte-aligned offset from the start address.



I'll try that when I get home. To pad the image, should I change the data type to unsigned integer, or add three more bytes per row as unsigned chars?
SpiffGQ
Add three bytes after the one you already have. If you use unsigned ints instead, you have to make sure the information ends up in the correct byte (little vs big endian issues), but it is possible.
quote:Original post by CnCMasta
Kill this GL_LUMINANCE and put GL_BITMAP in its place.
Then put GL_UNSIGNED_BYTE where GL_BITMAP was!


One thing I don't understand: according to the OpenGL Red Book, GL_BITMAP and GL_UNSIGNED_BYTE are type constants, whereas GL_LUMINANCE is a format constant. That would suggest that I cannot replace GL_LUMINANCE with GL_BITMAP, because they are different categories of constants. Where is my reasoning wrong?
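For what it's worth, the pairing that the Red Book distinction points to is that, for glTexImage2D, GL_BITMAP is only accepted as the type when the format is GL_COLOR_INDEX, and color-index data headed for a non-index internal format is run through the index-to-RGBA pixel maps. A sketch of that route for the 1-bpp data (the two-entry maps for bit values 0 and 1 are my own choice of setup, and the height still has to satisfy the power-of-two rule):

const GLfloat INDEX_TO_VALUE [2] = { 0.0f, 1.0f };   // bit 0 -> black, bit 1 -> white

glPixelMapfv (GL_PIXEL_MAP_I_TO_R, 2, INDEX_TO_VALUE);
glPixelMapfv (GL_PIXEL_MAP_I_TO_G, 2, INDEX_TO_VALUE);
glPixelMapfv (GL_PIXEL_MAP_I_TO_B, 2, INDEX_TO_VALUE);
glPixelMapfv (GL_PIXEL_MAP_I_TO_A, 2, INDEX_TO_VALUE);

glPixelStorei (GL_UNPACK_ALIGNMENT, 1);   // each row of the bitmap is a single byte
glTexImage2D (GL_TEXTURE_2D, 0, GL_INTENSITY, BITMAP_COLS, BITMAP_ROWS, 0,
              GL_COLOR_INDEX, GL_BITMAP, BITMAP_DATA);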

quote:Original post by Brother Bob
Put a glPixelStorei(GL_UNPACK_ALIGNMENT, 1) before uploading the texture.


I'm not sure what you mean by "before uploading the texture". I'm assuming that means before the glTexImage2D call. Either way, no such luck.

quote:Original post by Brother Bob
Add three bytes after the one you already have. If you use unsigned ints instead, you have to make sure the information ends up in the correct byte (little vs big endian issues), but it is possible.


I changed the constants to

const unsigned char BITMAP_DATA [] = {
    0x41, 0x00, 0x00, 0x00,
    0x41, 0x00, 0x00, 0x00,
    0x42, 0x00, 0x00, 0x00,
    0x22, 0x00, 0x00, 0x00,
    0x22, 0x00, 0x00, 0x00,
    0x14, 0x00, 0x00, 0x00,
    0x14, 0x00, 0x00, 0x00,
    0x1C, 0x00, 0x00, 0x00,
    0x0C, 0x00, 0x00, 0x00,
    0x08, 0x00, 0x00, 0x00,
    0x08, 0x00, 0x00, 0x00,
    0x30, 0x00, 0x00, 0x00
};

const int BITMAP_PITCH = 4;
const int BITMAP_ROWS = 12;
const int BITMAP_COLS = 8 * BITMAP_PITCH;
const int NUM_PIXELS = BITMAP_COLS * BITMAP_ROWS;


but that didn't work, either. Thanks for the suggestions. Any other ideas?
SpiffGQ
OK, I did make a rather bone-headed mistake. The texture height (12) is not a power of two, as OpenGL requires. I added some rows to make it 16 rows high. The pitch is one, which is a power of two, and the number of columns is 8, which is also a power of two, so now everything should be covered. Unfortunately, this doesn't fix the problem.
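Roughly what the padding described here might look like, back at one byte per row (the four blank rows at the bottom and the exact values are just for illustration):

const unsigned char BITMAP_DATA [] = {
    0x41, 0x41, 0x42, 0x22, 0x22, 0x14, 0x14, 0x1C,
    0x0C, 0x08, 0x08, 0x30,
    0x00, 0x00, 0x00, 0x00   // four blank rows to reach a power-of-two height
};

const int BITMAP_PITCH = 1;
const int BITMAP_ROWS  = 16;
const int BITMAP_COLS  = 8;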
SpiffGQ

