SDL + OpenGL + (fonts && textures) == headache

Started by Enalis
11 comments, last by Enalis 19 years, 9 months ago
I have been reading over the past posts and am wondering why my code isn't working as well. I'll just post it and hope something jumps out at somebody. This is my text-to-texture function:
GLuint RenderTextureFromText(char *text, int r, int g, int b, int a){
	GLuint texture;
	int fontWidth, fontHeight, fontBPP;
	SDL_Color clr = {r, g, b, a};
	SDL_Surface *Text = TTF_RenderText_Blended(font, text, clr);
	//SDL_Surface *Text = TTF_RenderText_Solid(font, text, clr);
	if (Text == 0){
		cout << "TTF_RenderText_Blended() failed:  " << TTF_GetError() << endl;
		return 0; // nothing to turn into a texture
	}
	SDL_SaveBMP(Text, "TTFOutTest.bmp");
	//Slock(Text);
	fontWidth = Text->w;
	fontHeight = Text->h;
	fontBPP = Text->format->BitsPerPixel;
	//Sulock(Text);

	//Next power of two.
	int width = 1;
	while(width < fontWidth){
		width*=2;
	}
	int height = 1;
	while(height < fontHeight){
		height*=2;
	}
	// Make sure we have the correct endianness.
	#if SDL_BYTEORDER == SDL_BIG_ENDIAN
	Uint32 rmask = 0xff000000;
	Uint32 gmask = 0x00ff0000;
	Uint32 bmask = 0x0000ff00;
	Uint32 amask = 0x000000ff;
	#else
	Uint32 rmask = 0x000000ff;
	Uint32 gmask = 0x0000ff00;
	Uint32 bmask = 0x00ff0000;
	Uint32 amask = 0xff000000;
	#endif

	SDL_Surface *TextSurface = SDL_CreateRGBSurface(SDL_HWSURFACE,
	                                                width, height, 32,
	                                                rmask, gmask, bmask, amask);
	// Disable per-surface alpha so the blit copies the alpha channel verbatim.
	SDL_SetAlpha(Text, 0, 0);

	if (SDL_BlitSurface(Text, 0, TextSurface, 0) != 0){
		cerr << "Font Blit Failed! " << SDL_GetError() << endl;
	}
	SDL_SaveBMP(TextSurface, "TTFOutTest2.bmp"); // dump the blitted surface for inspection
	glGenTextures(1, &texture); // grab a texture name before binding
	glBindTexture(GL_TEXTURE_2D, texture);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
	glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, TextSurface->w, TextSurface->h,
	             0, GL_RGBA, GL_UNSIGNED_BYTE, TextSurface->pixels);

	SDL_FreeSurface(TextSurface);	
	SDL_FreeSurface(Text); //Might need to NULL these pointers to surfaces!

	return texture;
}
This is my code where I render it:
	sprintf(strFrameRate, "fps: %5d", lastFramesPerSecond);
	cout << strFrameRate << endl;
	fontTexture = RenderTextureFromText(strFrameRate, 255, 0, 0, 128);
	glBindTexture(GL_TEXTURE_2D, fontTexture);
	cout << RenderTextureFromText(strFrameRate, 255, 0, 0, 0) << endl;
	glPushMatrix();
		glTranslatef(0.0f, 0.0f, 0.0f);
		glBegin(GL_QUADS);
			glTexCoord2f(0.0f, 0.0f); glVertex2f(-1.0f, -1.0f);
			glTexCoord2f(1.0f, 0.0f); glVertex2f(0.0f, -1.0f);
			glTexCoord2f(1.0f, 1.0f); glVertex2f(0.0f, 0.0f);
			glTexCoord2f(0.0f, 1.0f); glVertex2f(-1.0f, 0.0f);
		glEnd();
	glPopMatrix();
Douglas Eugene Reisinger II
Projects/Profile Site
I can't help with your code, but I would strongly recommend checking out NeHe tutorial #17. It's a VERY simple font system to set up and use, and best of all it's cross-platform and works with SDL (I set mine up following the Linux/SDL version of the tutorial, and I'm on Windows, BTW).
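To give you an idea of what you'd be setting up: the trick in that tutorial is one display list per character, cut out of a font texture laid out as a 16x16 grid. From memory it looks roughly like this (untested sketch, not the tutorial's exact code; the cell size and pen advance here are made-up values):

#include <cstring>   /* strlen */
#include <GL/gl.h>

GLuint fontBase;     /* first of the 256 display lists */

/* Build one display list per character from a font texture arranged
   as a 16x16 grid of glyph cells. */
void BuildFont(GLuint fontTexture)
{
    fontBase = glGenLists(256);
    glBindTexture(GL_TEXTURE_2D, fontTexture);
    for (int i = 0; i < 256; i++) {
        float cx = (i % 16) / 16.0f;   /* u origin of this glyph's cell */
        float cy = (i / 16) / 16.0f;   /* v origin of this glyph's cell */
        glNewList(fontBase + i, GL_COMPILE);
            glBegin(GL_QUADS);
                glTexCoord2f(cx,           1.0f - cy - 0.0625f); glVertex2i(0,  0);
                glTexCoord2f(cx + 0.0625f, 1.0f - cy - 0.0625f); glVertex2i(16, 0);
                glTexCoord2f(cx + 0.0625f, 1.0f - cy);           glVertex2i(16, 16);
                glTexCoord2f(cx,           1.0f - cy);           glVertex2i(0,  16);
            glEnd();
            glTranslated(14, 0, 0);    /* advance the pen for the next glyph */
        glEndList();
    }
}

/* Draw a string by calling one list per byte of text. */
void glPrint(int x, int y, const char *text)
{
    glPushMatrix();
    glTranslated(x, y, 0);
    glListBase(fontBase);
    glCallLists((GLsizei)strlen(text), GL_UNSIGNED_BYTE, text);
    glPopMatrix();
}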
FTA, my 2D futuristic action MMORPG
Don't use SDL for that; just write your own texture loading and text functions. It's really easy, and you will learn new stuff too (plus, you get more control).
You mean like have a bitmap font and parse through it... this is intriguing... But then again, I'll run into the same thing when loading textures: I'll have to load them onto a surface and then convert them into a GLuint texture.
Douglas Eugene Reisinger II
Projects/Profile Site
You don't have to load them into a surface; just load them into memory and use the glTexImage2D() function.
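For example, once your own loader has produced raw RGBA bytes, the upload is just this (a sketch; MakeTexture and the power-of-two assumption are mine, not from any library):

#include <GL/gl.h>

/* Upload a raw in-memory RGBA buffer straight to OpenGL; no SDL_Surface
   involved. width and height are assumed to be powers of two. */
GLuint MakeTexture(const unsigned char *pixels, int width, int height)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height,
                 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    return tex;
}

After that it's just glBindTexture() before you draw.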
Well, I went ahead and tried completely rewriting it with the NeHe lesson 17 Linux/SDL code, but I changed a little bit of it! I rewrote the LoadGLTextures function to be generic, but I'm not sure whether it will work:
int LoadGLTextures(GLuint texture, char *fileName)
{
    int status = 0;            /* Status indicator */
    SDL_Surface *TextureImage; /* Create storage space for the texture */

    /* Load the bitmap, check for errors; if the bitmap's not found, quit */
    if ((TextureImage = SDL_LoadBMP(fileName))){
        status = 1;                 /* Set the status to true */
        glGenTextures(1, &texture); /* Create the texture */

        /* Typical texture generation using data from the bitmap */
        glBindTexture(GL_TEXTURE_2D, texture);
        glTexImage2D(GL_TEXTURE_2D, 0, 3, TextureImage->w, TextureImage->h,
                     0, GL_BGR, GL_UNSIGNED_BYTE, TextureImage->pixels);

        /* Linear filtering */
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    }else{
        cerr << "SDL_LoadBMP(fileName) failed: " << SDL_GetError() << endl;
    }

    SDL_FreeSurface(TextureImage); /* Free up any memory we may have used */
    return status;
}
Douglas Eugene Reisinger II
Projects/Profile Site
The main problem I had with using SDL surfaces as textures is that the pixel format is not always what you expect. You have to check that it's the same format that you pass to glTexImage2D. Also, you don't check anywhere that your texture has power-of-two dimensions. And you're assuming that the BMP you're loading has 8-bit channels in BGR order, which is great if it does, but sucks if it doesn't.
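For the format check, something along these lines works (a sketch against the SDL 1.2 surface fields; note that GL_BGR/GL_BGRA need OpenGL 1.2 or the EXT constants):

/* Pick the GL client format that matches what SDL actually gave you,
   instead of assuming 24-bit BGR. "surface" is your loaded SDL_Surface*. */
GLenum glFormat = 0;
switch (surface->format->BytesPerPixel) {
case 4:  /* 32-bit: RGBA or BGRA, depending on where the red mask sits */
    glFormat = (surface->format->Rmask == 0x000000ff) ? GL_RGBA : GL_BGRA;
    break;
case 3:  /* 24-bit: RGB or BGR */
    glFormat = (surface->format->Rmask == 0x000000ff) ? GL_RGB : GL_BGR;
    break;
default: /* paletted or 16-bit: convert the surface first rather than guess */
    std::cerr << "Unsupported pixel format" << std::endl;
    break;
}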

BTW, what's the OpenGL error you're getting?
[teamonkey] [blog] [tinyminions]
I don't think it's that, because I can call SDL_SaveBMP(surface, "output.bmp") and it works just fine; it outputs the correct bitmap. And it is a power of two, because I'm no longer worrying about that: I'm simply using a bitmap as a font and loading it in as in nehe.gamedev.net's tutorial 17 (Linux/SDL). But I've come to realize that it's probably something with the way I'm rendering.

No matter what, it always just displays the primitive and never applies a texture. It just draws a big box the size specified. Even if I make a quad and try to texture it with the BMP I loaded in (which is already a power of two), it doesn't apply the texture. I should mention that I am in ortho mode. Are there any special SDL flags I need to set, or any special rendering things I can't do in ortho mode, besides displaying something beyond the specified z distance (which for me is from -1 to 1)? Wow, that was long winded. Yar?!?
Douglas Eugene Reisinger II
Projects/Profile Site
Quote:
int LoadGLTextures(GLuint texture, char *fileName)


texture is passed by value.

try...

int LoadGLTextures(GLuint& texture, char *fileName)

PS - I'm assuming that you're binding the texture correctly and texturing is enabled.
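i.e. somewhere before the quad you want:

glEnable(GL_TEXTURE_2D);                   /* without this the quad draws untextured */
glBindTexture(GL_TEXTURE_2D, fontTexture); /* bind before glBegin */
glColor3f(1.0f, 1.0f, 1.0f);               /* the texture is modulated by the current color */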

Cheers,
Paul Cunningham
Pumpkin Games
Quote: Original post by Enalis
I don't think it's that, because I can call SDL_SaveBMP(surface, "output.bmp") and it works just fine; it outputs the correct bitmap. And it is a power of two, because I'm no longer worrying about that: I'm simply using a bitmap as a font and loading it in as in nehe.gamedev.net's tutorial 17 (Linux/SDL).


No, I mean that .BMP data can be stored in several ways (monochrome, 16-colour, 24-bit and several other options not found in MS Paint). You have to explicitly pass the correct pixel format to OpenGL. The SDL/NeHe code assumes that the SDL_Surface that is created when the bitmap is loaded is always going to be in a specific format - and it might not be. But in general that leads to a distorted texture, not no texture at all.
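If you'd rather not branch on every possible format, you can also force the loaded surface into a known 32-bit RGBA layout before uploading, roughly like this (untested sketch, SDL 1.2 API):

#include <cstring>   /* memset */

/* Convert whatever SDL_LoadBMP returned into a known 32-bit RGBA surface
   before handing the pixels to glTexImage2D. */
SDL_PixelFormat fmt;
memset(&fmt, 0, sizeof(fmt));
fmt.BitsPerPixel = 32;
fmt.BytesPerPixel = 4;
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
fmt.Rmask = 0xff000000; fmt.Gmask = 0x00ff0000;
fmt.Bmask = 0x0000ff00; fmt.Amask = 0x000000ff;
#else
fmt.Rmask = 0x000000ff; fmt.Gmask = 0x0000ff00;
fmt.Bmask = 0x00ff0000; fmt.Amask = 0xff000000;
#endif

SDL_Surface *raw = SDL_LoadBMP("font.bmp");
if (raw) {
    SDL_Surface *converted = SDL_ConvertSurface(raw, &fmt, SDL_SWSURFACE);
    /* ...glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, converted->w, converted->h,
                       0, GL_RGBA, GL_UNSIGNED_BYTE, converted->pixels)... */
    SDL_FreeSurface(converted);
    SDL_FreeSurface(raw);
}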

If the problem's not what PaulC (hello - I think I've seen you elsewhere :) ) suggested, do check the errors that OpenGL gives you. Something like this:

int glErr;
while ((glErr = glGetError()) != GL_NO_ERROR) {
    std::cerr << "OpenGL Error: " << gluErrorString(glErr) << std::endl;
}
[teamonkey] [blog] [tinyminions]
