TFS_Waldo

SDL_Surface (SDL_LoadBMP) to GL texture?


Recommended Posts

Hey there. I am trying to figure out a way to use SDL surfaces (from SDL_LoadBMP) to create textures usable in OpenGL. Well, this is how I am doing it:
void Texture() 
{
	SDL_Surface *bmp = new SDL_Surface;
	bmp = SDL_LoadBMP("C:\\bmp.bmp");

	glGenTextures(1, &texture[0]);

	glBindTexture(GL_TEXTURE_2D, texture[0]);

	glTexImage2D(GL_TEXTURE_2D, 0, 3, bmp->w, bmp->h, 0, GL_RGB, GL_UNSIGNED_BYTE, bmp->pixels);

	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}

It's not commented or anything, because I just created the function to test it. I can get it to work, somewhat: when I bind the texture to the quad, all the blue is stripped out of the image. Can anyone tell me what is wrong, please? Thanks in advance, Matt U.

I don't know if this will help, but try changing the 3 to 4 and GL_RGB to GL_RGBA in your glTexImage2D call.

Also, did you enable 2D texturing?
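If not, a minimal sketch of what that looks like (assuming a fixed-function setup like the one above):

// During initialization, before drawing any textured geometry:
glEnable(GL_TEXTURE_2D); // without this, glBindTexture has no visible effect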

-- Brandon Fogerty
http://www.jujikasoft.com

GOD Bless you Always my Friends!!!

If the format was wrong, the texture would be corrupted far more badly than just missing blue. I suspect you simply have a glColor3f(1.0f, 1.0f, 0.0f); call or equivalent somewhere before you render your texture. Remember that by default the texture is modulated with the current colour (multiplied per component), so a yellow current colour zeroes out the blue channel. Make sure you specify white as the current colour before you render textured polygons, or change the texture environment mode to GL_DECAL or GL_REPLACE.
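For example, a minimal sketch (assuming the same fixed-function setup as above):

// Either make sure the current colour is white before drawing...
glColor3f(1.0f, 1.0f, 1.0f);

// ...or tell GL to ignore the current colour when texturing:
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);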

Enigma

SDL_Surface *bmp = new SDL_Surface;
bmp = SDL_LoadBMP("C:\\bmp.bmp");


Also, you do not have to allocate memory with new; SDL_LoadBMP allocates the surface itself, so the new SDL_Surface above is simply leaked. When you want to free an SDL_Surface, use SDL_FreeSurface.
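A minimal sketch of the corrected pattern (using the same path as above):

SDL_Surface *bmp = SDL_LoadBMP("C:\\bmp.bmp"); // SDL allocates the surface itself

if(bmp)
{
    // ...create the GL texture from bmp->w, bmp->h, and bmp->pixels...
    SDL_FreeSurface(bmp); // free with SDL_FreeSurface, not delete
}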

Another problem, from the SDL documentation:
Quote:
Note: When loading a 24-bit Windows BMP file, pixel data points are loaded as blue, green, red, and NOT red, green, blue (as one might expect).
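If you would rather not swap the bytes yourself, OpenGL 1.2 added a GL_BGR pixel format that matches the BMP byte order directly. A sketch, assuming your GL headers define GL_BGR (older Windows headers call it GL_BGR_EXT):

// Upload the surface as-is and let GL do the BGR-to-RGB reordering:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, bmp->w, bmp->h, 0, GL_BGR, GL_UNSIGNED_BYTE, bmp->pixels);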


Here is my BMP loader, which uses SDL_Surface:

#ifndef _H_TEXTURE_
#define _H_TEXTURE_

#include <gl/glu.h>

class TEXTURE
{
private:
    GLuint id;

public:
    ~TEXTURE();

    int LoadBMP(const char *);

    GLuint GetID(void)
    {
        return id;
    }
};

#endif


#include "texture.h"

#include "SDL.h"

TEXTURE::~TEXTURE()
{
if(glIsTexture(id))
glDeleteTextures(1,&id);
}

int TEXTURE::LoadBMP(const char *fname)
{
int i, size;
unsigned char *data, swp;

SDL_Surface *bmp;

bmp=SDL_LoadBMP(fname);

if(!bmp)
return 0;

if(bmp->format->BitsPerPixel!=24)
{
SDL_FreeSurface(bmp);
return 0;
}

size=3*bmp->w*bmp->h;

data=(unsigned char*)bmp->pixels;
//bgr to rgb
for(i=0;i<size;i+=3)
{
swp=data;
data=data[i+2];
data[i+2]=swp;
}

glGenTextures(1,&id);
glBindTexture(GL_TEXTURE_2D,id);

glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

glTexImage2D(GL_TEXTURE_2D,0,GL_RGB,bmp->w,bmp->h,0,GL_RGB,GL_UNSIGNED_BYTE,data);
SDL_FreeSurface(bmp);
return 1;
}
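
A minimal usage sketch (assuming a GL context is current and GL_TEXTURE_2D is enabled; the file name is just an example):

TEXTURE tex;

if(tex.LoadBMP("bmp.bmp"))
{
    glBindTexture(GL_TEXTURE_2D, tex.GetID());
    glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f); glVertex2f(-1.0f, -1.0f);
        glTexCoord2f(1.0f, 0.0f); glVertex2f( 1.0f, -1.0f);
        glTexCoord2f(1.0f, 1.0f); glVertex2f( 1.0f,  1.0f);
        glTexCoord2f(0.0f, 1.0f); glVertex2f(-1.0f,  1.0f);
    glEnd();
}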


I hope this helps.
