SDL_Surface to OpenGL texture

I've just started looking at SDL and I was wondering how easy it is to convert an SDL_Surface to an OpenGL texture. If you know, could you please let me know how it is done? Thanks, trager
I don't know any SDL, but if you can access the following:
- width
- height
- RGB pixel data

then you can turn it into a texture with standard OpenGL code (see the NeHe tutorials).
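
For reference, a minimal sketch of that standard path, assuming you already have the pixels in a tightly packed RGB buffer (the function name and arguments here are just illustrative):

#include <GL/gl.h>

/* Minimal sketch: upload a tightly packed RGB buffer as a GL texture.
   Assumes a valid GL context and dimensions your GL accepts
   (older GL wants powers of two). */
GLuint makeTexture( const unsigned char *rgbData, int width, int height )
{
    GLuint tex;

    glGenTextures( 1, &tex );
    glBindTexture( GL_TEXTURE_2D, tex );

    /* Rows are tightly packed, so drop the default 4-byte row alignment. */
    glPixelStorei( GL_UNPACK_ALIGNMENT, 1 );

    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );

    glTexImage2D( GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                  GL_RGB, GL_UNSIGNED_BYTE, rgbData );

    return tex;
}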
The following code uses SDL_image to load an image file, converts it to raw RGBA, and loads it as a mipmapped texture. It returns the texture number on success, or 0 on error (and hackishly prints to stdout what the error was). Possible problems:

- Printing to stdout. This is easy to change.
- It may not work for all bit depths. It hasn't failed me yet, though.

If there are any other problems, just reply here.

And now, the code (pure C, cross-platform, etc.):

#include <stdio.h>
#include <stdlib.h>
#include <SDL.h>
#include <SDL_image.h>
#include <GL/gl.h>
#include <GL/glu.h>

int loadTexture( char *filename ) {
    GLuint retval;
    SDL_Surface *sdlimage;
    Uint8 *raw;
    int w, h, i, j, bpp;
    Uint8 *srcPixel, *dstPixel;
    Uint32 truePixel;
    GLenum errorCode;

    sdlimage = IMG_Load( filename );

    if ( !sdlimage ) {
        printf( "SDL_image load error: %s\n", IMG_GetError() );
        return 0;
    }
    if ( sdlimage->format->BytesPerPixel < 2 ) {
        printf( "Bad image -- not true color!\n" );
        SDL_FreeSurface( sdlimage );
        return 0;
    }

    w = sdlimage->w;
    h = sdlimage->h;

    /* Destination buffer: 4 bytes (RGBA) per pixel. */
    raw = (Uint8 *)malloc( w * h * 4 );
    dstPixel = raw;

    SDL_LockSurface( sdlimage );

    bpp = sdlimage->format->BytesPerPixel;

    /* Walk the source rows bottom-up so the texture comes out the right
       way up for OpenGL (SDL stores rows top-down). */
    for ( i = h - 1; i >= 0; i-- ) {
        for ( j = 0; j < w; j++ ) {
            srcPixel = (Uint8 *)sdlimage->pixels + i * sdlimage->pitch + j * bpp;
            switch ( bpp ) {
            case 1:
                truePixel = *srcPixel;
                break;

            case 2:
                truePixel = *(Uint16 *)srcPixel;
                break;

            case 3:
                if ( SDL_BYTEORDER == SDL_BIG_ENDIAN ) {
                    truePixel = srcPixel[0] << 16 | srcPixel[1] << 8 | srcPixel[2];
                } else {
                    truePixel = srcPixel[0] | srcPixel[1] << 8 | srcPixel[2] << 16;
                }
                break;

            case 4:
                truePixel = *(Uint32 *)srcPixel;
                break;

            default:
                printf( "Image bpp of %d unusable\n", bpp );
                SDL_UnlockSurface( sdlimage );
                SDL_FreeSurface( sdlimage );
                free( raw );
                return 0;
            }
            /* Let SDL decode the pixel into straight RGBA bytes. */
            SDL_GetRGBA( truePixel, sdlimage->format,
                         &dstPixel[0], &dstPixel[1], &dstPixel[2], &dstPixel[3] );
            dstPixel += 4;
        }
    }

    SDL_UnlockSurface( sdlimage );
    SDL_FreeSurface( sdlimage );

    /* Clear any GL errors left over from earlier calls. */
    while ( glGetError() != GL_NO_ERROR ) { ; }

    glGenTextures( 1, &retval );
    glBindTexture( GL_TEXTURE_2D, retval );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );

    errorCode = glGetError();
    if ( errorCode != GL_NO_ERROR ) {
        if ( errorCode == GL_OUT_OF_MEMORY ) {
            printf( "Out of texture memory!\n" );
        }
        free( raw );
        return 0;
    }

    gluBuild2DMipmaps( GL_TEXTURE_2D, 4, w, h, GL_RGBA, GL_UNSIGNED_BYTE, raw );

    free( raw );

    errorCode = glGetError();
    if ( errorCode != GL_NO_ERROR ) {
        if ( errorCode == GL_OUT_OF_MEMORY ) {
            printf( "Out of texture memory!\n" );
        }
        return 0;
    }

    return retval;
}
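
If it helps, a rough sketch of calling the function above, assuming SDL video and an OpenGL context are already set up ("image.png" is just a placeholder filename):

GLuint tex;

tex = (GLuint)loadTexture( "image.png" );   /* placeholder filename */
if ( tex == 0 ) {
    /* load failed; the function already printed the reason */
} else {
    glEnable( GL_TEXTURE_2D );
    glBindTexture( GL_TEXTURE_2D, tex );
    /* ...draw textured geometry... */
}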
errr... o_0

/* load_image -- basic GL texture loading */
int load_image(char *file)
{
    SDL_Surface *tex = SDL_LoadBMP(file);
    GLuint texture;

    printf("Status:  Loading image ");
    printf(file);
    printf("... ");

    if(tex)
    {
        glGenTextures(1, &texture);
        glBindTexture(GL_TEXTURE_2D, texture);
        glTexImage2D(GL_TEXTURE_2D, 0, 3, tex->w, tex->h,
                     0, GL_BGR, GL_UNSIGNED_BYTE, tex->pixels);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        printf("OK\n");
        SDL_FreeSurface(tex);
    }
    else
    {
        printf("Failed\nQuitting...");
        SDL_Quit();
        exit(-1);
    }

    return texture;
}


edit: it should work using an existing SDL surface as well.

[edited by - Luke Philpot on October 12, 2003 3:06:03 AM]
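
Regarding the edit about using an existing SDL surface: a variant along those lines might look like the sketch below. The helper name is made up, and it assumes the surface already holds 24-bit BGR pixel data with power-of-two dimensions, just like the BMP path above.

/* Sketch: same upload as load_image, but taking an already-loaded surface.
   Assumes 24-bit BGR pixel data and power-of-two dimensions. */
GLuint texture_from_surface( SDL_Surface *tex )
{
    GLuint texture;

    glGenTextures( 1, &texture );
    glBindTexture( GL_TEXTURE_2D, texture );
    glTexImage2D( GL_TEXTURE_2D, 0, 3, tex->w, tex->h,
                  0, GL_BGR, GL_UNSIGNED_BYTE, tex->pixels );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );

    return texture;
}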
The code Luke gives is much simpler and will work... as long as you only want to load .bmp files that are correctly byte-ordered, are RGB only, and have power-of-two dimensions. The code I gave was longer because it will handle pretty much any image file (jpg, png, etc.) on any machine (endianness shouldn't be an issue).

--oberon
The code Luke suggests is correct, far simpler, and also works with all the other image formats supported by SDL_image if you just replace the SDL_LoadBMP() call with IMG_Load(). But afaik SDL surfaces store their rows in the opposite order to OpenGL textures, so the textures you load with it come out mirrored vertically. Nothing to worry about, but you should consider converting them so that, for example, writing on textures is readable.
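
One way to handle that flip on the SDL side before uploading, as a rough sketch (the helper name is made up; it assumes a simple surface whose pitch covers any row padding):

#include <stdlib.h>
#include <string.h>
#include <SDL.h>

/* Flip an SDL surface vertically in place by swapping whole rows. */
void flip_surface_vertically( SDL_Surface *s )
{
    int row;
    Uint8 *pixels, *tmp;

    SDL_LockSurface( s );

    pixels = (Uint8 *)s->pixels;
    tmp = (Uint8 *)malloc( s->pitch );

    for ( row = 0; row < s->h / 2; row++ ) {
        Uint8 *top    = pixels + row * s->pitch;
        Uint8 *bottom = pixels + ( s->h - 1 - row ) * s->pitch;

        memcpy( tmp, top, s->pitch );
        memcpy( top, bottom, s->pitch );
        memcpy( bottom, tmp, s->pitch );
    }

    free( tmp );
    SDL_UnlockSurface( s );
}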
The NeHe tutorials converted to SDL:

http://www.libsdl.org/opengl/OpenGL-intro-1.1.1.zip

lesson06.c in that archive does some sort of conversion on an SDL_Surface that is to be used as a GL texture.
Does anyone know if either of those code snippets supports an alpha channel of any kind? It's just because I want to have an image of a frog jump around and I don't want a square patch of white to jump around with him :S

any help would be greatly appreciated!
Eh? Last time I looked at the way GL loads textures (a while ago, to be sure) there was some flag I had to pass that was to the effect of specifying byte order, i.e. GL_BGR, GL_RGB, GL_RGBA, etc...

maybe I'm nuts...

edit: yeah, it's right there in Luke's code: GL_BGR. Change it.

[edited by - C-Junkie on April 5, 2004 10:45:27 PM]
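
For what it's worth, the format argument of glTexImage2D is where that byte order gets named; a rough sketch (w, h, and pixels here are placeholders):

/* The "format" argument describes how the source pixels are ordered.
   Common values include GL_RGB, GL_BGR, GL_RGBA, and GL_BGRA;
   pick the one that matches your pixel data. */
glTexImage2D( GL_TEXTURE_2D, 0, 3, w, h, 0,
              GL_RGB, GL_UNSIGNED_BYTE, pixels );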
Alright, but replace it with what? Where can I find a list of options for it? Or should I just guess some and try it out?

