Archived

This topic is now archived and is closed to further replies.

trager

SDL_Surface to OpenGL texture


trager    122
I've just started looking at SDL and I was wondering how easy it is to convert an SDL_Surface to an OpenGL texture. If you know, could you please let me know how it's done? Thanks, trager

Ruudje    100
I don't know any SDL, but if you can access the following:
-width
-height
-rgb pixel data

then you can turn it into a texture with standard OpenGL code (see the NeHe tutorials)
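To make "access the rgb pixel data" concrete: SDL surfaces pad each row out to `pitch` bytes, so before handing pixels to glTexImage2D you generally want them tightly packed. A minimal sketch of that step in plain C, with no SDL dependency (`tight_pack_rgb` is a made-up helper name, not an SDL or GL function):

```c
#include <stdlib.h>
#include <string.h>

/* Copy pitch-padded RGB rows (as found in an SDL_Surface) into a
 * tightly packed buffer suitable for glTexImage2D.
 * Caller frees the returned buffer. */
unsigned char *tight_pack_rgb(const unsigned char *pixels,
                              int w, int h, int pitch)
{
    unsigned char *out = malloc((size_t)w * h * 3);
    int y;
    if (!out)
        return NULL;
    for (y = 0; y < h; y++)
        memcpy(out + (size_t)y * w * 3,   /* packed destination row */
               pixels + (size_t)y * pitch, /* padded source row      */
               (size_t)w * 3);
    return out;
}
```

If the surface's pitch already equals w * BytesPerPixel, the copy is unnecessary and you can pass `surface->pixels` directly.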

pnaxighn    122
The following code uses SDL_Image to load an image file, converts it to a raw format, and loads it as a mipmapped texture. It returns the texture number on success, or 0 on error (and hackishly prints the error to stdout). Possible problems:

- Printing to stdout. This is easy to change
- It may not work for all bit depths. It hasn't failed me yet, though

If there are any other problems, just reply here.

And now, the code (pure C, cross platform, etc.):

int loadTexture ( char *filename ) {
    GLuint retval;
    SDL_Surface *sdlimage;
    void *raw;
    int w, h, i, j, bpp;
    Uint8 *srcPixel, *dstPixel;
    Uint32 truePixel;
    GLenum errorCode;

    sdlimage = IMG_Load( filename );

    if ( !sdlimage ) {
        printf("SDL_Image load error: %s\n", IMG_GetError());
        return 0;
    }
    if ( sdlimage->format->BytesPerPixel < 2 ) {
        printf("Bad image -- not true color!\n");
        return 0;
    }

    w = sdlimage->w;
    h = sdlimage->h;

    raw = (void *)malloc( w * h * 4 );
    dstPixel = (Uint8 *)raw;

    SDL_LockSurface( sdlimage );

    bpp = sdlimage->format->BytesPerPixel;

    for ( i = h ; i > 0 ; i-- ) {
        for ( j = 0 ; j < w ; j++ ) {
            srcPixel = (Uint8 *)sdlimage->pixels + i * sdlimage->pitch + j * bpp;
            switch ( bpp ) {
            case 1:
                truePixel = *srcPixel;
                break;

            case 2:
                truePixel = *(Uint16 *)srcPixel;
                break;

            case 3:
                if ( SDL_BYTEORDER == SDL_BIG_ENDIAN ) {
                    truePixel = srcPixel[0] << 16 | srcPixel[1] << 8 | srcPixel[2];
                } else {
                    truePixel = srcPixel[0] | srcPixel[1] << 8 | srcPixel[2] << 16;
                }
                break;

            case 4:
                truePixel = *(Uint32 *)srcPixel;
                break;

            default:
                printf("Image bpp of %d unusable\n", bpp);
                return 0;
            }
            /* expand to one R, G, B, A byte per pixel */
            SDL_GetRGBA( truePixel, sdlimage->format, &dstPixel[0], &dstPixel[1], &dstPixel[2], &dstPixel[3] );
            dstPixel += 4;
        }
    }

    SDL_UnlockSurface( sdlimage );
    SDL_FreeSurface( sdlimage );

    /* clear any stale GL errors */
    while ( glGetError() ) { ; }

    glGenTextures( 1, &retval );
    glBindTexture( GL_TEXTURE_2D, retval );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );

    errorCode = glGetError();
    if ( errorCode != 0 ) {
        if ( errorCode == GL_OUT_OF_MEMORY ) {
            printf("Out of texture memory!\n");
        }
        return 0;
    }

    gluBuild2DMipmaps( GL_TEXTURE_2D, 4, w, h, GL_RGBA, GL_UNSIGNED_BYTE, (Uint8 *)raw );

    errorCode = glGetError();
    if ( errorCode != 0 ) {
        if ( errorCode == GL_OUT_OF_MEMORY ) {
            printf("Out of texture memory!\n");
        }
        return 0;
    }

    return retval;
}

Luke Philpot    200
errr... o_0


/* load_image -- basic GL texture loading */

int load_image(char *file)
{
    SDL_Surface *tex = SDL_LoadBMP(file);
    GLuint texture;

    printf("Status: Loading image %s... ", file);

    if(tex)
    {
        glGenTextures(1, &texture);
        glBindTexture(GL_TEXTURE_2D, texture);

        glTexImage2D(GL_TEXTURE_2D, 0, 3, tex->w, tex->h,
                     0, GL_BGR, GL_UNSIGNED_BYTE, tex->pixels);

        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        printf("OK\n");
        SDL_FreeSurface(tex);
    }
    else
    {
        printf("Failed\nQuitting...");
        SDL_Quit();
        exit(-1);
    }

    return texture;
}


edit: it should work using an existing SDL surface as well.

[edited by - Luke Philpot on October 12, 2003 3:06:03 AM]

Share this post


Link to post
Share on other sites
pnaxighn    122
The code Luke gives is much simpler and will work... as long as you only want to load .bmp files that are correctly byte-ordered, RGB only, and dimensioned in powers of 2. The code I gave is longer because it will handle pretty much any image file (jpg, png, etc.) on any machine (endianness shouldn't be an issue).

--oberon
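For anyone wondering what the endianness handling actually does: the case-3 branch of the loader packs three bytes into a 32-bit value in an order that depends on the host byte order, so SDL_GetRGBA sees consistent channels. A standalone sketch of the same packing logic, with no SDL dependency (`host_is_big_endian` and `pack24` are invented names; the real code just tests SDL_BYTEORDER at compile time):

```c
#include <stdint.h>

/* Detect host byte order at runtime (returns 1 on big-endian machines). */
static int host_is_big_endian(void)
{
    uint16_t probe = 1;
    return *(unsigned char *)&probe == 0;
}

/* Pack a 3-byte pixel into a 32-bit value the same way the
 * case-3 branch of the loader does. */
uint32_t pack24(const unsigned char *p)
{
    if (host_is_big_endian())
        return (uint32_t)p[0] << 16 | (uint32_t)p[1] << 8 | p[2];
    return (uint32_t)p[0] | (uint32_t)p[1] << 8 | (uint32_t)p[2] << 16;
}
```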

MIT_Service    122
The code Luke suggests is correct, far simpler, and also works with all the other image formats supported by SDL_Image if you just replace the SDL_LoadBMP() function call. But AFAIK SDL surfaces are organized differently from OpenGL textures, so the textures you load with it may come out mirrored. Nothing to worry about, but you should consider converting them so that, for example, writing on textures is readable.
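A sketch of the kind of conversion meant here: swapping rows so a top-down buffer matches OpenGL's bottom-up convention. Plain C, no SDL dependency (`flip_vertical` is a hypothetical helper name):

```c
#include <stdlib.h>
#include <string.h>

/* Flip an image buffer vertically in place; pitch is bytes per row.
 * Swaps the top row with the bottom row, working inward. */
void flip_vertical(unsigned char *pixels, int h, int pitch)
{
    unsigned char *tmp = malloc((size_t)pitch);
    int top, bot;
    if (!tmp)
        return;
    for (top = 0, bot = h - 1; top < bot; top++, bot--) {
        memcpy(tmp, pixels + (size_t)top * pitch, (size_t)pitch);
        memcpy(pixels + (size_t)top * pitch,
               pixels + (size_t)bot * pitch, (size_t)pitch);
        memcpy(pixels + (size_t)bot * pitch, tmp, (size_t)pitch);
    }
    free(tmp);
}
```

Alternatively you can leave the pixels alone and flip the t texture coordinates when you draw.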

Guest Anonymous Poster
NeHe tutorials converted to SDL:

http://www.libsdl.org/opengl/OpenGL-intro-1.1.1.zip

lesson06.c in the archive does some sort of conversion on an SDL_Surface which is then used as a GL texture.

LunarCrisis    122
Does anyone know if either of those code samples supports an alpha channel of any kind? I ask because I want to have an image of a frog jump around, and I don't want a square patch of white to jump around with him :S

any help would be greatly appreciated!

C-Junkie    1099
Eh? Last time I looked at the way GL loads textures (a while ago, to be sure) there was some flag I had to pass that specified byte order, i.e. BGR, RGB, RGBA, etc...

maybe I'm nuts...

edit: yeah, it's right there in Luke's code. GL_BGR. Change it.

[edited by - C-Junkie on April 5, 2004 10:45:27 PM]

krumms    126
A word of warning to those using pnaxighn's code:

Don't forget to free the memory pointed to by raw. Otherwise, expect massive memory leaks.

krumms    126
In fact, there are lots of little memory leaks.

Don't forget glDeleteTextures before you return from that function ...

krumms    126
Ack ... still more problems with pnaxighn's code:

a) in the for loop, i = h should be i = h-1
b) in that same loop, i > 0 should be i >= 0

Here's his code, with my fixes:


int loadTexture ( char *filename ) {
    GLuint retval;
    SDL_Surface *sdlimage;
    void *raw;
    int w, h, i, j, bpp;
    Uint8 *srcPixel, *dstPixel;
    Uint32 truePixel;
    GLenum errorCode;

    sdlimage = IMG_Load( filename );

    if ( !sdlimage ) {
        printf("SDL_Image load error: %s\n", IMG_GetError());
        return 0;
    }
    if ( sdlimage->format->BytesPerPixel < 2 ) {
        printf("Bad image -- not true color!\n");
        SDL_FreeSurface( sdlimage );
        return 0;
    }

    w = sdlimage->w;
    h = sdlimage->h;

    raw = (void *)malloc( w * h * 4 );
    dstPixel = (Uint8 *)raw;

    SDL_LockSurface( sdlimage );

    bpp = sdlimage->format->BytesPerPixel;

    for ( i = h - 1 ; i >= 0 ; i-- ) {
        for ( j = 0 ; j < w ; j++ ) {
            srcPixel = (Uint8 *)sdlimage->pixels + i * sdlimage->pitch + j * bpp;
            switch ( bpp ) {
            case 1:
                truePixel = *srcPixel;
                break;

            case 2:
                truePixel = *(Uint16 *)srcPixel;
                break;

            case 3:
                if ( SDL_BYTEORDER == SDL_BIG_ENDIAN ) {
                    truePixel = srcPixel[0] << 16 | srcPixel[1] << 8 | srcPixel[2];
                } else {
                    truePixel = srcPixel[0] | srcPixel[1] << 8 | srcPixel[2] << 16;
                }
                break;

            case 4:
                truePixel = *(Uint32 *)srcPixel;
                break;

            default:
                printf("Image bpp of %d unusable\n", bpp);
                SDL_UnlockSurface( sdlimage );
                SDL_FreeSurface( sdlimage );
                free( raw );
                return 0;
            }

            SDL_GetRGBA( truePixel, sdlimage->format, &dstPixel[0], &dstPixel[1], &dstPixel[2], &dstPixel[3] );
            dstPixel += 4;
        }
    }

    SDL_UnlockSurface( sdlimage );
    SDL_FreeSurface( sdlimage );

    /* clear any stale GL errors */
    while ( glGetError() ) { ; }

    glGenTextures( 1, &retval );
    glBindTexture( GL_TEXTURE_2D, retval );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );

    errorCode = glGetError();
    if ( errorCode != 0 ) {
        if ( errorCode == GL_OUT_OF_MEMORY ) {
            printf("Out of texture memory!\n");
        }

        glDeleteTextures( 1, &retval );
        free( raw );
        return 0;
    }

    gluBuild2DMipmaps( GL_TEXTURE_2D, 4, w, h, GL_RGBA, GL_UNSIGNED_BYTE, (Uint8 *)raw );

    errorCode = glGetError();
    if ( errorCode != 0 ) {
        if ( errorCode == GL_OUT_OF_MEMORY ) {
            printf("Out of texture memory!\n");
        }

        glDeleteTextures( 1, &retval );
        free( raw );
        return 0;
    }

    /* gluBuild2DMipmaps has copied the data, so free it on success too */
    free( raw );

    return retval;
}


[edited by - krumms on April 18, 2004 11:00:10 PM]

krumms    126
In case anyone was wondering too, pnaxighn's code (with my humble fixes) IS an improvement on Luke's, because:

a) It works with image file formats other than Windows bitmaps.
b) It works with non-32-bit images (contrary to what one poster said; if your image is not 32-bit and/or not the same internal format as indicated in the glTexImage2D call, you can expect anything from backwards images to inverted/wrong colours to crashes as OpenGL goes searching through junky memory).
c) It includes alpha data in the texture.

All that said, there are still likely to be problems I've missed with pnaxighn's code (some of the assignments may leave parts of truePixel unset, though I don't know the spec well enough to say for certain), and Luke's code is MUCH SIMPLER.

If you're a newbie, use Windows .BMP files and Luke's code and you can't go wrong.

Guest Anonymous Poster
quote:
Original post by LunarCrisis
Does anyone know if either of those code samples supports an alpha channel of any kind? I ask because I want to have an image of a frog jump around, and I don't want a square patch of white to jump around with him :S

any help would be greatly appreciated!


BMPs don't support alpha channels
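One common workaround since BMPs carry no alpha: pick a key colour (e.g. the white around the frog sprite), expand the image to RGBA, and write the alpha yourself, then render with blending enabled (glEnable(GL_BLEND) with glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)). A sketch of the key-to-alpha pass in plain C (`colorkey_to_alpha` is a made-up name, not an SDL function; SDL's own SDL_SetColorKey is another route):

```c
#include <stddef.h>

/* In a packed RGBA buffer, set alpha to 0 for every pixel matching
 * the key colour (kr, kg, kb) and to 255 for everything else. */
void colorkey_to_alpha(unsigned char *rgba, size_t npixels,
                       unsigned char kr, unsigned char kg, unsigned char kb)
{
    size_t i;
    for (i = 0; i < npixels; i++) {
        unsigned char *p = rgba + i * 4;
        p[3] = (p[0] == kr && p[1] == kg && p[2] == kb) ? 0 : 255;
    }
}
```

Run this on the raw buffer before uploading it with GL_RGBA, and the keyed pixels come out fully transparent.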

C-Junkie    1099
quote:
Original post by siaspete
Why not just use SDL's own SDL_ConvertSurface to convert your surface to the OpenGL format?

No kidding!

ourSurface = TTF_RenderText_Solid(ourFont, "Yo diggity, dog!", ourColor);
w = pow(2, ceil(log(ourSurface->w) / log(2))); /* round up to the nearest power of two */
myNewSurface = SDL_CreateRGBSurface(0, w, w, 24, 0xff000000, 0x00ff0000, 0x0000ff00, 0);
SDL_BlitSurface(ourSurface, 0, myNewSurface, 0); /* blit onto a purely RGB surface */

glGenTextures( 1, &texture );
glBindTexture( GL_TEXTURE_2D, texture );
glTexImage2D( GL_TEXTURE_2D, 0, 3, w, w, 0, GL_RGB,
              GL_UNSIGNED_BYTE, myNewSurface->pixels );

This is a snippet of my source code from the other thread where I answered SDL->OpenGL questions.

If you don't need to resize to a power-of-two dimension, then SDL_ConvertSurface will work, whereas I use SDL_BlitSurface onto a new surface from SDL_CreateRGBSurface.
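As an aside, the pow/ceil/log round trip above can be replaced with plain integer shifts, which avoids any floating-point rounding surprises. A small sketch (`next_pow2` is an invented helper name):

```c
/* Round an unsigned value up to the nearest power of two.
 * next_pow2(640) is 1024; values already a power of two pass through. */
unsigned next_pow2(unsigned v)
{
    unsigned p = 1;
    while (p < v)
        p <<= 1;   /* double until we reach or pass v */
    return p;
}
```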
