
Seriema

SDL&OGL - how do I figure out glTexImage2D::format ?


Seriema    634
Hi! I'm loading a texture with SDL_Surface and SDL_LoadBMP, and that works fine. But this call is giving me trouble...
glTexImage2D(
    GL_TEXTURE_2D, 
    0,
    pTexture->m_pSDLSurface->format->BytesPerPixel,
    pTexture->m_pSDLSurface->w, 
    pTexture->m_pSDLSurface->h,
    0,
    GL_RGB,  // format... help! :)
    GL_UNSIGNED_BYTE,
    pTexture->m_pSDLSurface->pixels
);
I figured out how to get most of the info needed directly from the SDL surface. But how do I figure out the format? Make my own GLenum format and set it with a switch(BytesPerPixel) or something? :/ Not sure what has to be done... any help is appreciated! Thanks guys... (oh, and girls too! as if *heh*) "No lies of sugar can sweeten the sournes of reality" }+TITANIUM+{ A.K.A. DXnewbie[onMIRC]
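Something along these lines is presumably what the switch idea would look like (a sketch only; the helper name is invented, GL_BGR/GL_BGRA need OpenGL 1.2 or the EXT_bgra extension, where older headers call them GL_BGR_EXT/GL_BGRA_EXT, and the mask test assumes a little-endian machine):

GLenum formatFromSurface(const SDL_Surface *surface)
{
    /* Pick the glTexImage2D format from the surface's pixel layout.
       The channel masks reveal whether bytes are stored R,G,B or B,G,R. */
    switch (surface->format->BytesPerPixel)
    {
    case 3:
        return (surface->format->Rmask == 0x000000FF) ? GL_RGB : GL_BGR;
    case 4:
        return (surface->format->Rmask == 0x000000FF) ? GL_RGBA : GL_BGRA;
    default:
        return 0;  /* 8-bit (palettized) or 16-bit surfaces need converting first */
    }
}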

C-Junkie    1099
Good question. I assume SDL surfaces all have the same format.

NeHe's tutorials work, so check which format they use (the SDL/OpenGL ones, of course).

Sork    162
I used gluBuild2DMipmaps instead of glTexImage2D, but maybe it can help you. This is my texture loading function:


void textureLoad(char *filename, GLuint textureArray[], int textureID)
{
    SDL_Surface *image;

    image = IMG_Load(filename);

    if (image == NULL)
    {
        printf("Error while loading %s.\n", filename);
        Quit(0);
    }

    glGenTextures(1, &textureArray[textureID]);
    glBindTexture(GL_TEXTURE_2D, textureArray[textureID]);

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

    gluBuild2DMipmaps(GL_TEXTURE_2D, 3, image->w, image->h, GL_RGB, GL_UNSIGNED_BYTE, image->pixels);

    SDL_FreeSurface(image);
}


[edited by - Sork on August 10, 2003 11:46:47 PM]

Guest Anonymous Poster
And you're sure that the surface has dimensions that are a power of 2?

Seriema    634
AP: my texture is 64x128.

Sork: you seem to use 3 bytes per pixel and then GL_RGB. My surface (pSDLSurface->format->BytesPerPixel) returns 1. And when I check the properties of the image (in Explorer) it says 8 bits per pixel, so that seems right...

But what format is that? :/

"No lies of sugar can sweeten the sournes of reality"

}+TITANIUM+{ A.K.A. DXnewbie[onMIRC]

Guest Anonymous Poster
Firstly, I'm not sure that OpenGL supports textures that aren't square.

1 byte per pixel would mean indexed mode, so either you need to have SDL unpack it to a true-color mode, use an image editing program to change the bit depth, or figure out how to use OpenGL with a palette.
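One way to do that unpacking with SDL itself (a sketch against the SDL 1.2 API; the helper name is invented): create a 24-bit RGB surface and blit the paletted image onto it, and the blit expands the palette.

SDL_Surface *toRGB24(SDL_Surface *indexed)
{
    /* Masks chosen so the bytes land in memory as R,G,B for a GL_RGB upload. */
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
    Uint32 rmask = 0xFF0000, gmask = 0x00FF00, bmask = 0x0000FF;
#else
    Uint32 rmask = 0x0000FF, gmask = 0x00FF00, bmask = 0xFF0000;
#endif
    SDL_Surface *rgb = SDL_CreateRGBSurface(SDL_SWSURFACE,
                                            indexed->w, indexed->h, 24,
                                            rmask, gmask, bmask, 0);
    if (rgb != NULL)
        SDL_BlitSurface(indexed, NULL, rgb, NULL);  /* converts through the palette */
    return rgb;  /* caller frees both surfaces eventually */
}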

Seriema    634
I resaved the image as a 24-bit .bmp.

C-Junkie: the SDL versions of the NeHe tutorials use aux to load images and such, not SDL; they only use SDL for window creation etc...

AP: OpenGL supports non-square images, and you can mipmap the image so it becomes a power of 2 (if it wasn't). I think? Well, either way... that's not the problem :/

The problem is figuring out what GLenum format I should use, and how I get that info from SDL or SDL_Surface.
Anyone?

"No lies of sugar can sweeten the sournes of reality"

}+TITANIUM+{ A.K.A. DXnewbie[onMIRC]

Guest Anonymous Poster
Well, if you have it in 24 BPP RGB format it really ought to be GL_RGB.

Seriema    634
Yes... but then I'd have to resave everything to 24 bits, and my loading function isn't very flexible...

I don't know if my question is clear, must be my bad English.

I have this SDL_Surface I get from SDL_LoadBMP.
How do I call glTexImage2D() with the correct parameters based on that SDL_Surface?

thank you

"No lies of sugar can sweeten the sournes of reality"

}+TITANIUM+{ A.K.A. DXnewbie[onMIRC]
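Putting the earlier switch idea together, the upload would presumably end up looking something like this (a sketch only; uploadSurface and formatFromSurface are invented names, and the texture still has to be generated, bound, and given filters first, as in Sork's function):

void uploadSurface(SDL_Surface *surface)
{
    /* Assumes the surface is already 24- or 32-bit; convert 8-bit images first. */
    GLenum format = formatFromSurface(surface);

    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);  /* SDL rows aren't always 4-byte aligned */
    glTexImage2D(GL_TEXTURE_2D,
                 0,                               /* mipmap level */
                 surface->format->BytesPerPixel,  /* 3 or 4 components */
                 surface->w,
                 surface->h,
                 0,                               /* border */
                 format,                          /* GL_RGB / GL_BGR / GL_RGBA / GL_BGRA */
                 GL_UNSIGNED_BYTE,
                 surface->pixels);
    /* If surface->pitch has padding beyond w * BytesPerPixel, GL_UNPACK_ROW_LENGTH
       would also need to be set before the call. */
}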

MaulingMonkey    1730
You can use this nifty function to reformat the 8-bit image to whatever your screen is currently set to:

SDL_Surface *SDL_DisplayFormat(SDL_Surface *surface);

It reformats the image to the same depth as the screen.

24 == GL_RGB
32 == GL_RGBA
16 == no idea

Hope this helps a tad.

edit: that is, by "reformats" I mean it returns a new surface that is at the same depth as the screen.

[edited by - MaulingMonkey on August 12, 2003 6:26:49 AM]
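A minimal usage sketch, assuming a video mode has already been set with SDL_SetVideoMode() (the call returns a new surface, so the old one still has to be freed; 'image' stands for whatever SDL_LoadBMP or IMG_Load returned):

SDL_Surface *converted = SDL_DisplayFormat(image);
if (converted != NULL)
{
    SDL_FreeSurface(image);
    image = converted;
}
/* image->format->BitsPerPixel now matches the display depth (16/24/32);
   the channel masks still decide GL_RGB vs GL_BGR and so on. */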

Seriema    634
Wow, cool function! Gonna try it out.

About the format, my guess is that I could switch() it? If BytesPerPixel == 3 I'll go with GL_RGB (and hope that nobody uses "swapped" .bmp's), and so on?..

"No lies of sugar can sweeten the sournes of reality"

}+TITANIUM+{ A.K.A. DXnewbie[onMIRC]

genne    122
Uhm, just one thing. Doesn't SDL describe how the pixel format works pretty well?

http://sdldoc.csn.ul.ie/sdlpixelformat.php

MaulingMonkey    1730
quote:
Original post by Seriema
Wow, cool function! Gonna try it out.

About the format, my guess is that I could switch() it? If BytesPerPixel == 3 I'll go with GL_RGB (and hope that nobody uses "swapped" .bmp's), and so on?..

"No lies of sugar can sweeten the sournes of reality"

}+TITANIUM+{ A.K.A. DXnewbie[onMIRC]


Indeed. It was originally made to preprocess images so that blitting would be faster (so it didn't need to convert on every blit, only once ahead of time).

I've had endian problems with this on my Linux box, getting the red and blue values mixed up, which I had to swap manually...

You could use information from the link genne gave to reformat 16-bit images to a higher depth... but for the most part I don't see why he posted that link...
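For reference, the manual red/blue swap would look roughly like this on a 24-bit surface (a sketch only, not MaulingMonkey's actual code; the helper name is invented):

void swapRedBlue24(SDL_Surface *s)
{
    int x, y;

    if (s->format->BytesPerPixel != 3)
        return;
    if (SDL_MUSTLOCK(s))
        SDL_LockSurface(s);
    for (y = 0; y < s->h; ++y)
    {
        Uint8 *row = (Uint8 *)s->pixels + y * s->pitch;
        for (x = 0; x < s->w; ++x)
        {
            /* swap bytes 0 and 2 of each 3-byte pixel */
            Uint8 tmp      = row[x * 3 + 0];
            row[x * 3 + 0] = row[x * 3 + 2];
            row[x * 3 + 2] = tmp;
        }
    }
    if (SDL_MUSTLOCK(s))
        SDL_UnlockSurface(s);
}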
