Archived

This topic is now archived and is closed to further replies.

Seriema

SDL&OGL - how do I figure out glTexImage2D::format ?


Recommended Posts

Hi! I'm loading a texture into an SDL_Surface with SDL_LoadBMP, which works nicely. But this call flips out...
glTexImage2D(
    GL_TEXTURE_2D,                                    // target
    0,                                                // mipmap level
    pTexture->m_pSDLSurface->format->BytesPerPixel,   // internal format (1..4 in legacy GL)
    pTexture->m_pSDLSurface->w,                       // width
    pTexture->m_pSDLSurface->h,                       // height
    0,                                                // border
    GL_RGB,                                           // format... help! :)
    GL_UNSIGNED_BYTE,                                 // type of each channel
    pTexture->m_pSDLSurface->pixels                   // pixel data
);
I figured out how to get most of the info needed directly from the SDL surface. But how do I figure out the format? Do I make my own GLenum format and set it with a switch(BytesPerPixel) or something? :/ Not sure what has to be done... any help is appreciated! Thanks guys... (oh, and girls too! as if *heh*)

"No lies of sugar can sweeten the sourness of reality"

}+TITANIUM+{ A.K.A. DXnewbie[onMIRC]
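Something like that switch is one common approach: look at BytesPerPixel and the channel masks on the surface's SDL_PixelFormat and map them to a GLenum. A rough sketch follows; the helper name surface_gl_format is made up, and GL_BGR/GL_BGRA need OpenGL 1.2 or the EXT_bgra extension (older headers spell them GL_BGR_EXT/GL_BGRA_EXT).

#include "SDL.h"
#include <GL/gl.h>

/* Hypothetical helper: pick a glTexImage2D "format" from the surface's pixel
   layout. Only handles true-colour (24/32-bit) surfaces. */
GLenum surface_gl_format(const SDL_Surface *s)
{
    switch (s->format->BytesPerPixel) {
    case 4:
        /* red in the lowest byte -> RGBA order, otherwise assume BGRA */
        return (s->format->Rmask == 0x000000ff) ? GL_RGBA : GL_BGRA;
    case 3:
        return (s->format->Rmask == 0x000000ff) ? GL_RGB : GL_BGR;
    default:
        return 0;  /* 8-bit palettised or 16-bit packed: convert the surface first */
    }
}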

Good question. I assume SDL surfaces all have the same format.

NeHe's tuts work, so check which format they use (the SDL/OGL ones, of course).

I used gluBuild2DMipmaps instead of glTexImage2D, but maybe it can help you. This is my texture loading function:


void textureLoad(char *filename, GLuint textureArray[], int textureID)
{
    SDL_Surface *image;

    /* SDL_image loader: handles BMP, PNG, JPG, ... */
    image = IMG_Load(filename);
    if (image == NULL)
    {
        printf("Error while loading %s.\n", filename);
        Quit(0);
    }

    glGenTextures(1, &textureArray[textureID]);
    glBindTexture(GL_TEXTURE_2D, textureArray[textureID]);

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

    /* gluBuild2DMipmaps rescales to power-of-two sizes and builds the mipmap chain */
    gluBuild2DMipmaps(GL_TEXTURE_2D, 3, image->w, image->h,
                      GL_RGB, GL_UNSIGNED_BYTE, image->pixels);

    SDL_FreeSurface(image);
}
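In case the calling side isn't obvious, usage might look roughly like this (the filename and array size are just placeholders):

GLuint textures[1];

textureLoad("texture.bmp", textures, 0);    /* load once at start-up */

glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, textures[0]);  /* select it before drawing textured geometry */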



Guest Anonymous Poster
And you're sure that the surface has dimensions that are a power of 2?
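For reference, a quick way to check that (just a sketch, not from the thread):

/* True when n is a positive power of two (64, 128, 256, ...). */
int is_power_of_two(int n)
{
    return n > 0 && (n & (n - 1)) == 0;
}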

AP: my texture is 64x128.

Sork: you seem to say 3 BytesPerPixel and then use GL_RGB. My surface (pSDLSurface->format->BytesPerPixel) returns 1. And when I check the properties of the image (using Explorer) it says 8 bits per pixel, so that seems right...

But what format is that? :/

"No lies of sugar can sweeten the sournes of reality"

}+TITANIUM+{ A.K.A. DXnewbie[onMIRC]

Guest Anonymous Poster
Firstly, I'm not sure that OGL supports textures that aren't square.

1 byte per pixel would mean indexed mode, so either you need to have SDL unpack it to a true-color mode, use an image-editing proggy to change the bit depth, or figure out how to use OGL with a palette.
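One possible way to do that unpacking with SDL itself, as a sketch (the helper name to_rgb24 and the little-endian masks are assumptions, not from the thread):

#include <string.h>
#include "SDL.h"

/* Sketch: ask SDL to unpack an 8-bit palettised surface into packed 24-bit RGB.
   The masks put red in the lowest byte; returns NULL on failure. */
SDL_Surface *to_rgb24(SDL_Surface *src)
{
    SDL_PixelFormat fmt;

    memset(&fmt, 0, sizeof(fmt));
    fmt.BitsPerPixel  = 24;
    fmt.BytesPerPixel = 3;
    fmt.Rmask = 0x000000ff;
    fmt.Gmask = 0x0000ff00;
    fmt.Bmask = 0x00ff0000;

    return SDL_ConvertSurface(src, &fmt, SDL_SWSURFACE);
}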

I resaved the image as a 24-bit .bmp.

C-Junkie: the SDL versions of the NeHe tutorials use aux to load images and such, not SDL; they only used SDL for window creation, etc...

AP: OGL supports non-square images, and you can mipmap the image so it becomes power-of-2 (if it wasn't). I think? Well, either way... that's not the problem :/

The problem is figuring out what GLenum format I should use, and how I get that info from SDL or the SDL_Surface.
Anyone?

"No lies of sugar can sweeten the sournes of reality"

}+TITANIUM+{ A.K.A. DXnewbie[onMIRC]

Guest Anonymous Poster
Well, if you have it in 24 BPP RGB format it really ought to be GL_RGB.

Yes... but then I have to resave everything to 24 bits, and my loading function isn't very flexible...

I don't know if my question is clear, must be my stupid English.

I have this SDL_Surface I get from SDL_LoadBMP.
How do I call glTexImage2D() with the correct parameters based on that SDL_Surface?

thank you

"No lies of sugar can sweeten the sournes of reality"

}+TITANIUM+{ A.K.A. DXnewbie[onMIRC]

You can use this nifty function to reformat the 8-bit surface to whatever your screen is currently set at:

SDL_Surface *SDL_DisplayFormat( SDL_Surface * );

it reformats the image to the same depth as the screen.

24 == GL_RGB
32 == GL_RGBA
16 == no idea

Hope this helps a tad.

edit: that is, by reformats, I mean it returns a new surface that is at the same depth as the screen.

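Putting the two halves of that together, a rough sketch might look like the following (upload_texture is a made-up name; whether the converted surface ends up RGBA or BGRA depends on the display, so check Rmask if the colours come out swapped):

#include "SDL.h"
#include <GL/gl.h>

/* Sketch: normalise the surface to the display's depth (the video mode must
   already be set), then pick the upload format from BytesPerPixel. */
static void upload_texture(SDL_Surface *loaded)
{
    GLenum fmt = 0;
    SDL_Surface *converted = SDL_DisplayFormat(loaded);  /* returns a NEW surface */

    if (converted == NULL)
        return;

    switch (converted->format->BytesPerPixel) {
    case 3: fmt = GL_RGB;  break;
    case 4: fmt = GL_RGBA; break;  /* could really be BGRA; check Rmask if in doubt */
    }

    if (fmt != 0) {
        glTexImage2D(GL_TEXTURE_2D, 0, converted->format->BytesPerPixel,
                     converted->w, converted->h, 0,
                     fmt, GL_UNSIGNED_BYTE, converted->pixels);
    }

    SDL_FreeSurface(converted);
}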
