SDL&OGL - how do I figure out glTexImage2D::format ?

Hi!

I'm loading a texture with SDL_Surface and SDL_LoadBMP, which works nicely. But this call flips out...

glTexImage2D(
    GL_TEXTURE_2D,
    0,
    pTexture->m_pSDLSurface->format->BytesPerPixel,
    pTexture->m_pSDLSurface->w,
    pTexture->m_pSDLSurface->h,
    0,
    GL_RGB, // format... help! :)
    GL_UNSIGNED_BYTE,
    pTexture->m_pSDLSurface->pixels
);

I figured out how to get most of the info I need directly from the SDL surface. But how do I figure out the format? Make my own GLenum format and set it with a switch(BytesPerPixel) or something? :/ Not sure what has to be done...

Any help is appreciated!

Thanks guys... (oh, and girls too! as if *heh*)

"No lies of sugar can sweeten the sournes of reality"
}+TITANIUM+{ A.K.A. DXnewbie[onMIRC]
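For reference, here is one way the format could be derived from the surface, sketched as a switch on the byte depth plus a check of the red channel mask (SDL surfaces report their channel layout through `Rmask`/`Gmask`/`Bmask`). The GLenum values are inlined from gl.h so the sketch compiles stand-alone; `pickGLFormat` is a hypothetical helper name, not an SDL or GL function. Note also that in legacy GL an internalformat of 3 or 4 (a component count) is accepted, but passing BytesPerPixel there breaks down for 1-byte indexed surfaces.

```cpp
// Pixel-format enum values copied from gl.h so this sketch stands alone.
enum : unsigned {
    GL_RGB  = 0x1907,
    GL_RGBA = 0x1908,
    GL_BGR  = 0x80E0,
    GL_BGRA = 0x80E1,
};

// Pick a pixel-transfer format for glTexImage2D from the surface's byte
// depth and its red-channel mask. A red mask of 0x000000FF means the red
// byte comes first in memory (RGB order); otherwise assume BGR order.
// Returns 0 for depths this sketch doesn't handle (e.g. palettized 8-bit).
unsigned pickGLFormat(int bytesPerPixel, unsigned rmask)
{
    switch (bytesPerPixel) {
    case 3:  return (rmask == 0x000000FF) ? GL_RGB  : GL_BGR;
    case 4:  return (rmask == 0x000000FF) ? GL_RGBA : GL_BGRA;
    default: return 0; // 1-byte (indexed) and 16-bit need conversion first
    }
}
```

With an SDL surface in hand, the call would then look roughly like `pickGLFormat(surf->format->BytesPerPixel, surf->format->Rmask)` for the format argument.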
Good question. I assume SDL surfaces all have the same format.
NeHe's tuts work, so check which format they use (the SDL/OGL ones, of course).
I used gluBuild2DMipmaps instead of glTexImage2D, but maybe it can help you. This is my texture loading function:

void textureLoad(char *filename, GLuint textureArray[], int textureID)
{
    SDL_Surface *image;

    image = IMG_Load(filename);
    if (image == NULL) {
        printf("Error while loading %s.\n", filename);
        Quit(0);
    }

    glGenTextures(1, &textureArray[textureID]);
    glBindTexture(GL_TEXTURE_2D, textureArray[textureID]);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    gluBuild2DMipmaps(GL_TEXTURE_2D, 3, image->w, image->h,
                      GL_RGB, GL_UNSIGNED_BYTE, image->pixels);

    SDL_FreeSurface(image);
}

[edited by - Sork on August 10, 2003 11:46:47 PM]
AP: my texture is 64x128.

Sork: you seem to use 3 for BytesPerPixel and then GL_RGB. My surface (pSDLSurface->format->BytesPerPixel) returns 1, and when I check the properties of the image (using Explorer) it says 8 bits per pixel, so that seems right...
But what format is that? :/

"No lies of sugar can sweeten the sournes of reality"
}+TITANIUM+{ A.K.A. DXnewbie[onMIRC]
Firstly, I'm not sure that OGL supports textures that aren't square.
1 byte per pixel would mean indexed mode, so either you need to have SDL unpack it to a true-color mode, use an image editing program to change the bit depth, or figure out how to use OGL with a palette.
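To make the "unpack it to a true-color mode" route concrete: an 8-bit surface stores one palette index per pixel, and the palette holds the actual colors. Here is the expansion to tightly packed 24-bit RGB, sketched without SDL for clarity (the `Color` struct stands in for SDL_Color's r/g/b fields, and `unpackIndexed` is a hypothetical helper). With SDL 1.2 itself, SDL_ConvertSurface or SDL_DisplayFormat can do this conversion for you.

```cpp
#include <vector>

struct Color { unsigned char r, g, b; }; // stands in for SDL_Color

// Expand an 8-bit indexed image to a tightly packed 24-bit RGB buffer,
// which glTexImage2D can then upload with GL_RGB / GL_UNSIGNED_BYTE.
std::vector<unsigned char> unpackIndexed(const unsigned char *indices,
                                         int w, int h,
                                         const Color *palette)
{
    std::vector<unsigned char> rgb;
    rgb.reserve(static_cast<std::size_t>(w) * h * 3);
    for (int i = 0; i < w * h; ++i) {
        const Color &c = palette[indices[i]]; // look up the pixel's color
        rgb.push_back(c.r);
        rgb.push_back(c.g);
        rgb.push_back(c.b);
    }
    return rgb;
}
```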
I resaved the image as a 24-bit .bmp.

C-Junkie: the SDL versions of the NeHe tutorials use aux to load images and such, not SDL; they only used SDL for window creation etc...

AP: OGL supports non-square images, and you can mipmap the image so it becomes powers of 2 (if it wasn't). I think? Well, either way... that's not the problem :/
The problem is figuring out what GLenum format I should use, and how I get that info from SDL or SDL_Surface.

Anyone?

"No lies of sugar can sweeten the sournes of reality"
}+TITANIUM+{ A.K.A. DXnewbie[onMIRC]
Yes... but then I have to resave everything to 24 bits, and my loading function isn't very flexible...

I don't know if my question is clear; it must be my poor English.
I have this SDL_Surface I get from SDL_LoadBMP.
How do I call glTexImage2D() with the correct parameters based on that SDL_Surface?

Thank you.

"No lies of sugar can sweeten the sournes of reality"
}+TITANIUM+{ A.K.A. DXnewbie[onMIRC]
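Besides the format enum, one more parameter is worth deriving from the surface: SDL's `pitch` field, the byte stride between rows, which may include padding beyond `w * BytesPerPixel`. glTexImage2D by default assumes each row starts on a 4-byte boundary (GL_UNPACK_ALIGNMENT of 4), so a tightly packed 24-bit image whose row size isn't a multiple of 4 uploads skewed. A small check, sketched with a hypothetical helper name:

```cpp
// True when the surface's rows are tightly packed, i.e. the byte stride
// between rows (SDL's pitch) equals width * bytes-per-pixel, no padding.
bool rowsTightlyPacked(int pitch, int w, int bytesPerPixel)
{
    return pitch == w * bytesPerPixel;
}

// If rows are tight but not a multiple of 4 bytes, tell GL before upload:
//     glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
// If rows ARE padded, glPixelStorei(GL_UNPACK_ROW_LENGTH,
//     pitch / bytesPerPixel) can describe the real stride instead.
```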
You can use this nifty function to reformat the 8-bit surface to whatever your screen is currently set at:

SDL_Surface *SDL_DisplayFormat( SDL_Surface * );

It reformats the image to the same depth as the screen:
24 == GL_RGB
32 == GL_RGBA
16 == no idea

Hope this helps a tad.

edit: that is, by "reformats" I mean it returns a new surface that is at the same depth as the screen.

[edited by - MaulingMonkey on August 12, 2003 6:26:49 AM]
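On the "16 == no idea" point: with OpenGL 1.2's packed pixel types, a 16-bit 565 surface can be uploaded directly as GL_RGB with type GL_UNSIGNED_SHORT_5_6_5. A sketch extending the depth-to-parameters mapping; the enum values are inlined from gl.h so it stands alone, the 16-bit case assumes a 565 channel layout (verify against the surface's masks before relying on it), and `uploadParamsFor` is a hypothetical helper name.

```cpp
// Constants from gl.h, inlined so the sketch stands alone.
enum : unsigned {
    GL_RGB                  = 0x1907,
    GL_RGBA                 = 0x1908,
    GL_UNSIGNED_BYTE        = 0x1401,
    GL_UNSIGNED_SHORT_5_6_5 = 0x8363, // packed pixels, OpenGL 1.2+
};

struct GLUpload { unsigned format, type; };

// Map a surface's bits-per-pixel to a (format, type) pair for glTexImage2D.
// 16-bit assumes an RGB 565 mask layout; {0, 0} means "convert first".
GLUpload uploadParamsFor(int bitsPerPixel)
{
    switch (bitsPerPixel) {
    case 16: return { GL_RGB,  GL_UNSIGNED_SHORT_5_6_5 };
    case 24: return { GL_RGB,  GL_UNSIGNED_BYTE };
    case 32: return { GL_RGBA, GL_UNSIGNED_BYTE };
    default: return { 0, 0 }; // e.g. 8-bit indexed: unpack/convert first
    }
}
```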