OpenGL C++ Texture Loading Troubles


As the title suggests, I'm having a hard time loading textures with OpenGL, specifically on Windows (XP). I'm basically using thecplusplusguy (youtube.com/thecplusplusguy)'s code for loading textures, found [url="http://pastebin.com/HCeRQPjS"]here[/url]. It compiles and runs fine in Ubuntu Linux, as well as Linux Mint, but when I tried to compile it on Windows, I got this error: "GL_UNSIGNED_INT_8_8_8_8 was not declared in this scope", so I put "#define GL_UNSIGNED_INT_8_8_8_8 0x8035" at the top of my file. That compiled fine, but when I ran my program, the spot where my texture should be on my plane (GL_QUADS) was just completely white. So I tried to #include <GL/glext.h> instead, but that had exactly the same effect: just a white plane. I tried removing the .jpg image (I also tried the .bmp format) from the folder my program was in, and as expected, the program crashed, which I guess is a good thing. I'm sorry if I'm using bad terminology or anything, as I am a C++ and OpenGL noob. Any help would be awesome, as I am completely stuck and on a bit of a deadline (this is a project for school). Thanks in advance,

Peter

Out of curiosity, have you tried using GL_UNSIGNED_BYTE instead? My first guess would be that it will work on Windows with that value. I believe GL_UNSIGNED_INT_8_8_8_8 means each pixel is stored as a packed 32-bit integer in logical order, and since you specified GL_RGBA as the format, that would make red the high-order byte. That wouldn't match the memory layout on a little-endian system, where alpha ends up in the high-order byte of each 32-bit value. By using GL_UNSIGNED_BYTE it should upload the RGBA data properly, one byte per channel, without regard to the endianness of the system.
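For reference, the change I'm suggesting is only in the pixel-type argument of the upload call. A minimal sketch (the uploadSurface wrapper is just for illustration; img2 is the converted surface from the loader you linked, so treat this as a sketch rather than a drop-in fix):

[CODE]
#include <SDL/SDL.h>
#include <GL/gl.h>

// upload an already-converted 32-bit RGBA surface; GL_UNSIGNED_BYTE makes GL
// read each channel as a separate byte, so the machine's endianness doesn't matter
void uploadSurface(SDL_Surface* img2)
{
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, img2->w, img2->h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, img2->pixels);
}
[/CODE]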

Although on that note, you also have me confused as to why this works on Ubuntu. My total guess would be that GL_UNSIGNED_INT_8_8_8_8 ends up behaving like GL_UNSIGNED_BYTE on that platform / toolchain. Edited by Saruman

Yeah, GL_UNSIGNED_BYTE isn't working either. I heard somewhere that Windows only supports OpenGL 1.0, whereas on Linux it's a higher version? I don't know if that's true, or if it's even related to my problem. I'll post the code for my texture-testing program, because there's a good chance I'm doing something else wrong.

[CODE]
#include <SDL/SDL.h>
#include <SDL/SDL_image.h>
#include <GL/gl.h>
#include <GL/glu.h>
//#include <GL/glext.h>
//#define GL_UNSIGNED_INT_8_8_8_8 0x8035

unsigned int loadTexture(const char* name)
{
    // load the image file with SDL_image
    SDL_Surface* img = IMG_Load(name);

    // describe a 32-bit RGBA layout and convert the loaded image into it
    SDL_PixelFormat form = {NULL,32,4,0,0,0,0,8,8,8,8,0xff000000,0x00ff0000,0x0000ff00,0x000000ff,0,255};
    SDL_Surface* img2 = SDL_ConvertSurface(img, &form, SDL_SWSURFACE);

    // create a texture object and upload the converted pixels to it
    unsigned int texture;
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, img2->w, img2->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, img2->pixels);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

    SDL_FreeSurface(img);
    SDL_FreeSurface(img2);
    return texture;
}

unsigned int tex = loadTexture("brickFace.jpg");

void init()
{
    glClearColor(0.0, 0.0, 0.0, 1.0);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(45, 800.0/600.0, 1.0, 400.0);
    glMatrixMode(GL_MODELVIEW);
    glEnable(GL_DEPTH_TEST);
    glEnable(GL_TEXTURE_2D);
}

void render()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
    glTranslatef(0.0, 0.0, -20.0);

    // draw a textured quad facing the camera
    glBindTexture(GL_TEXTURE_2D, tex);
    glBegin(GL_QUADS);
        glTexCoord2f(0.0, 0.0); glVertex3f(-2.0, -2.0, 0.0);
        glTexCoord2f(1.0, 0.0); glVertex3f( 2.0, -2.0, 0.0);
        glTexCoord2f(1.0, 1.0); glVertex3f( 2.0,  2.0, 0.0);
        glTexCoord2f(0.0, 1.0); glVertex3f(-2.0,  2.0, 0.0);
    glEnd();
}

int main(int argc, char** argv)
{
    SDL_Init(SDL_INIT_EVERYTHING);
    SDL_Surface* screen;
    screen = SDL_SetVideoMode(800, 600, 32, SDL_SWSURFACE | SDL_OPENGL);

    bool runLoop = true;
    const int fps = 24;
    Uint32 start;
    SDL_Event event;
    init();

    while(runLoop)
    {
        start = SDL_GetTicks();
        while(SDL_PollEvent(&event))
        {
            switch(event.type)
            {
                case SDL_QUIT:
                    runLoop = false;
                    break;
            }
        }
        render();
        SDL_GL_SwapBuffers();

        // cap the frame rate at roughly 24 fps
        if(1000/fps > SDL_GetTicks()-start){
            SDL_Delay(1000/fps - (SDL_GetTicks()-start));
        }
    }
}
[/CODE] Edited by peterlake

I don't have time to read through your code, but one mistake I see right away is that you are loading a jpg file and specifying your format as GL_RGBA, even though jpg does not carry an alpha channel. You need to specify the proper internal format for the data you are giving to glTexImage2D, and in this case you would need to pad your data buffer with an alpha channel... or, an easier route, don't use jpg at all (you would never use it for a texture anyway) and use something like TGA or PNG. Edited by joew
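Roughly what I mean, as a sketch rather than a fix (the helper name and the bytes-per-pixel check are just for illustration, assuming SDL_image hands back a 3-byte-per-pixel surface for a jpg):

[CODE]
#include <SDL/SDL_image.h>
#include <GL/gl.h>

// pick the format that matches what the surface actually holds:
// 4 bytes per pixel -> RGBA, 3 bytes per pixel (typical for a jpg) -> RGB
void uploadMatchingFormat(SDL_Surface* img)
{
    GLenum format = (img->format->BytesPerPixel == 4) ? GL_RGBA : GL_RGB;
    glTexImage2D(GL_TEXTURE_2D, 0, format, img->w, img->h, 0,
                 format, GL_UNSIGNED_BYTE, img->pixels);
}
[/CODE]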

That doesn't seem to be working either :( I've heard that graphics cards don't like textures whose widths and heights aren't powers of two, so I made sure all of my textures were 256x256. I figured that might be worth mentioning.
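For what it's worth, a dimension can be checked for being a power of two with a small standalone snippet like this (not part of the program above, just a sanity check):

[CODE]
// true when w is a non-zero power of two (256, 512, ...);
// a power of two has exactly one bit set, so w & (w - 1) clears it to zero
bool isPowerOfTwo(int w)
{
    return w > 0 && (w & (w - 1)) == 0;
}
[/CODE]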
