silverphyre673

Getting danged textures to display in SDL!

Using SDL with OpenGL is (IMO) hands down the way to go. It's not hard, and there is a ton of sample code out there (shameless plug in 3, 2, 1: see my project, photon). You draw the texture on a quad and can then rotate it with almost no overhead. You don't even have to heavily optimize the OpenGL code, because the speed boost over straight DirectDraw is so large that even fairly inefficient code (by high-end 3D OpenGL standards) still comes out ahead simply because it's hardware accelerated.
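For what it's worth, the "rotate it with almost no overhead" part is just the fixed-function matrix stack. A minimal sketch (the function and parameter names are my own; it assumes an OpenGL context exists, GL_TEXTURE_2D is enabled, and tex_id was uploaded earlier):

```cpp
#include <SDL/SDL_opengl.h>

// Sketch: draw a w-by-h textured quad centred at (cx, cy), rotated by
// `angle` degrees about the screen-facing axis. All names are illustrative.
void DrawRotatedSprite(GLuint tex_id, float cx, float cy,
                       float w, float h, float angle)
{
    glBindTexture(GL_TEXTURE_2D, tex_id);

    glPushMatrix();                         // keep the transform local to this sprite
    glTranslatef(cx, cy, 0.0f);             // move the origin to the sprite centre
    glRotatef(angle, 0.0f, 0.0f, 1.0f);     // spin around the z axis

    glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex2f(-w / 2, -h / 2);
    glTexCoord2f(1.0f, 0.0f); glVertex2f( w / 2, -h / 2);
    glTexCoord2f(1.0f, 1.0f); glVertex2f( w / 2,  h / 2);
    glTexCoord2f(0.0f, 1.0f); glVertex2f(-w / 2,  h / 2);
    glEnd();

    glPopMatrix();                          // undo translate + rotate
}
```

The push/pop pair is what makes the rotation essentially free per sprite: the driver multiplies the matrix once, and no pixels are touched on the CPU.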

That's what I thought; I'm really more concerned with how much code it takes to set up, and how far removed it is from a simple BlitSurface. What do you think? I really don't want to ugly up my code! Thanks.

//EDIT

Uh oh... I spent a couple of hours working on a GUI for my game a few days ago, and it WAS working great. Now it isn't rendering. For the text I was using SDL_ttf, and I was rendering the windowing system with SDL_BlitSurface and SDL_FillRect... now nothing is showing up. Am I out of luck and stuck finding a different way to render everything, or is there some way to keep using that rendering code where I need it (orthographically) and use OpenGL rendering for the game objects? Thanks.
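On mixing the two: once the window is created with SDL_OPENGL, SDL_BlitSurface and SDL_FillRect no longer reach the screen, so the GUI has to be drawn through OpenGL too. The usual pattern is to draw the 3D scene, then switch the projection to orthographic and draw GUI quads in pixel coordinates. A rough sketch, with function names of my own invention, assuming the fixed-function pipeline:

```cpp
#include <SDL/SDL_opengl.h>

// Sketch: enter a pixel-coordinate orthographic projection for GUI drawing,
// saving whatever 3D projection was active. screen_w/screen_h are assumed
// to be the window dimensions.
void BeginOrtho(int screen_w, int screen_h)
{
    glMatrixMode(GL_PROJECTION);
    glPushMatrix();                             // save the 3D projection
    glLoadIdentity();
    glOrtho(0, screen_w, screen_h, 0, -1, 1);   // y grows downward, like SDL
    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();
    glLoadIdentity();
}

// Restore the saved 3D matrices after the GUI is drawn.
void EndOrtho()
{
    glMatrixMode(GL_MODELVIEW);
    glPopMatrix();
    glMatrixMode(GL_PROJECTION);
    glPopMatrix();
    glMatrixMode(GL_MODELVIEW);
}
```

The SDL_ttf surfaces themselves would then be uploaded as textures (the same way as any other image) and drawn as quads between BeginOrtho and EndOrtho, between the 3D rendering and SDL_GL_SwapBuffers.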

[Edited by - silverphyre673 on August 17, 2005 1:28:06 AM]

Actually, I got it set up... I'm changing the title of the thread. I'm having trouble getting my textures to display. Here is the relevant code:


bool Render()
{
    // Clear the screen
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // Reset the modelview matrix
    glLoadIdentity();

    // Every game object will be rendered from here
    glTranslatef(0, 0, -50);

    RenderSurface(&texture, 0, 0, 0.0f); // See below...

    // Finished 3D rendering... on to the GUI in ortho mode

    SDL_GL_SwapBuffers();
    return true;
}

void RenderSurface(Texture * tx, int x, int y, float angle)
{
    // NOTE: angle is accepted but never applied to the quad; a glRotatef
    // call (inside a glPushMatrix/glPopMatrix pair) would be needed to use it.

    if (!tx->built)
        return;

    glBindTexture(GL_TEXTURE_2D, tx->texture_id);

    glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex3f((float)x * scale, (float)y * scale, 0.0f); // y was missing * scale
    glTexCoord2f(1.0f, 0.0f); glVertex3f(((float)x + tx->w) * scale, (float)y * scale, 0.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex3f(((float)x + tx->w) * scale, ((float)y + tx->h) * scale, 0.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex3f((float)x * scale, ((float)y + tx->h) * scale, 0.0f);
    glEnd();
}


struct Texture
{
    Texture() : texture(0), w(0), h(0), texture_id(0), built(false) {} // built was left uninitialized
    Texture(const char * fn);
    ~Texture() { SDL_FreeSurface(texture); }

    SDL_Surface * texture;
    int w, h;
    GLuint texture_id;
    bool built;
};

Texture::Texture(const char * fn)
    : texture(0), w(0), h(0), texture_id(0), built(false)
{
    texture = IMG_Load(fn);
    if (!texture)
    {
        fprintf(stderr, "Failed to load texture: %s\n", fn);
        return;
    }

    w = texture->w;
    h = texture->h;

    // NOTE: everything below requires a current OpenGL context; if this
    // constructor runs before the context exists, the upload silently fails.
    glGenTextures(1, &texture_id);
    glBindTexture(GL_TEXTURE_2D, texture_id);

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR); // scale linearly when the image is bigger than the texture
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); // scale linearly when the image is smaller than the texture

    // GL_RGB as the internal format replaces the legacy value 3. The source
    // format GL_RGB assumes the surface is tightly packed 24-bit RGB; other
    // surface formats would need converting first.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, w, h, 0, GL_RGB, GL_UNSIGNED_BYTE, texture->pixels);
    built = true;
}




Now, what happens is that a white square gets rendered in exactly the right spot, the same size as the texture it is supposed to be showing. I have three ways of making sure the texture loaded, so I think the problem is something else... Can you spot why the texture isn't being rendered? I did enable GL_TEXTURE_2D before doing any rendering... Thanks a bunch.
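Aside from the context issue discussed below in the thread, one classic cause of exactly this white-square symptom: OpenGL before 2.0 (without the ARB_texture_non_power_of_two extension) requires texture dimensions to be powers of two, and a rejected glTexImage2D upload leaves the quad untextured white. A quick check worth running on each loaded surface (a sketch; the helper name is my own):

```cpp
// True if n is a positive power of two (1, 2, 4, 8, ...). Pre-2.0 OpenGL,
// absent ARB_texture_non_power_of_two, rejects texture dimensions that fail
// this test, and the failed upload typically shows up as a white quad.
bool IsPowerOfTwo(int n)
{
    return n > 0 && (n & (n - 1)) == 0;
}
```

Usage would be something like checking `IsPowerOfTwo(texture->w) && IsPowerOfTwo(texture->h)` right after IMG_Load and logging a warning if it fails; a 100x60 image, say, would need padding or resizing to 128x64 before upload on old hardware.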

Quote:
Original post by Enigma
Do you have a valid OpenGL context at the point where you generate your texture?

Enigma


Sorry... I'm not very experienced with OpenGL, and I'm trying to figure out what an OpenGL context is. I googled it, and it looks like something to do with Win32-specific programming, an HDC of some sort. Can you explain what you mean, please? Thanks.

The OpenGL context is basically an encapsulation of OpenGL's state. Since OpenGL is a state machine, it has to store that state somewhere so that multiple OpenGL programs can run at once; otherwise they would all be messing with each other's state. Because of this, any time you perform a stateful action in OpenGL (e.g. uploading a texture) it acts on the current context, and if there is no current context it fails, because there is nowhere to store the state. Under SDL the OpenGL context is created as part of the SDL_SetVideoMode call if you specify SDL_OPENGL as one of the flags, so make sure you've called that before you try to create your texture.
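In SDL 1.2 terms, the safe ordering looks roughly like this (a sketch with error handling trimmed; the window dimensions are arbitrary):

```cpp
#include <SDL/SDL.h>

int main(int, char **)
{
    SDL_Init(SDL_INIT_VIDEO);
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

    // SDL_SetVideoMode with SDL_OPENGL is what creates the OpenGL context...
    SDL_Surface * screen = SDL_SetVideoMode(800, 600, 32, SDL_OPENGL);
    if (!screen)
        return 1;

    // ...so only from this point on is it safe to call glGenTextures,
    // glTexImage2D, etc. -- i.e. to construct Texture objects.

    SDL_Quit();
    return 0;
}
```

A global or file-static Texture, or one constructed before SDL_SetVideoMode returns, would run its gl* calls with no context and end up exactly as described above: loaded on the SDL side, but never uploaded to the GL side.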

Enigma
