lem77

Trouble with SDL / OpenGL


Hi, I am an absolute noob at game programming. I started a program with SDL and OpenGL on a very basic level (drawing textured quads and such). Everything went fine until I started on alpha blending: my textures are always drawn completely opaque. I load them with my texture-load function (which I have seen hundreds of times on the web), but nothing seems to work :( I'm desperate and have been searching the web for anything regarding OpenGL/SDL alpha blending, but I can't find what's wrong. I'm posting some of my routines; maybe one of you guys can see what's going wrong and/or help me with the pixel format, as I suppose something is wrong there (at least I don't really understand it). Thanx a lot.

My OpenGL init looks like this:

<code>
SDL_GL_SetAttribute( SDL_GL_RED_SIZE, 5 );
SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, 5 );
SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, 5 );
SDL_GL_SetAttribute( SDL_GL_ALPHA_SIZE, 5 );
SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );
SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, m_nScreenDepth );
SDL_GL_SetAttribute( SDL_GL_STENCIL_SIZE, 0 );
SDL_GL_SetAttribute( SDL_GL_ACCUM_RED_SIZE, 0 );
SDL_GL_SetAttribute( SDL_GL_ACCUM_GREEN_SIZE, 0 );
SDL_GL_SetAttribute( SDL_GL_ACCUM_BLUE_SIZE, 0 );
SDL_GL_SetAttribute( SDL_GL_ACCUM_ALPHA_SIZE, 0 );
</code>

I'm loading textures this way:

<code>
bitmap = SDL_LoadBMP(sFileName.c_str());
conv = SDL_CreateRGBSurface(SDL_SWSURFACE, bitmap->w, bitmap->h, 32,
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
    0xff000000, 0x00ff0000, 0x0000ff00, 0x000000ff);
#else
    0x000000ff, 0x0000ff00, 0x00ff0000, 0xff000000);
#endif
SDL_BlitSurface(bitmap, 0, conv, 0);

glGenTextures(1, &texture.uiID);
glBindTexture(GL_TEXTURE_2D, texture.uiID);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, iMinFilter);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, iMagFilter);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, iWrapS);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, iWrapT);
glPixelStorei(GL_UNPACK_ROW_LENGTH, conv->pitch / conv->format->BytesPerPixel);

if (bMipMap)
{
    glPixelStorei(GL_UNPACK_ROW_LENGTH, 0);
    gluBuild2DMipmaps(GL_TEXTURE_2D, 3, conv->w, conv->h,
                      GL_RGBA, GL_UNSIGNED_BYTE, conv->pixels);
}
else
{
    glTexImage2D(GL_TEXTURE_2D, 0, 3, conv->w, conv->h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, conv->pixels);
}

m_TextureInfo[sName] = texture;
SDL_FreeSurface(bitmap);
SDL_FreeSurface(conv);
</code>
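Note that the second argument to glTexImage2D and gluBuild2DMipmaps in the snippet above is the internal format: a bare 3 means three components (GL_RGB), so the driver stores the texture without any alpha channel even though the source data is GL_RGBA. A minimal sketch of an alpha-preserving upload, reusing the conv surface from the code above:

<code>
// Use a four-component internal format (GL_RGBA, or the legacy 4)
// so the alpha channel of the converted surface is actually stored.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, conv->w, conv->h, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, conv->pixels);
</code>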

Well, the most likely reason is that your bitmap doesn't actually contain any alpha component. Also, I don't see you calling glBlendFunc() or glEnable(GL_BLEND) anywhere.
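For reference, a minimal sketch of the blending state that gives conventional transparency (standard fixed-function calls, nothing project-specific assumed):

<code>
// Blend the incoming fragment over the framebuffer:
// result = src * srcAlpha + dst * (1 - srcAlpha)
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
</code>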

Thanks for answering this "stupid" post.

I don't know how to post source code correctly in the forum, and I didn't want to spam the message with unreadable code.

Basically I'm using the 3DSprite example from GPG1.

<code>
// use material
CMaterialManager::GetSingleton().Use(m_sMaterialName);

// draw the sprite
glBegin(GL_QUADS);

glColor4f(1.0f, 1.0f, 1.0f, m_fAlpha);
glTexCoord2f(0.0f, 0.0f);
glVertex3f(-m_iWidth/2, -m_iHeight/2, 0);

glColor4f(1.0f, 1.0f, 1.0f, m_fAlpha);
glTexCoord2f(1.0f, 0.0f);
glVertex3f(-m_iWidth/2, m_iHeight/2, 0);

glColor4f(1.0f, 1.0f, 1.0f, m_fAlpha);
glTexCoord2f(1.0f, 1.0f);
glVertex3f(m_iWidth/2, m_iHeight/2, 0);

glColor4f(1.0f, 1.0f, 1.0f, m_fAlpha);
glTexCoord2f(0.0f, 1.0f);
glVertex3f(m_iWidth/2, -m_iHeight/2, 0);

glEnd();
</code>
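For what it's worth, the per-vertex alpha from glColor4f() only multiplies into the texel if the texture environment is modulating, which is the fixed-function default; a sketch that forces it explicitly, in case the mode was changed somewhere else:

<code>
// GL_MODULATE is the default, but setting it explicitly rules out a
// stray GL_REPLACE that would ignore the per-vertex colour and alpha.
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
</code>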

My Use-Material function looks like this:

<code>
if ( m_bBlending )
{
glEnable( GL_BLEND );
glBlendFunc( GL_SRC_ALPHA, GL_ONE );
glDisable( GL_CULL_FACE );
glDisable( GL_DEPTH_TEST );
glDepthMask( 0 );
}
</code>
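Note that GL_SRC_ALPHA, GL_ONE is additive blending: it can only brighten what is behind the sprite, which suits glows and flames rather than ordinary see-through sprites (the conventional pair is the GL_ONE_MINUS_SRC_ALPHA one shown earlier in the thread). Also, since this branch disables depth writes and culling, a sketch of restoring the state after the blended draw, assuming the defaults used elsewhere:

<code>
// Undo the blending setup so later opaque geometry renders normally.
glDisable(GL_BLEND);
glDepthMask(GL_TRUE);     // re-enable depth writes
glEnable(GL_DEPTH_TEST);
glEnable(GL_CULL_FACE);
</code>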

When initialising the sprite class, I pass in an alpha value such as 0.5f.

BTW, I am using the texture from NeHe's tutorial 8 (about transparency), so it should be 32-bit and contain an alpha channel.

HELP :)

Is it possible that GL calls exported from a library aren't executed? When I'm debugging, all values seem to be in order (reading the material from XML, building the object structure for the material, using it).
Thanks

To post code, use the <code> tags:

<code>
and your code here....
</code>

[smile] It will automatically turn it into a code block, so click on "edit" on my post and look.

32-bit bitmaps don't have an alpha channel, so you need an image saved in a format that carries one (PNG?) and use SDL_image to load it.
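A minimal sketch of such a load with SDL_image (the file name is a placeholder; link against SDL_image and include its header):

<code>
#include <cstdio>
#include "SDL_image.h"

// IMG_Load picks a decoder from the file contents; a 32-bit PNG comes
// back as an SDL_Surface with a real alpha channel, unlike SDL_LoadBMP.
SDL_Surface* bitmap = IMG_Load("sprite.png");   // hypothetical file name
if (!bitmap)
    fprintf(stderr, "IMG_Load failed: %s\n", IMG_GetError());
</code>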

 
<code>
#include <iostream>

void say_thanx(void)
{
    std::cout << "thx. as i said i am using the texture from nehe blending tutorial" << std::endl;
}
</code>

[wink]

I think I'll throw out the code and do it again.
