Archived

This topic is now archived and is closed to further replies.

Gaiiden

sdl alpha in hardware


Gaiiden    5710
I just started toying around with alpha-blending in SDL to create some nice screen fade effects, text fades, image transparency, yadda yadda yadda. Well, I got all that working just fine (yay); the only problem is that when I take it off my laptop (which has this stinky onboard Intel graphics chip) and run it on a GeForce Go 420, the alpha fades are slow as hell! Yikes! I looked into the SDL docs and tried out SDL_DisplayFormatAlpha, but all that did was convert my color-keyed pixels to an alpha value, which after closer reading I discovered is what it does anyway. Doh. I'm doing the alpha-blending by using the SetAlpha function to modify the per-surface alpha value (so I'm not using an alpha channel, or an RGBA surface). Would an alpha channel be better than a per-surface alpha value? Any help would be appreciated; this slowdown on the heavy-duty graphics card kinda splashed some rather cold water in my face.
_________________________________________________________________
Drew Sikora
A.K.A. Gaiiden
ICQ #: 70449988
AOLIM: DarkPylat
Blade Edge Software
Staff Member, GDNet Public Relations, Game Institute
3-time contributing author, Game Design Methods, Charles River Media
Online column - Design Corner at Pixelate
Unofficial IGDA chat! [polaris.starchat.net -> #igda]
NJ IGDA Chapter - NJ developers unite!! [Chapter Home | Chapter Forum]
"Real programmers don't work from 9 to 5. If any real programmers are around at 9am it's because they were up all night." - Anon.
"It's not the size of the source that matters, it's how you use it" - Self

yspotua    122
I don't have a specific answer to your question, but I do remember several people on the SDL mailing list (including the creator of SDL) saying that if you need lots of alpha blending, it's better to use OpenGL for your graphics.

Gaiiden    5710
Awwww, that would just suck. I mean, I'm not saying OGL is bad or anything, just more work than I wanna do to get alpha working, since I don't have an OGL layer at all in my framework, heh. Would probably be beneficial in the long run but... still too much work.

Kylotan    10008
Read the SDL mailing list; this comes up just about every day.

Short answer: hardware-accelerated 2D alpha-blended blits are not supported on nearly all modern cards. The same goes whether you use SDL or DirectDraw directly. So, use Direct3D or OpenGL to do alpha blending.

Long answer: you may get better results if you make sure you always use software surfaces for everything, since alpha blending fully in software is better than alpha blending across both hardware and software (i.e. reading from the hardware, blending, then writing back to the hardware). But then your resolution and bit depth are limited, as it's impractical to blit 1024x768x32bpp across the PCI or AGP bus every frame.

[ MSVC Fixes | STL Docs | SDL | Game AI | Sockets | C++ Faq Lite | Boost
Asking Questions | Organising code files | My stuff | Tiny XML | STLPort]

deadalive    122
I use OpenGL with SDL to do this, something like NeHe did. I try to avoid SDL's blitting commands because I've heard over and over how much better it is to use OpenGL calls. It may take a bit longer to set up, but it's worth it to have better performance, right?
