Archived

This topic is now archived and is closed to further replies.

Why is SDL so slow?


Yikes. I wanted to try SDL (http://www.libsdl.org/). I was pointed to a set of tutorials at Cone3D (http://cone3d.gamedev.net/cgi-bin/index.pl?page=tutorials/gfxsdl/index), and so I wrote a simple frame for SDL projects in VC++ .NET. Wow. All it's doing is initializing SDL and the SDL_mixer library, but when I compile and run, it's taking 100% of my CPU and 8 friggen megs of RAM (I have an Athlon XP 1800+). I've written complete (yet simple) games with GLUT that only take 8% of my CPU time and maybe 6 MB of RAM, max. And this is without optimization. I'm either doing something wrong or SDL is far too inefficient for me to waste my time on. I'm no good at optimization yet.

[edited by - Brien Shrimp on June 4, 2003 7:10:32 PM]

Most games use 100% of the CPU... and most game designers want their games to use 100% of the CPU. That way, when the game is done doing one thing, it starts on the next right away! If you want it to only do this and that every frame, you need to tell it to do that... you can make "a = 1 + 1" use 100% of the CPU if you want...

Okay, well, I've gotten it to be less of a resource hog by using the SDL_Delay() function. So now it pauses for about 10 ms before looping. This is probably an idiotic brute-force method, but I'm having trouble finding any resources that tell me otherwise.

I don't want this game to run at 100%. This is going to be a 2D sprite-based game. There is no reason it should need more resources than the full 3D tank game I did with GLUT. I never needed any 'wait' functions with GLUT. It ran render() to display and it ran render() when idle! It always did the same thing and never demanded more than 8% of my CPU.

Guest Anonymous Poster
I don't think you know what CPU usage means. Something will use 100% of the CPU, regardless of what it is doing, if it never yields to the operating system. As uncutno said, a = 1 + 1 will use the entire CPU if you don't give the OS some time to do its own thing. GLUT is doing this for you, internally. You just don't realize it because it's all behind an API.

I'm assuming you entered a loop that does nothing but check for an escape key press and/or a close event. If you are clearing the screen in the loop as well, it might even cause the window to appear unresponsive. This has nothing to do with SDL. Your loop is hogging the CPU because it uses the full timeslice the OS gives it, even though it doesn't need to. That's why your call to SDL_Delay helped. It would be even nicer to call SDL_Delay with an argument of 0 ms (assuming SDL_Delay is the equivalent of Sleep). This basically yields the unused time remaining in the timeslice back to the OS. Once you have a complete game around the loop, though, you'll want to use as much of that timeslice as you can get, of course.

If you want to experiment, write up a quick DirectDraw windowed app (if you're on Windows), or even OpenGL (without GLUT if possible). Run the standard Windows game loop (if PeekMessage succeeds, do the message stuff; else do the game loop) which does nothing but clear the screen. Try to interact with the window. You'll see a delay every time you try to move it or resize it. Even without the call to clear the screen, you'll see continuous 100% CPU usage like you are seeing now.

I've got a full Tetris game built on SDL that isn't using more than 8% CPU while the game is running.

It isn't placing any calls to SDL_Wait() either - though SDL_PumpEvents() might do it.

You can't blame SDL for your bad coding. First of all, you're using .NET? That's got to be slower than C. Secondly, how optimized is your program? Did you just throw it together? I bet if you coded in straight DirectX/WinAPI, you would still have a slow program. Don't come up here and start bashing good libraries because you can't use them properly!

quote:
Original post by deadalive
You can't blame SDL for your bad coding. First of all, you're using .NET? That's got to be slower than C. Secondly, how optimized is your program? Did you just throw it together? I bet if you coded in straight DirectX/WinAPI, you would still have a slow program. Don't come up here and start bashing good libraries because you can't use them properly!
Stupid reply. If you had bothered to read the entire thread, you'd know that a) the program using 100% CPU time ISN'T a sign of bad coding (how much CPU time do you think Halo uses?), and b) several people have already pointed this out.


For crying out loud, she has fishes coming out of her head on either side. How can you find this hot?!

F-you valderman. I'm sick of you kids coming here bashing on SDL when half of you don't even know what it is! The thread is entitled "WHY IS ***SDL*** SO SLOW". **I** know it's not true, but some people can be misled by such a topic title and stay away from SDL. Sure, if they read it, they'll find out it isn't true, but what if they don't? They get a false impression.

[edited by - deadalive on June 5, 2003 4:29:53 PM]

I have a high-performance 3D engine I programmed working on top of SDL, and I find it very fast. The way my engine works, it actually sped up the frame redraws, and it's working like a dream. And yes, my engine uses 100% CPU; it's supposed to. You *COULD* give up idle processor time, but why bother? You want as much speed as you can get =)

~Graham

quote:
Original post by deadalive
F-you valderman. I'm sick of you kids coming here bashing on SDL when half of you don't even know what it is! The thread is entitled "WHY IS ***SDL*** SO SLOW". **I** know it's not true, but some people can be misled by such a topic title and stay away from SDL. Sure, if they read it, they'll find out it isn't true, but what if they don't? They get a false impression.

[edited by - deadalive on June 5, 2003 4:29:53 PM]
The OP got some things wrong. If he hadn't asked, perhaps he would have believed that SDL is slow and abandoned it, just like you don't want people to do. Insulting him just because of that is just plain stupid.



Alright... I didn't read every reply here, so I dunno if what I'm gonna say has already been said, but I'm gonna give my input on the matter...

Basically, in your classic main Windows Message Loop thing, you have something like this:


MSG Msg;
while(1)
{
    if(PeekMessage(&Msg, 0, 0, 0, PM_REMOVE))
    {
        // Handle message
    }

    // Main game processing
}


This will make your application use all available processing power (Windows will either be processing a message and sending it to some other application, or doing stuff with your app).

If you really wanted to only do processing when needed, there are a few ways to do it. I know of one specific way, but there are more (and probably better) ways, I'm sure.

One way would be to create a Windows Timer thing. You'd set it to run every 33ms (for 30fps) and it would run your game logic/drawing/etc functions. Then you would change your message loop. In the end it would look something like this:


const int TIMER_ID = 0;

VOID CALLBACK timerFunc(HWND hwnd, UINT uMsg, UINT idEvent, DWORD dwTime)
{
    // run game
}

int WINAPI WinMain(...)
{
    // stuff

    SetTimer(hWnd, TIMER_ID, 33, timerFunc);

    // other stuff ?

    MSG Msg;
    while(GetMessage(&Msg, 0, 0, 0))
    {
        // Do message stuff
    }

    // more stuff
}


Then Windows will call timerFunc every 33ms (also a WM_TIMER message will be sent to your window). This should use very little CPU time (unless your game processing takes a full 33ms or whatever).

[edited by - BlabberBoy on June 5, 2003 4:48:12 PM]

SDL_Delay is not a hack or bad coding. It returns control to the operating system to do its other business, so that your event loop will not use 100% of the processor.

--- EDIT ----

Like the guy who posted code above said, even doing



// no SDL stuff here, this is just win32 messages.

while( ... )
{
    while (PeekMessage( ... ))
    {
    }
}



will use 100% of the processor. In order to NOT use 100% of the processor, you must either put in a Sleep() or use GetMessage, which will pause the program until a message is received.

In SDL's case, SDL_PollEvent( ... ) will use 100% of the processor if it's inside a while loop and is not delayed. However, using PollEvent allows you to redraw the screen when an event is not in the queue. SDL_WaitEvent( ... ) will wait for an event to occur. This won't use the processor, but it will not allow you to do anything else until an event occurs.

[edited by - Ronin Magus on June 5, 2003 4:59:22 PM]

quote:
Original post by deadalive
You can''t blame SDL for your bad coding. First of all your using .Net? That''s got to be slower than C. Secondly, how optimized is your program? Did you just throw it together? I bet if you coded in straight DirectX /WinAPI, you still would have a slow program. Don''t come up here and start bashing good libraries because you can''t use them properly!


This is one of the funniest and most useless comments I've ever read on gamedev. Even more useless than this comment I'm making.

quote:
Well, I came in here and got some things wrong, and you proceeded to insult me, so if I'm correct doesn't that make you a stupid hypocrite?
It wasn't intended as a personal insult towards you; I just thought that insulting the OP was stupid.



This whole thread is stupid and this whole site is becoming retarded. I don't even know why I bother coming here; there are much more informative and intelligible sites around. Oh yea, I originally came here to read what people think and to help them. Maybe I shouldn't care so much about what people think, or about helping, anymore. (Maybe then I will get something done.) Adios Amigas..

[edited by - deadalive on June 5, 2003 5:05:08 PM]

Well, SDL simply isn't slow. On Windows it is actually just a wrapper for DirectX, so talk to Microsoft about that.

The best way to minimize CPU usage is waiting; SDL_Delay and SDL_WaitEvent are good ways to do this. Ronin Magus pointed out that your game won't be able to do anything else, but don't worry, that's why SDL has a threading subsystem too!

And when in doubt, profile! Find the functions that are taking all the CPU, post them on here, and ask for help.

To take this back to the original topic, let's try to clear up some issues. The two things that were brought up as problems were CPU usage and RAM usage.

First off, 100% CPU usage isn't a bad thing. Some (a lot of) people would say that it is a very good thing. With GLUT, you wait for either some input message (kb, mouse, joy, whatever) or an idle message. When you receive it, you execute the corresponding function. Then control goes back to the OS and your program does nothing until another message is received. This can be bad for two reasons. If there are a lot of input messages and you redraw the screen after handling every message, it can take a while to get through all of the messages. It might be better to handle all pending inputs and then draw, but since you can't be sure when an idle message will arrive, it gets complicated to figure out when you should draw. The other potentially bad thing is that your framerate is based on the rate of messages received. If you only get 10 idle messages a second, your framerate can never be above 10 fps.

To compare this to your current SDL loop setup: every time through the loop, you can collect and handle all inputs and then draw. Once you have finished drawing, you are able to check for more inputs immediately; no waiting for a message from the OS. This can (and most likely will) increase your framerate, especially when there are no inputs to handle. Your framerate will be based on how fast you are able to draw a frame, not on the OS.

The second item was RAM usage. There isn't too much info on your GLUT project, but I don't see any mention of a sound lib. I also don't know how much RAM the SDL_mixer lib uses, but that is probably where a lot of the usage is coming from. If you needed to, you could probably find a sound lib that uses less RAM, but unless you are developing for a specific platform with limited RAM, a few megs of RAM isn't something to lose sleep over.

I hope this clears some issues up for you.

Karg
