Help with sudden drop in framerate in SDL, or Just how long can/should one Blit take?

7 comments, last by theOcelot 14 years, 2 months ago
With any luck, someone knows what's going on just from the title. I'm sure I'm not the only one to have had this problem.

For the first several frames, or even the first couple hundred, my game runs at a reasonable speed, taking around 30 ms per frame (measured by SDL_GetTicks). After a while, this jumps to 60-80 ms/frame, and sometimes even more. I know it's the rendering, because the same behavior shows up before the actual simulation is running, while the game is just rendering and waiting for input to start. When profiling with gprof, rendering-related functions tended to come out on top; under Valgrind, BlitNtoNKey in SDL_blit_N.c was at the very top of the list.

I can't think of anything unusual I'm doing that would cause this. I get much the same effect with a hardware display surface, with the call to SDL_Delay in the timing loop commented out, and with no sprites being rotated. I'll post my main loop with the timing instrumentation in case there's anything suspicious, but I'm reluctant to post masses of rendering code if someone might recognize the symptoms. I'll post it if you need to see it.
while(!statestack.Empty() && running){
    //reset the timer
    prev_time = SDL_GetTicks();
    //update and process input
    inputsys->Update();
    while(!maininput->Empty()){
        InputEvent event(maininput->PollEvent());
        if(event.Type() == SDL_QUIT){
            running = false;
        }
    }
    //update and render the game
    statestack.Render();
    statestack.Update();
    //update at a fixed frame-rate
    Uint32 elapsed_time = SDL_GetTicks() - prev_time; //how long the frame took
    if(elapsed_time < frame_time){ //the frame finished on time
        SDL_Delay(frame_time - elapsed_time);
        errfile << "::main: elapsed_time: " << elapsed_time << '\n';
    } else { //the frame went long; the most we can do is log it
        errfile << "::main: frame went overtime, time: " << elapsed_time << '\n';
    }
    SDL_Flip(screen);
}
errfile << "::main: done with game" << endl;
}

So can anyone tell me what's wrong? I've been struggling with this for a while, and I'd be very grateful if someone could help me with it. edit: real problem below [Edited by - theOcelot on February 20, 2010 12:25:39 AM]
Try leaving out the parts of code that use SDL_Blit and see how many you can remove before the slowdown disappears. I can't recommend much else at the moment. You can also try measuring different parts of the code with SDL_GetTicks in more places than you are now, i.e., measure update and rendering separately just to be sure, then measure the individual calls inside whichever one causes the slowdown, and so on.
Are you doing any allocations during rendering? On-the-fly rotations, format conversions, loading new images? If you're doing a lot of this, then even with proper deletion of resources, memory may be getting fragmented.
After commenting out all the calls to SDL_Blit I could find, it turned out to be the drawing of the background image every frame that was causing problems. The first time, with just that image being drawn, the frame time started at 50 ms/frame and stayed around there. The second time, it started at 24 ms/frame for a while, then jumped to around 50 ms/frame, just like normal.

So now the question is, why does that one call to SDL_Blit take such a long time, and with such widely varying times?

The image is loaded from BMP with the standard SDL_LoadBMP/SDL_DisplayFormat combo. The display mode is 32-bit. The bitmap is 592x400 px, which, seen in this light, may be rather large. I guess I can figure out how to redraw it only when necessary, but it doesn't seem like it should be this hard.

Anyway, thanks to mrjones for reminding me to do what I should have done a long time ago. In my desire to be professional and use an actual profiler, I'd almost forgotten about the simple, stupid approach. [smile] And thanks to Kylotan, even though your guess turned out not to be the problem this time.
Does anyone know what makes it take so long?
Quote:Original post by theOcelot
Does anyone know what makes it take so long?
Did you say you'd already tried profiling? If so, where does the profiler say the time is being spent?

I don't know if it's relevant to the particular problem you're trying to solve, but here is a recent thread on blitting in SDL that might be of some use.
Yes, the profiled time was definitely in SDL_Blit, and the slowdown stopped when I stopped drawing the background image. I'll try using SDL_DisplayFormatAlpha and RLE, which that thread reminded me of. Thanks for that.

What I'm really more curious about is why the time it takes suddenly jumps up.
It could just be down to cache access. Maybe for a while, the whole thing fits in cache, and then after enough operations happen elsewhere, it's flushed out of cache and has to be re-read again.

If it's in hardware, then obviously the definition of 'cache' is a bit different but the principle is the same; perhaps the graphic has to get transferred over the bus to the card every so often. And it is a whole megabyte of data, not the best thing to be sending 30 times a second if it can be avoided.
Thanks. Moving the surface to the video hardware seems to have improved the situation, if not fixed it.

This topic is closed to new replies.
