2400 2d sprites slow down

Started by
26 comments, last by arcice 16 years, 8 months ago
I'm remaking an old 2D game. It was made in DirectX 3.0. Somehow they managed to get it to run at a high FPS on most video cards, new and old. In my engine, which can use either DX7 or SDL (which itself uses DX7), I find that on old video cards the FPS dies way down after about 1,800 blit calls per frame. The only option I can think of is to use DX3, which I don't really have time to learn. Are there any other blit libraries that are more compatible while keeping good speed? Thanks.
Try reducing the number of blits. I find it very hard to believe any old game would do 1,800 blits per frame using DirectX 3, and I doubt that DirectX 3 would be faster than 7.

Out of interest, which game are you remaking?
We're getting 20,000+ normal color keyed blits in software on reasonably new single-core computers, and double that on dual cores. 2,500 blits is easily possible in software on any P3-800 or better PC, and it used to be possible in hardware too.

The slowdown of hardware accelerated 2D is due to drivers being written exclusively for 3D performance, at the cost of 2D performance. If you want to write games that require that many blits, I'm afraid you don't have many options. Either use OpenGL or D3D hardware acceleration and blit your sprites as 3D quads in an ortho projection, or use a software library like our JRA::Library. If you want info on the latter, pop over to the 2DDev Forums. There are a few demos available, both in C and C#, as well as the library source code. :)
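For reference, here is a minimal sketch of the "blit as quads in ortho projection" idea using fixed-function OpenGL. The texture ID, sprite position, and sizes are placeholders; it assumes a GL context already exists and that the sprite is loaded as a texture with an alpha channel standing in for the colorkey:

#include <GL/gl.h>

// Set up a pixel-space orthographic projection once per frame.
void beginSprites(int screenW, int screenH)
{
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, screenW, screenH, 0, -1, 1);            // y axis points down, like a 2D blitter
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glEnable(GL_TEXTURE_2D);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); // alpha channel replaces the colorkey
}

// One "blit": a textured quad at (x, y) with size w x h.
void drawSprite(GLuint textureId, float x, float y, float w, float h)
{
    glBindTexture(GL_TEXTURE_2D, textureId);
    glBegin(GL_QUADS);
        glTexCoord2f(0, 0); glVertex2f(x,     y);
        glTexCoord2f(1, 0); glVertex2f(x + w, y);
        glTexCoord2f(1, 1); glVertex2f(x + w, y + h);
        glTexCoord2f(0, 1); glVertex2f(x,     y + h);
    glEnd();
}

Sprites that share one texture can also be batched inside a single glBegin/glEnd pair, which cuts the per-sprite overhead even further.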


[edit]
Another thing to check is that all your surfaces are in video RAM, and that you're never reading from them manually. That'll destroy performance even on top-of-the-range new PCs.
JRA GameDev Website//Bad Maniac
I don't know about its performance, but PixelToaster is another library capable of blitting: http://www.pixeltoaster.com. It doesn't use OpenGL or DirectX.
PixelToaster doesn't have 2D blitters as far as I know; it's aimed at software 3D engines. It gives you direct access to a rendering buffer in system RAM, and then has functionality for transferring that to the video card to display. It would require you to write your own blitters, especially if you want alpha blending or any other effects.
At least that's all their features page mentions.
JRA GameDev Website//Bad Maniac
yeah, 2D on modern cards is slow unless you're actually treating everything as a 3D object and going through OpenGL/Direct3D. There's a book called "Focus on 2D in Direct3D" if you want a simple treatise on that topic.

Even a software renderer ought to beat it. From memory, my software renderer draws about a quarter-million 32x32 pixel, 32-bit color sprites with clipping and transparency per second on my P4 3GHz. For the most part it is all optimized C++ code -- no SSE, no assembly in the blitters.

Even a relatively naive C++ renderer ought to manage that many objects at 60 FPS.
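For the curious, a colorkeyed software blit with clipping really isn't much code. This is only a rough sketch in plain C++, with a made-up Surface struct rather than anything from my actual renderer:

#include <cstdint>
#include <algorithm>

// Hypothetical 32-bit surface: 'pixels' holds width * height ARGB values.
struct Surface { uint32_t* pixels; int width, height; };

// Copies src onto dst at (dx, dy), skipping pixels equal to colorKey
// and clipping against the destination bounds.
void blitColorKeyed(const Surface& src, Surface& dst,
                    int dx, int dy, uint32_t colorKey)
{
    int x0 = std::max(0, -dx), y0 = std::max(0, -dy);
    int x1 = std::min(src.width,  dst.width  - dx);
    int y1 = std::min(src.height, dst.height - dy);

    for (int y = y0; y < y1; ++y)
    {
        const uint32_t* s = src.pixels + y * src.width + x0;
        uint32_t* d = dst.pixels + (dy + y) * dst.width + (dx + x0);
        for (int x = x0; x < x1; ++x, ++s, ++d)
            if (*s != colorKey)             // transparent pixels are skipped
                *d = *s;
    }
}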

throw table_exception("(╯°□°)╯︵ ┻━┻");

Do you actually need to perform 1,800 blits per frame? You can get a speed boost by only trying to blit things that are actually on the screen. Using fewer bytes per pixel might also help (8-bit, 16-bit, 32-bit color). If you increased the resolution beyond what the old game used, that can also decrease performance.
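As a trivial illustration of that kind of culling, a rectangle-overlap test like this (the Rect struct is just a placeholder) lets you skip off-screen sprites before they ever reach the blitter:

// Returns true if the sprite rectangle overlaps the visible camera rectangle;
// if it doesn't, the blit can be skipped entirely.
struct Rect { int x, y, w, h; };

bool onScreen(const Rect& sprite, const Rect& camera)
{
    return sprite.x < camera.x + camera.w && sprite.x + sprite.w > camera.x &&
           sprite.y < camera.y + camera.h && sprite.y + sprite.h > camera.y;
}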
In most cases, the in-game chat text brings the blits to a high count. Sometimes on old computers I see 40 FPS with no text, then after a few messages it goes down to 20 and becomes unplayable. I can find no way around blitting each individual font bitmap character to render the chat. Actually, I did once try rendering the chat to a buffer each time there is a new message and then blitting that buffer transparently onto the game screen, but I found the colorkey would bug out on some hardware, and the original game didn't do that anyway.

The scrolling map uses 16x16 tiles, which needs 1,875 blits on an 800x600 screen. There's also a background image which blits as quadrants and moves, scaled down from the map scroll speed; then the interface bar, toggle buttons, and radar take about 50 more blits. Then you have the chat text, which uses a special font in the game bitmap file, needs to be filled in with different colors for each team the message comes from, and then gets blitted using a colorkey onto the top left corner of the screen.

The fact remains that the other game, in DirectX 3, blitted everything just as I've described (I believe) and still kept smooth frame rates on every PC it has ever been tried on. If it's at least a P1, it can play it. Mine, using SDL, needs more power and a 2D-accelerated video card. Forget onboard video on a board more than 4 years old; my game won't have a chance, even with the chat text disabled. On my old laptop (P3 800, S3 video) my SDL version of the game played terribly, while the original DX3 game played perfectly. I don't want to drop SDL, since I wanted a Linux version. It might be possible that I'm blitting more than necessary; I hope so, but I don't know what to do to fix it.
If you want to take such a ghetto approach for a ghetto remake, why not use the good ol' dirty rects? Remember everyone's favorite RTS, StarCraft? That's some good dirty-rect'in!

"2d via 3d" is quite easy, though, especially if you don't want to take advantage of 3d effects (lighting, blending, scaling, etc). It could also simply just be the way you are drawing. Try removing all the calculations possible and see how fast you can draw with those removed. If the FPS boost is significant, you need to do some rewriting on your code. If it is not, then you will have to find a way to draw less or cluster stuff together. For example, instead of drawing the radar every frame, use a dirty rectangle to skip updating it. Another option is to draw to a separate off-screen surface, then draw that surface to the backbuffer. This way you have no over-lapping pixels (ie pixels that are drawn, but covered up by others) and just need one draw call. Even if you update the off-screen surface every 5 frames for the radar or background or map or whatever, it is going to be worth it.
NetGore - Open source multiplayer RPG engine
Have you tried using software blitting instead of hardware accelerated blitting?

Also, like Spodi said, try using the dirty rectangles method. There's no need to draw the entire interface bar. Just draw changes, and redraw the section that the mouse is scrolling over. When the screen scrolls, copy the part of it that was visible in the previous frame to its new position and then draw the parts that weren't visible before.

You should be able to render each line of chat to its own buffer and just blit that buffer to the screen when needed, instead of blitting every letter each time the line is displayed.
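A rough SDL 1.2 sketch of that idea, assuming glyphW/glyphH and blitGlyph() stand in for whatever your font bitmap and per-character blitter actually look like:

#include <SDL.h>
#include <string>

// Placeholder for the existing routine that draws one character of the bitmap font.
void blitGlyph(SDL_Surface* dst, int x, int y, char c);

// Render a chat line once into its own colorkeyed surface; blit the returned
// surface each frame instead of re-blitting every character.
SDL_Surface* renderChatLine(const std::string& text, Uint32 colorKey)
{
    const int glyphW = 8, glyphH = 12;   // assumed font cell size
    SDL_Surface* tmp = SDL_CreateRGBSurface(SDL_SWSURFACE,
                                            (int)text.size() * glyphW, glyphH,
                                            32, 0, 0, 0, 0);
    SDL_Surface* line = SDL_DisplayFormat(tmp);   // match the screen format
    SDL_FreeSurface(tmp);

    SDL_FillRect(line, NULL, colorKey);           // background becomes the colorkey
    for (size_t i = 0; i < text.size(); ++i)
        blitGlyph(line, (int)(i * glyphW), 0, text[i]);

    SDL_SetColorKey(line, SDL_SRCCOLORKEY, colorKey);
    return line;   // free it with SDL_FreeSurface when the line scrolls off
}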

This topic is closed to new replies.
