
2400 2d sprites slow down


arcice    108
I'm remaking an old 2D game. It was made in DirectX 3.0. Somehow they managed to get it to run at a high FPS on most video cards, new and old. In my engine, using an option of DX7 or SDL (which uses DX7), I find that on old video cards the FPS dies way down after making 1,800 blit calls. The only option I can think of is to use DX3, which I don't really have time to learn. Are there any other blit libraries that will be more compatible while keeping good speed? Thanks.

stevenmarky    369
Try reducing the number of blits. I find it very hard to believe any old game would do 1,800 blits using DirectX 3, and I doubt that DirectX 3 will be faster than 7.

Out of interest, which game are you remaking?

Bad Maniac    252
We're getting 20,000+ normal color-keyed blits in software on reasonably new single-core computers, and double that on dual cores. 2,500 blits is easily possible in software on any PC at P3-800 or over. And it used to be possible in hardware too.

The slowdown of hardware-accelerated 2D is due to drivers being written exclusively for 3D performance, at the cost of 2D performance. If you want to write games that require that many blits, I'm afraid you don't have many options: either use OpenGL or D3D hardware acceleration and blit your sprites as 3D quads in an ortho projection, or use a software library like our JRA::Library. If you want info on the latter, pop over to the 2DDev forums. There are a few demos available, both in C and C#, as well as the library source code. :)
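For what it's worth, the software path described here boils down to a tight per-pixel loop. A minimal sketch in C (hypothetical names; 32-bit pixels in plain system RAM assumed, nothing from JRA::Library itself) of a clipped, color-keyed blit:

```c
#include <stdint.h>

/* Sketch of a software color-keyed blit: pixels equal to `key` are
   treated as transparent and skipped; writes outside the destination
   are clipped. Pitches are in pixels, not bytes. */
void blit_keyed(const uint32_t *src, int src_w, int src_h, int src_pitch,
                uint32_t *dst, int dst_w, int dst_h, int dst_pitch,
                int dx, int dy, uint32_t key)
{
    for (int y = 0; y < src_h; y++) {
        int ty = dy + y;
        if (ty < 0 || ty >= dst_h) continue;      /* vertical clip */
        for (int x = 0; x < src_w; x++) {
            int tx = dx + x;
            if (tx < 0 || tx >= dst_w) continue;  /* horizontal clip */
            uint32_t p = src[y * src_pitch + x];
            if (p != key)
                dst[ty * dst_pitch + tx] = p;
        }
    }
}
```

A real blitter would clip once per row rather than per pixel, but the structure is the same.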


[edit]
Another thing to check is that all your surfaces are in video RAM, and that you're never reading from them manually. That will destroy performance on even a top-of-the-range new PC.

viller    110
I don't know about its performance, but PixelToaster is another library capable of blitting: http://www.pixeltoaster.com. It doesn't use OpenGL or DirectX.

Bad Maniac    252
PixelToaster doesn't have 2D blitters as far as I know; it's aimed at software 3D engines. It gives you direct access to a rendering buffer in system RAM, and then has functionality for transferring that to the video card for display. It would require you to write your own blitters, especially if you want alpha blending or any other effects.
At least that's all their features page mentions.

Ravyne    14300
Yeah, 2D on modern cards is slow unless you're actually treating everything as a 3D object and going through OpenGL/Direct3D. There's a book called "Focus on 2D in Direct3D" if you want a simple treatise on that topic.

Even a software renderer ought to beat it. From memory, my software renderer draws about a quarter-million 32x32-pixel, 32-bit color sprites per second, with clipping and transparency, on my P4 3GHz. For the most part it is all optimized C++ code -- no SSE, no assembly in the blitters.

Even a relatively naive C++ renderer ought to manage that many objects at 60fps.

Vorpy    869
Do you actually need to perform 1,800 blits per frame? You can get a speed boost by only blitting things that are actually on the screen. Using fewer bytes per pixel might also help (8-bit, 16-bit, 32-bit color). If you increased the resolution beyond what the old game used, that can also hurt performance.
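The "only blit what's on screen" check is a plain rectangle-overlap test done before issuing any blit call. A sketch in C, with hypothetical names:

```c
/* A sprite is worth blitting only if its bounding box overlaps the
   screen at all; everything else can be culled before the blit call. */
typedef struct { int x, y, w, h; } Rect;

int on_screen(Rect s, int screen_w, int screen_h)
{
    return s.x < screen_w && s.y < screen_h &&
           s.x + s.w > 0 && s.y + s.h > 0;
}
```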

arcice    108
In most cases, the in-game chat text brings the blit count up high. Sometimes on old computers I see 40fps with no text, then after a few messages it goes down to 20 and becomes unplayable. I can find no way around blitting each individual font bitmap character to render the chat. Actually, I did once try rendering the chat to a buffer whenever new messages arrive and then blitting that buffer transparently to the game screen, but I found the color key would bug out on some hardware, and the original game didn't do that anyway. The game's scrolling map uses 16x16 tiles, which needs 1,875 blits on an 800x600 screen. There's also a background image, which blits as quadrants and moves at a speed scaled down from the map scroll speed. Then the interface bar, toggle buttons and radar take about 50 more blits. Then there's the chat text, which uses a special font from the game's bitmap file, needs to be filled in with a different color for each team the message is coming from, and then gets blitted with a color key onto the top-left corner of the screen.

The fact still remains that the other game, in DirectX 3, blitted everything just as I have described (I believe) and still kept smooth frame rates on every PC that ever tried it. If it's at least a P1, it can play it. But mine, using SDL, needs more power and a 2D-accelerated video card. Forget onboard video on a board more than 4 years old; my game won't have a chance, even with chat text disabled. On my old laptop (P3 800, S3 video) my SDL version of the game played terribly, while the original DX3 game played perfectly. I don't want to drop SDL; I wanted a Linux version. It might be possible that I'm blitting more than necessary. I hope so, but I don't know what to do to fix it.

Spodi    642
If you want to take such a ghetto approach for a ghetto remake, why not use the good ol' dirty rects? Remember everyone's favorite RTS, Starcraft? That's some good dirty-rect'in!

"2d via 3d" is quite easy, though, especially if you don't want to take advantage of 3d effects (lighting, blending, scaling, etc). It could also simply just be the way you are drawing. Try removing all the calculations possible and see how fast you can draw with those removed. If the FPS boost is significant, you need to do some rewriting on your code. If it is not, then you will have to find a way to draw less or cluster stuff together. For example, instead of drawing the radar every frame, use a dirty rectangle to skip updating it. Another option is to draw to a separate off-screen surface, then draw that surface to the backbuffer. This way you have no over-lapping pixels (ie pixels that are drawn, but covered up by others) and just need one draw call. Even if you update the off-screen surface every 5 frames for the radar or background or map or whatever, it is going to be worth it.

Vorpy    869
Have you tried using software blitting instead of hardware accelerated blitting?

Also, like Spodi said, try using the dirty rectangles method. There's no need to draw the entire interface bar. Just draw changes, and redraw the section that the mouse is scrolling over. When the screen scrolls, copy the part of it that was visible in the previous frame to its new position and then draw the parts that weren't visible before.

You should be able to render each line of chat to its own buffer and just blit that buffer to the screen when needed, instead of blitting every letter each time the line is displayed.
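The per-line caching idea can be sketched like this in C (hypothetical names; the glyph copy is stubbed out): the expensive per-character blits run once when a line first appears, after which displaying the line is a single buffer blit.

```c
#include <string.h>
#include <stdint.h>

/* Each chat line keeps its own pre-rendered pixel buffer. */
#define LINE_W 640
#define LINE_H 12
#define MAX_TEXT 128

typedef struct {
    char     text[MAX_TEXT];
    uint32_t pixels[LINE_W * LINE_H]; /* pre-rendered glyphs */
    int      rendered;                /* 0 until glyphs are drawn */
} ChatLine;

static int glyph_blits = 0; /* counts the expensive per-character blits */

static void draw_glyph(ChatLine *l, int slot, char c)
{
    (void)l; (void)slot; (void)c;    /* real code would copy font data */
    glyph_blits++;
}

/* Returns the cached buffer, rendering glyphs only on first use. */
const uint32_t *get_line_pixels(ChatLine *l)
{
    if (!l->rendered) {
        for (size_t i = 0; i < strlen(l->text); i++)
            draw_glyph(l, (int)i, l->text[i]);
        l->rendered = 1;
    }
    return l->pixels;
}
```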

TheGilb    372
1,800 blit calls *is* excessive for a DX3 game. There must be a lot of particle effects... What you need to do is profile the performance. Figure out whether you're CPU- or GPU-bound and then optimise. "Measure twice, cut once."

A good option if you are CPU-bound would be to use HGE (http://hge.relishgames.com/), which is now open source and would allow you to transfer many of your pixel operations directly onto the GPU. HGE uses DirectX hardware acceleration -- well worth looking into, I might add. After working quite extensively with HGE I can tell you there is room for further GPU optimisation, but if you just need to get off the ground then this should put you on the right foot.

Hope that helps :-)

Rasmadrak    196
Are you using time-based animation?

Are you measuring slowdowns by amount of fps?
- in what range are your fps, 700-900 dropping to 400-500 ... ?

EVERYTHING slows down when you add enough of them. Use time-based updates and you'll be fine!
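Time-based updates are a one-line change per entity; a sketch in C, with hypothetical names: each update is scaled by the elapsed time, so a sprite covers the same distance per real second whether the loop runs at 40fps or 400fps.

```c
/* Movement expressed per second of wall time, not per frame. */
typedef struct {
    float x;      /* position in pixels */
    float speed;  /* pixels per second */
} Mover;

void mover_update(Mover *m, float dt_seconds)
{
    m->x += m->speed * dt_seconds;
}
```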

arcice    108
I calculate FPS by counting each iteration of the engine loop; after 1 second the FPS is set to the number of iterations, then the iteration counter is reset to zero.
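That counter can be sketched in a few lines of C (hypothetical names; `now_ms` is whatever millisecond clock the platform provides, e.g. SDL_GetTicks under SDL):

```c
/* Count loop iterations; once a full second has elapsed, publish the
   count as the FPS value and start a new one-second window. */
typedef struct {
    unsigned last_ms;   /* start of the current one-second window */
    int frames;         /* iterations seen in this window */
    int fps;            /* last published value */
} FpsCounter;

void fps_tick(FpsCounter *c, unsigned now_ms)
{
    c->frames++;
    if (now_ms - c->last_ms >= 1000) {
        c->fps = c->frames;
        c->frames = 0;
        c->last_ms = now_ms;
    }
}
```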

To the guy who mentioned software blitting: software blitting is a joke, it runs at around 5fps on all computers without hardware acceleration.

There really is a limitation in DirectX 7 on how many DirectDraw calls you can make, and the limit is lower than in DirectX 3. This game engine needs to work on Windows and Linux, so I don't have the option of using DirectX 3. This engine needs to work on all hardware, just like the original game does -- hardware up to 7 years old -- so 3D-accelerated engines are not an option.

The latest thing I've tried was to render the entire game world map into video memory, which only works on video cards with at least 64MB of video RAM. The FPS remained the same, since all video cards with that much memory could already render up to the vsync throttle point. All the old hardware that gets bad FPS will have something like 4 or 8MB of VRAM, and the original game ran perfectly smoothly on it.

I have taken the liberty of running a debugger on the original game, and it very much does blit the 16x16 tiles across the 800x600 screen continuously -- it does not repair only the rect areas that changed.

I wish I could find a complete demo project using DirectX 3 somewhere on the internet, so that I could simply run a loop that blits from a bitmap surface loaded in main, wrap the blit in a for loop to see how many blits I can do before the slowdown, and then compare with DirectX 7.
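Lacking a DirectX 3 demo project, the measurement itself can at least be sketched API-agnostically in C (names here are hypothetical): time how many blits fit into a fixed slice of CPU time, then swap the memcpy-based row copy for the real Blt or SDL_BlitSurface call in each library being compared.

```c
#include <string.h>
#include <time.h>
#include <stdint.h>

/* Benchmark skeleton: blit a 16x16 tile into an 800x600 buffer as many
   times as fit into `duration` ticks of CPU time. */
#define SCREEN_W 800
#define SCREEN_H 600
#define TILE 16

static uint32_t screen_buf[SCREEN_W * SCREEN_H];
static uint32_t tile[TILE * TILE];

static void blit_tile(int dx, int dy)
{
    for (int y = 0; y < TILE; y++)
        memcpy(&screen_buf[(dy + y) * SCREEN_W + dx],
               &tile[y * TILE], TILE * sizeof(uint32_t));
}

long blits_for(clock_t duration)
{
    clock_t end = clock() + duration;
    long n = 0;
    while (clock() < end) {
        /* walk the tile across the screen so the cache isn't too kind */
        blit_tile((int)(n % 49) * TILE, (int)((n / 49) % 37) * TILE);
        n++;
    }
    return n;
}
```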

Kylotan    9853
Your old 2D game - like most in the DirectX 3 era - was probably running in 256 colour mode. If you're now blitting in 32bit mode you'll get 1/4 of the performance.

Also make sure all your surfaces are in hardware. If you're running out of VRAM, that's a sign that you're certainly not doing the same thing they did back then.
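The 1/4 figure is just bandwidth arithmetic: the same frame is four times as many bytes at 32bpp as at 8bpp, so a fill-rate-bound blitter manages a quarter of the frames. A trivial check (the helper name is made up):

```c
/* Bytes of pixel data in one w x h frame at a given depth. */
long frame_bytes(int w, int h, int bits_per_pixel)
{
    return (long)w * h * (bits_per_pixel / 8);
}
```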

Bad Maniac    252
Quote:
Original post by arcice
to the guy who mentioned software blitting - software blitting is a joke, it runs at around 5fps on all computers without hardware acceleration.
This is the most ignorant statement I've read today. First of all, how is software blitting in any way tied to hardware acceleration? The whole point of software blitting is to not require a 3D card.

As for performance, our library can perform 25,000 animated color-keyed 32x32 blits over a fullscreen 640x480 background image at 30FPS on a modern single-core PC (2GHz), and over 50,000 blits on a dual core, using nothing but GDI and assembly language under Win32.
On a Pentium 3 500MHz we still get around 3,000 blits at 30FPS.

Software blitting under DirectX or OpenGL might be a joke, but that's because of poor implementation, not something wrong with software blitting in itself.

Spodi    642
Maybe you have a software renderer in mind he can try checking out instead of using DirectX's software rendering, Bad Maniac?

Bad Maniac    252
Maybe I have.
Maybe that's why I posted a link in my first post in this thread? And maybe that's why I refer to the library in the previous post as "our library", and mention in the first one it's free and opensource?

Maybe ;)

arcice    108
Quote:
Original post by Kylotan
Your old 2D game - like most in the DirectX 3 era - was probably running in 256 colour mode. If you're now blitting in 32bit mode you'll get 1/4 of the performance.

Also make sure all your surfaces are in hardware. If you're running out of VRAM, then there's a sign that you're certainly not doing the same thing that they did back then.


The original game supports 8-bit and 16-bit; 16-bit is the default and is what thousands of players ran the game as, on all sorts of hardware, successfully. When those same players tried my clone, 30% of them had unplayable FPS.

Extrarius    1412
Believe it or not, 7 years ago gaming computers had 3D accelerators. The Voodoo3 arrived in 1999, and it was preceded by many other cards with basic 3D acceleration. For the purpose of drawing unlit textured triangles, just about any computer still running should be able to do it at a respectable pace.

Antheus    2409
My guess would be that it has something to do with 8- and 16-bit rendering. Most cards today assume 24- or 32-bit modes, and everything else gets just a reference-grade implementation.

I know that even years ago it was considerably faster to accept the larger data size but operate in one of the 24/32-bit modes, since the performance difference was drastic.

Kylotan    9853
Quote:
Original post by arcice
Quote:
Original post by Kylotan
Your old 2D game - like most in the DirectX 3 era - was probably running in 256 colour mode. If you're now blitting in 32bit mode you'll get 1/4 of the performance.

Also make sure all your surfaces are in hardware. If you're running out of VRAM, then there's a sign that you're certainly not doing the same thing that they did back then.


the origional game has support for 8-bit and 16-bit. 16-bit is by default and is what thousands of players ran the game as on all sorts of hardware successfully. when those same players tried my clone 30% of them had unplayable fps


But you haven't actually specified what textures you're using. Are you using 16-bit textures? Are you using the same 16-bit texture format as the screen is? Are you storing them efficiently - eg. in several composite sprite sheets, rather than all individually?

I can't believe I overlooked that you're using SDL. As the previous poster said, SDL_DisplayFormat() is an absolute must - have you used it on every image you load?
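SDL_DisplayFormat matters because it performs the pixel-format conversion once, at load time, so every later blit is a straight copy instead of a per-pixel conversion. A pure-C sketch of one such conversion, 32-bit ARGB down to 16-bit RGB565 (a common screen format of that era; the helper name is made up):

```c
#include <stdint.h>

/* Pack a 32-bit ARGB pixel into 16-bit RGB565 by dropping the low
   bits of each channel: 8->5 bits red, 8->6 green, 8->5 blue. */
uint16_t argb8888_to_rgb565(uint32_t p)
{
    uint16_t r = (uint16_t)(((p >> 16) & 0xFF) >> 3);
    uint16_t g = (uint16_t)(((p >>  8) & 0xFF) >> 2);
    uint16_t b = (uint16_t)(( p        & 0xFF) >> 3);
    return (uint16_t)((r << 11) | (g << 5) | b);
}
```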

arcice    108
Quote:
Original post by Extrarius
Believe it or not, 7 years ago gaming computers had 3d accelerators. The Voodoo3 arrived in 1999 and it was preceeded by many other cards with basic 3d acceleration. For the purpose of drawing unlit textured triangles, just about any computer still running should be able to do it at a respectable pace.


Believe it or not, 7 years ago non-gaming computers didn't have 3D accelerators. That so happens to be a third of the community who played my clone, who got a nasty error message when I used a Direct3D hybrid solution for alpha blending.

Just about any computer that's no more than 3 years old should be able to do it at a respectable pace.
