2400 2D sprites slow down

Believe it or not, 7 years ago gaming computers had 3d accelerators. The Voodoo3 arrived in 1999 and it was preceded by many other cards with basic 3d acceleration. For the purpose of drawing unlit textured triangles, just about any computer still running should be able to do it at a respectable pace.
"Walk not the trodden path, for it has borne it's burden." -John, Flying Monk
My guess would be that it has something to do with 8- and 16-bit rendering. Most cards today assume 24- or 32-bit modes, and everything else gets only a reference-grade implementation.

I know that even years ago it was considerably faster to accept the larger data size and operate in one of the 24/32-bit modes, since the performance difference was drastic.


Quote:Original post by arcice
Quote:Original post by Kylotan
Your old 2D game - like most in the DirectX 3 era - was probably running in 256 colour mode. If you're now blitting in 32-bit mode you'll get 1/4 of the performance.

Also make sure all your surfaces are in hardware. If you're running out of VRAM, that's a sign that you're certainly not doing the same thing they did back then.


the original game supports 8-bit and 16-bit. 16-bit is the default and is what thousands of players ran the game in successfully on all sorts of hardware. when those same players tried my clone, 30% of them got unplayable framerates


But you haven't actually specified what textures you're using. Are you using 16-bit textures? Are you using the same 16-bit texture format as the screen? Are you storing them efficiently - e.g. in several composite sprite sheets, rather than all individually?

I can't believe I overlooked that you're using SDL. As the previous poster said, SDL_DisplayFormat() is an absolute must - have you used it on every image you load?
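
For what it's worth, here's a minimal sketch of blitting one frame out of a composite sheet in SDL 1.2 - the 32x32 frame size and the draw_frame name are just assumptions for illustration, not anything from the game in question:

#include "SDL.h"

#define FRAME_W 32
#define FRAME_H 32

/* Blit frame number 'frame' from a grid-packed sheet onto the screen. */
void draw_frame(SDL_Surface *screen, SDL_Surface *sheet,
                int frame, int x, int y)
{
    SDL_Rect src, dst;
    int per_row = sheet->w / FRAME_W;   /* frames per row of the sheet */

    src.x = (Sint16)((frame % per_row) * FRAME_W);
    src.y = (Sint16)((frame / per_row) * FRAME_H);
    src.w = FRAME_W;
    src.h = FRAME_H;

    dst.x = (Sint16)x;
    dst.y = (Sint16)y;

    SDL_BlitSurface(sheet, &src, screen, &dst);
}

The point is that one sheet means one texture/surface switch instead of hundreds, which matters far more on old hardware than on new.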
Quote:Original post by Extrarius
Believe it or not, 7 years ago gaming computers had 3d accelerators. The Voodoo3 arrived in 1999 and it was preceded by many other cards with basic 3d acceleration. For the purpose of drawing unlit textured triangles, just about any computer still running should be able to do it at a respectable pace.


Believe it or not, 7 years ago, non-gaming computers didn't have 3D accelerators. That just so happens to describe a third of the community who played my clone, who got a nasty error message when I used a Direct3D hybrid solution for alphablending

just about any computer that's no more than 3 years old should be able to do it at a respectable pace
Quote:Original post by Antheus
My guess would be that it has something to do with 8- and 16-bit rendering. Most cards today assume 24- or 32-bit modes, and everything else gets only a reference-grade implementation.

I know that even years ago it was considerably faster to accept the larger data size and operate in one of the 24/32-bit modes, since the performance difference was drastic.


ok, my first clone attempt used DirectX 7 only; I was able to do palettes and got it to have an 8-bit option for Windows. 8-bit very rarely gave a performance boost, but when it did the gap from 16-bit was large (30+ fps)

in SDL I don't think I'm going to have an 8-bit mode, but it is operating in 16-bit only, not 24 or 32
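
One SDL 1.2 detail worth knowing here: if you ask SDL_SetVideoMode() for a depth the display can't do natively and don't pass SDL_ANYFORMAT, SDL quietly emulates it with a shadow surface and converts every frame. A minimal sketch of requesting 16-bit and checking what the driver actually granted (the printf is only illustrative):

#include <stdio.h>
#include "SDL.h"

/* Call after SDL_Init(SDL_INIT_VIDEO). */
SDL_Surface *open_screen(void)
{
    SDL_Surface *screen = SDL_SetVideoMode(640, 480, 16,
                                           SDL_HWSURFACE | SDL_ANYFORMAT);
    if (screen == NULL)
        return NULL;

    /* With SDL_ANYFORMAT the driver may hand back 24/32-bit instead of
       emulating 16-bit; knowing the real depth lets you convert art once. */
    printf("got %d bpp, hardware surface: %s\n",
           screen->format->BitsPerPixel,
           (screen->flags & SDL_HWSURFACE) ? "yes" : "no");
    return screen;
}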
Quote:Original post by Kada2k6
I read you're using SDL. Do you call SDL_DisplayFormat() on your surfaces before you start to blit them? This is often overlooked, and the difference in speed is like day and night. Try it and see if it helps!


This is the first useful reply, thank you. I will post tomorrow with the results. :) :) ;)
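
A minimal sketch of the load-then-convert pattern being suggested, assuming SDL 1.2 (load_converted is a made-up helper name):

#include "SDL.h"

/* Load a BMP and convert it to the screen's pixel format so later
   blits are plain copies instead of per-pixel conversions. */
SDL_Surface *load_converted(const char *path)
{
    SDL_Surface *raw = SDL_LoadBMP(path);
    SDL_Surface *fast;

    if (raw == NULL)
        return NULL;

    fast = SDL_DisplayFormat(raw);  /* NULL if conversion fails */
    SDL_FreeSurface(raw);           /* drop the unconverted copy */
    return fast;
}

Calling this once per sheet at load time is enough; the conversion cost moves out of the blit loop entirely.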
Quote:Original post by Kylotan

But you haven't actually specified what textures you're using. Are you using 16-bit textures? Are you using the same 16-bit texture format as the screen?


yes, mine is 16-bit. The sprites are stored in one huge bitmap for each part of the game.

Quote:Are you storing them efficiently - e.g. in several composite sprite sheets, rather than all individually?


This statement right here just made me have a very sharp and painful epiphany.

Map tiles: 640x1600. People, weapons, animations, menus: 640x1320. Far-plane image: 1280x960, which sits under the tiles and is visible wherever there aren't tiles (fall off a ledge into a starry twilight space). Each of these three .bmp files is loaded into a hardware surface (SDL_HWSURFACE). If the video card can't handle a surface bigger than 640x480, as many older ones can't, then SDL automatically falls back to a software surface.

In all this time it hadn't once occurred to me that on the older cards the tiles become a software surface, and having that type of surface as the source of most of your blit calls is a big no-no. What I have to do is break that sheet up into four 640x480 surfaces, and I bet that right there will solve my problem... thanks Kylotan!
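
A minimal sketch of that splitting idea, assuming SDL 1.2 and a cap of 480 rows per chunk (split_sheet and MAX_CHUNK_H are made up for illustration):

#include "SDL.h"

#define MAX_CHUNK_H 480

/* Copy a tall sheet into sub-surfaces no taller than MAX_CHUNK_H so each
   one has a chance of fitting in video memory. Returns the chunk count,
   or -1 on failure. */
int split_sheet(SDL_Surface *sheet, SDL_Surface *chunks[], int max_chunks)
{
    int count = 0;
    int y;

    for (y = 0; y < sheet->h && count < max_chunks; y += MAX_CHUNK_H) {
        SDL_Rect src;
        SDL_Surface *tmp;
        int h = sheet->h - y;

        if (h > MAX_CHUNK_H)
            h = MAX_CHUNK_H;

        tmp = SDL_CreateRGBSurface(SDL_HWSURFACE, sheet->w, h,
                                   sheet->format->BitsPerPixel,
                                   sheet->format->Rmask,
                                   sheet->format->Gmask,
                                   sheet->format->Bmask,
                                   sheet->format->Amask);
        if (tmp == NULL)
            return -1;

        src.x = 0;
        src.y = (Sint16)y;
        src.w = (Uint16)sheet->w;
        src.h = (Uint16)h;
        SDL_BlitSurface(sheet, &src, tmp, NULL);

        /* The flags tell you whether this chunk actually got video memory. */
        if ((tmp->flags & SDL_HWSURFACE) == 0) {
            /* still in system memory - the card may be out of VRAM */
        }
        chunks[count++] = tmp;
    }
    return count;
}

Running each chunk through SDL_DisplayFormat() afterwards keeps them matched to the screen format as well.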

