DirectX performance.

Started by
8 comments, last by jollyjeffers 19 years, 4 months ago
Some years ago I programmed my first game, a Wipeout-like racing game. It used DirectX 7 and ran fairly well on what is now considered a very low-end computer (Pentium 200 MHz + Voodoo 2). Recently I programmed a 3D Tetris-like game, using a version of that engine upgraded to DirectX 9, and, of course, it runs very well on modern computers. Then I had the chance to test this DX9 project on an old computer, and I noticed it runs extremely slow, and I mean extremely slow. This puzzled me, because the Tetris-like game draws far fewer polygons than the old Wipeout game I did years ago. This has led me to two possible answers:
* the DirectX 9 pipeline is much longer than the DX7 one and is designed *only* for modern computers; if someone wants to create a project that runs on old computers, he must stick to DX7, not only for the features, but also for performance.
* when DirectX 9 works on a DX7-only driver, it performs some sort of data translation that has a huge performance hit.
Of course, this is only my guessing, but I would like to know what's really happening...
i'd guess the older computer computes everything on the CPU instead of the GPU due to the lack of hardware vertex processing. And if you have some more CPU-dependent operations, it has even more to do.
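A minimal sketch of what that check might look like in D3D9 (assuming hWnd and the present parameters are set up elsewhere); the idea is to request software vertex processing explicitly when the card reports no hardware T&L:

```cpp
#include <d3d9.h>

// Sketch: choose the vertex processing mode from the reported caps.
// DX7-class cards (no hardware T&L) force the runtime to transform
// vertices on the CPU, so request software vertex processing explicitly.
IDirect3DDevice9* CreateDeviceForCard(IDirect3D9* d3d, HWND hWnd,
                                      D3DPRESENT_PARAMETERS* pp)
{
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    DWORD vp = (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT)
                   ? D3DCREATE_HARDWARE_VERTEXPROCESSING
                   : D3DCREATE_SOFTWARE_VERTEXPROCESSING;

    IDirect3DDevice9* device = NULL;
    if (FAILED(d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                 hWnd, vp, pp, &device)))
        return NULL;
    return device;
}
```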
Ethereal
Quote:of course, this is only my guessing, but I would like to know what's really happening...
Not the answer you want, but I don't think you'll find out why/what is happening.

Short of reading the D3D specification, it's difficult to tell when the pipeline ends (in D3D) and when it begins in the driver (+GPU).

It is very true that D3D has been handing over more and more work to the drivers/GPUs, which, for modern games, is definitely a good thing. However, as Metus suggested, the older hardware's lack of a decent hardware pipeline means that D3D is spending more time "simulating" it on the CPU.

As a slight aside - have you profiled your application? Does it do extensive CAPS testing? It may well be that you've inadvertently written your D3D9 app to (maybe by default) use features that your older hardware can't handle, and DX silently starts emulating them for you. The debug spew and PIX might well help you here.
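If it helps, here's a rough sketch of the sort of caps dump I mean (the helper name is made up); comparing the output on the old machine against the new one usually makes the problem obvious:

```cpp
#include <d3d9.h>
#include <cstdio>

// Hypothetical helper: print a few caps that differ sharply between
// DX7-class and DX9-class hardware, to spot features the runtime
// would otherwise emulate silently.
void ReportSuspectCaps(IDirect3DDevice9* device)
{
    D3DCAPS9 caps;
    device->GetDeviceCaps(&caps);

    std::printf("Hardware T&L         : %s\n",
                (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT) ? "yes" : "no");
    std::printf("Max texture size     : %lux%lu\n",
                caps.MaxTextureWidth, caps.MaxTextureHeight);
    std::printf("Simultaneous textures: %lu\n", caps.MaxSimultaneousTextures);
    std::printf("Vertex shader version: 0x%lx\n", caps.VertexShaderVersion);
    std::printf("Pixel shader version : 0x%lx\n", caps.PixelShaderVersion);
}
```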

hth
Jack

Jack Hoxley [ Forum FAQ | Revised FAQ | MVP Profile | Developer Journal ]

Quote:Original post by vicviper
* when DirectX 9 works on a DX7-only driver, it performs some sort of data translation that has a huge performance hit.


Err..? Are you using DirectX9 on a DirectX7 machine? Hmm.. Update your DX-runtime!!
Quote:Original post by Pipo DeClown
Quote:Original post by vicviper
* when DirectX 9 works on a DX7-only driver, it performs some sort of data translation that has a huge performance hit.


Err..? Are you using DirectX9 on a DirectX7 machine? Hmm.. Update your DX-runtime!!


It'd be impossible (unless you decide to hack the DLLs just to prove me wrong) to run DX9 apps on DX7 runtimes ;)
I suppose he means on DX7 hardware like.. GF2 or something like that
Ethereal
Quote:Original post by Pipo DeClown
Quote:Original post by vicviper
* when DirectX 9 works on a DX7-only driver, it performs some sort of data translation that has a huge performance hit.


Err..? Are you using DirectX9 on a DirectX7 machine? Hmm.. Update your DX-runtime!!


No, it's an old Pentium with a Voodoo 2 card, which only has DirectX 7 compliant drivers; the runtime is DirectX 9.
voodoo you say...
check the caps for the texture size and the dimensions of your textures. If you're in bad luck, the textures you create will be stored in system RAM, because the Voodoo chipset has a texture dimension limit of 512*512 (I think)
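Something along these lines (just a sketch; the clamp helper is hypothetical) would keep your textures inside whatever the driver actually reports:

```cpp
#include <d3d9.h>
#include <algorithm>

// Sketch: clamp requested texture dimensions to the device limits.
// Old Voodoo-class drivers report very small maximums, and textures
// beyond them may be rejected or shuffled into system RAM.
void ClampToDeviceLimits(IDirect3DDevice9* device, UINT* width, UINT* height)
{
    D3DCAPS9 caps;
    device->GetDeviceCaps(&caps);

    *width  = std::min(*width,  static_cast<UINT>(caps.MaxTextureWidth));
    *height = std::min(*height, static_cast<UINT>(caps.MaxTextureHeight));

    // Many old chips also insist on square textures.
    if (caps.TextureCaps & D3DPTEXTURECAPS_SQUAREONLY)
        *width = *height = std::min(*width, *height);
}
```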
Ethereal
Quote:Original post by Metus
voodoo you say...
check the caps for the texture size and the dimensions of your textures. If you're in bad luck, the textures you create will be stored in system RAM, because the Voodoo chipset has a texture dimension limit of 512*512 (I think)

Wasn't that 256x256? That's why they sucked so badly compared to NVidia's Riva TNT 1 (which supported 1024x1024).
Also, the Voodoo only ran hardware accelerated in 16-bit fullscreen mode. Windowed apps weren't hardware accelerated at all, IIRC.
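For what it's worth, you can ask the runtime up front whether a 16-bit fullscreen HAL device is even available (just a sketch):

```cpp
#include <d3d9.h>

// Sketch: query whether the HAL can do a 16-bit fullscreen device,
// which is the only mode the old Voodoo parts accelerate.
bool Supports16BitFullscreen(IDirect3D9* d3d)
{
    return SUCCEEDED(d3d->CheckDeviceType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                          D3DFMT_R5G6B5,   // display mode
                                          D3DFMT_R5G6B5,   // back buffer
                                          FALSE));         // windowed = FALSE
}
```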
Quote:Original post by darookie
Wasn't that 256x256? That's why they sucked so badly compared to NVidia's Riva TNT 1 (which supported 1024x1024).
Also, the Voodoo only ran hardware accelerated in 16-bit fullscreen mode. Windowed apps weren't hardware accelerated at all, IIRC.


hmm.. you might be right. If that's the case, perhaps we solved the problem for vicviper.
Ethereal
Quote:Wasn't that 256x256?
That's definitely true of the Voodoo-3 cards; I thought the Voodoo-2s were worse (128?)... I, luckily, did very little development on that sort of hardware. Feel free to prove me wrong :-)

The Voodoo cards were amazing when they led the field - when they managed to get market share pretty much just for being the first to market. However, over time they were completely flattened by the competition. I, for one, as a developer more than a gamer, am not sorry to see them go [grin].


I still maintain that with a bit of analysis you'll probably find that you're causing the runtime to do some compatibility/conversion/whatever work that kills your performance. Tailor the app/engine to the hardware and you're likely to get what you originally expected.

hth
Jack

Jack Hoxley [ Forum FAQ | Revised FAQ | MVP Profile | Developer Journal ]

This topic is closed to new replies.
