Frame Rate + Hardware Acceleration Problem [SOLVED]

Started by
44 comments, last by Nads 16 years, 1 month ago
Hi guys, I'm afraid I need your help again! Recently I had a scene with a few mesh models (.x); some were taken from DirectX and some were not. I also had a skybox in place with a texture resolution of 1024x1024. The game was running fine and I was getting about 60 FPS.

But recently I wanted to update my game scene with more models (to make it look like a real game scene), so I got some models from the internet: two different houses, some 3D grass, and a car model. All of these were in 3ds Max format, so I converted them to .x format using the Panda plugin. Now that I have put them into my game, I'm getting a horrible FPS of about 12?! The game is slowing down dramatically too.

I know there is probably a variety of reasons why this might be happening, and it could be difficult to pinpoint what is going wrong. But does anyone have any ideas? Is there a problem with my code, or is it just that the models were most likely converted/built badly (too high a poly count)?

I suspected the second option, so I commented out the draw calls for the car and houses, and I got back to 60 FPS. Then I replaced those models with the ones I was already using; the FPS didn't fall as badly, but it still fell a bit, and the more times I drew those models on screen, the further the FPS dropped!

How can I combat this? Surely there must be a way, as games nowadays have thousands of models on screen without any slowdown, and my game starts to slow down with only around 6-7 models! I know that most games use a lower level of detail for models far away in the scene to save memory, but I don't think my scene is too big; it seems reasonably sized for a small scene.

I can send code over for review if required, or post some here? I can also post screenshots if someone tells me how to upload them here :D

Thanks everyone in advance for the help and suggestions!

[Edited by - Nads on February 29, 2008 9:41:14 AM]
How many triangles are in your models? And how many subsets? The more subsets there are, the worse performance will be, since making repeated DrawSubset() calls is particularly expensive as it breaks batching.

Also, are you using the Debug runtimes? Any warnings or errors from them?
Quote:Original post by Evil Steve
How many triangles are in your models? And how many subsets? The more subsets there are, the worse performance will be, since making repeated DrawSubset() calls is particularly expensive as it breaks batching.

Also, are you using the Debug runtimes? Any warnings or errors from them?


Hi EvilSteve,

Thanks for the help.

I'm not getting any errors or warnings currently, but I will make sure I'm using the debug runtimes when I go home tonight! If I get any errors or warnings then, I'll let you know.

I'm not sure how many triangles and subsets are in the models; can I check them using the DirectX model viewer?

I'm not making particularly many calls to DrawSubset(); I would say about 15 calls every frame. Is that too much?

Just so that you know, I am creating all my models and textures in the initialise-D3D function in my main.cpp; it is called once, just before entering the main run loop.

Then in my render loop function I am calling the function to draw the models onto the scene!

thanks.
How big are the textures on these models? Would they all fit in video ram? If not that can cause big slowdowns. If that's the problem then resizing the textures, or converting them to DXT should help.

What video card are you running this on?
Quote:Original post by Adam_42
How big are the textures on these models? Would they all fit in video ram? If not that can cause big slowdowns. If that's the problem then resizing the textures, or converting them to DXT should help.

What video card are you running this on?


Hi Adam,

I don't think the textures are causing the problem, because the texture files are only a few KB each and very small. Even then, only two of the models use textures; the rest don't!

I have tried it on two cards. I generally program on my laptop, which I use for all my development; I think it has an ATI X1400.

The desktop I tried it on (a work computer) had a GeForce 6100 nForce 430.

Although neither is very high spec, my laptop is good enough to run most slightly older games such as Half-Life 2, World of Warcraft, etc.

I think the problem might lie in the actual models (as Evil Steve mentioned) or in my code!

thanks
15 DrawSubset() calls per frame is fine; if it was 1,500 then I'd start to worry. Even 150 shouldn't be too bad [smile]

You could try running NVPerfHUD (which you can get from NVIDIA); that'll tell you where the bottleneck is.


Hi EvilSteve,

I played around with a few things, and I also have some questions about the debug runtimes!


Firstly, I think it is the models that were giving me low FPS. I loaded the .x files in the DirectX viewer and ticked the statistics checkbox; it showed that the models I had earlier from the DX SDK (which worked fine in the game at 60 FPS) were all displayed at 60 FPS in the viewer too!

I then tried the other models one by one and got much lower FPS for all of them; some were 5-10 FPS lower, while others were at 30 FPS!

I guess that's why my game is slowing down? What's the solution: better models, or a better graphics card and laptop?


Now, on to the debug runtimes: I enabled them from the properties, but there are a few problems.

The first is that when I tick the "Allow Hardware Acceleration" box and run my program, I get an error:

Access violation reading location 0x00000000.

It breaks at this line in my render frame function:
d3ddev->Clear(0, NULL, D3DCLEAR_TARGET,  D3DCOLOR_XRGB(0, 0, 0), 1.0f, 0);


The second problem is when I tick "Use Debug Version of D3D". This works; it compiles and runs.

But there are some odd things happening. For example, I have a red spotlight set in my scene, pointing at some boxes in the middle. When I enable the debug version of D3D, I don't get a red spotlight any more (it is supposed to appear on pressing the Enter key); instead, a quarter of the scene lights up with a bright white light!

Any idea what could be wrong?

Also, after I exit the program, at the end of the debug output I notice the line:

Direct3D9: (ERROR) :Total Memory Unfreed From Current Process = 48505356 bytes

That doesn't look too healthy?

thanks
It sounds like you grabbed random models off the internet, so I'm guessing they are really high poly.

To find the vertex counts, right-click the files and open them in Notepad; .x is a text file. The vertex count appears as a number towards the top, and if the models are the problem it'll be in the thousands.

NBA2K, Madden, Maneater, Killing Floor, Sims http://www.pawlowskipinball.com/pinballeternal

Often you will find (in my case, every time) that visual discrepancies when switching between the debug and release runtimes come down to missing state changes that cause different behaviour. You should focus on tracking these down to make sure you don't compound them with further development.

A very useful tool for both performance tracking and inspecting render state is PIX. It was originally an Xbox 360 performance tool, and there is now a PC version oriented around graphical debugging and performance. Using it has been a little hit or miss on my laptops (it seems somewhat prone to crashing), but when it works it provides very useful information, especially when debugging pixel and vertex shaders.

PIX will show you how long each draw call or render state change takes, and you can get information on the parameters passed, and even wireframe views of draw calls pre- and post-transformation.
The debug runtimes will try to break things. Sort of. They're just extremely uncooperative, which is a good thing: they're there to warn you when something only works because your particular card or driver is being nice to you.
The debug runtimes will be slower (obviously), so you shouldn't use them for performance testing, but they should be on most of the time you're developing, to be sure that your app will work on most hardware.
One example of what they do is fill surfaces with known data when you Lock() them with flags saying "I will write to the entire surface" (e.g. D3DLOCK_DISCARD), so if you accidentally rely on any values being kept, they get trashed and you'll notice. (This is particularly noticeable if you don't clear your backbuffer and use D3DSWAPEFFECT_DISCARD; the debug runtimes will flash the backbuffer green and magenta, where the release ones will probably "just work".)

The access violation reading 0x00000000 means you have a null pointer. If it's on that line, your D3D device is null. If you look at the debug output you should see messages like the ones in that link I posted ("Direct3D9: (ERROR) Blah"); that'll tell you why your device pointer is null (assuming it's null because CreateDevice failed, and not because you Release()'d it and then set it to null).
A little off-topic here: you should normally clear the backbuffer and Z-buffer in one go by OR-ing the flags together. It's better for performance, since it means one clear instead of two, i.e. your line of code should be:
d3ddev->Clear(0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER, D3DCOLOR_XRGB(0, 0, 0), 1.0f, 0);

As for the spotlight bug, I'm not really sure. Are there any relevant warnings or messages in the debug output? Did you pass sensible parameters to SetLight() (e.g. are the angles for the spotlight in radians rather than degrees)?

The last thing means you have a memory leak in your app. Unfortunately, leaks are a bit difficult to track down, because failing to release one surface can keep loads of other resources alive, since they all hold references to each other.
I'd recommend first checking that every time you get a pointer to an interface from D3D, whether from a Create*() call or a Get*() call, you Release() it. If you can't spot the problem, start commenting out chunks of your code until the leak goes away; when it does, you know the last thing you commented out was causing it.

This topic is closed to new replies.
