Engine speeds up when Crysis Sandbox 2 is running in background - how??

RobMaddison    1151
This is going to sound a bit strange and I'm not even sure which forum to ask my question in, but here goes: when my engine starts up (terrain rendering only at the moment), the FPS I get at the starting position is around 1500fps. Just before shutting my machine off for the night, I tried it again and it had dropped to around 900fps, but nothing had changed.

Well, that's not strictly true... Whilst doing some further development earlier, I happened to have the Crysis Sandbox 2 editor open in the background (nothing loaded). When I close the Crysis editor, my frame rate drops to almost half what it is when the editor is running in the background. When I open the editor and run my engine, the framerate is blisteringly quick again.

How can running the Crysis editor in the background make my framerate rocket? I'm fairly certain my framerate calcs are right, and there is definitely a noticeable difference in feel when the framerate drops (my movement isn't tied to framerate - or rather ms per frame - so I do notice framerate changes). I'm stumped. Anyone seen this before?

I'm using DirectX 9 and vs/ps 3.0. Thanks

Classless    130
Just my guess: nowadays video drivers often come with optimisations for a predefined set of applications (read: popular 3D games). If the driver detects that an application from that set is running, some special fast paths for that particular application get turned on, and that may happen to speed up your application as well.

And going from 900fps to 1500fps is not such a great speed-up, actually. Remember that FPS is not linear, so the actual frame-time improvement is 1/900 - 1/1500 ~= 0.00044 s = 0.44 milliseconds.

RobMaddison    1151
I had wondered that too, or possibly if the Crytek engine switches on some kind of optimization on the GPU rather than the other way round?

The difference between 1500fps and 900fps is only 0.44ms, as you say, but looking at one of the most detailed parts of the map, the frame rate is 750fps with Sandbox 2 running and only 350fps without. That takes the frame time from roughly 1.3ms to 2.9ms per frame - more than double.
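(A throwaway snippet, just to make the FPS-to-frame-time arithmetic explicit - nothing here is engine code:)

#include <cstdio>

int main()
{
    // FPS pairs being discussed: with Sandbox 2 running vs. without it
    const double fps[2][2] = { { 1500.0, 900.0 }, { 750.0, 350.0 } };
    for (int i = 0; i < 2; ++i)
    {
        double msFast = 1000.0 / fps[i][0];   // frame time at the higher FPS
        double msSlow = 1000.0 / fps[i][1];   // frame time at the lower FPS
        printf("%.0ffps -> %.0ffps: %.2fms -> %.2fms (difference %.2fms)\n",
               fps[i][0], fps[i][1], msFast, msSlow, msSlow - msFast);
    }
    return 0;
}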

There must be something else it's doing - has anyone else noticed this? Or if you have Crysis, could you try your app with it running in the background? (I'm running mine from within Visual Studio, but in release/full-screen mode.)

Otherwise I may have to bundle a copy of Crysis with my game :)

Thanks

Adam_42    3629
Crysis could be calling timeBeginPeriod(1). That can have some weird effects on frame rate, especially if you're using GetTickCount() or timeGetTime() for your timer.
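(To illustrate the kind of effect Adam means, here's a minimal standalone sketch - not from the thread - that samples the smallest step timeGetTime() reports before and after timeBeginPeriod(1). Link with winmm.lib.)

#include <windows.h>
#include <mmsystem.h>
#include <cstdio>

// Spin on timeGetTime() and report the smallest non-zero step observed,
// which roughly shows the current system timer granularity.
static void SampleGranularity(const char* label)
{
    DWORD last = timeGetTime();
    DWORD minStep = 1000;
    for (int i = 0; i < 200000; ++i)
    {
        DWORD now = timeGetTime();
        if (now != last && now - last < minStep)
            minStep = now - last;
        last = now;
    }
    printf("%s: smallest observed step = %lu ms\n", label, (unsigned long)minStep);
}

int main()
{
    SampleGranularity("default resolution");

    timeBeginPeriod(1);                 // request 1ms timer resolution (system-wide on older Windows)
    SampleGranularity("after timeBeginPeriod(1)");
    timeEndPeriod(1);                   // always pair with timeEndPeriod
    return 0;
}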

Evil Steve    2017
As Adam_42 hinted: how are you measuring FPS? The best way is to count the number of frames rendered over a fixed time step (say 1 second for a very steady reading), and use that to calculate the FPS.
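(A minimal sketch of that approach, using the high-resolution counter - the class name and layout are just illustrative:)

#include <windows.h>

// Counts frames over a ~1 second window and derives FPS from the count.
class FpsCounter
{
public:
    FpsCounter() : m_frames(0), m_fps(0.0f)
    {
        QueryPerformanceFrequency(&m_freq);
        QueryPerformanceCounter(&m_windowStart);
    }

    // Call once per rendered frame.
    void OnFrame()
    {
        ++m_frames;
        LARGE_INTEGER now;
        QueryPerformanceCounter(&now);
        double elapsed = double(now.QuadPart - m_windowStart.QuadPart) / double(m_freq.QuadPart);
        if (elapsed >= 1.0)             // window complete: publish FPS and reset
        {
            m_fps = float(m_frames / elapsed);
            m_frames = 0;
            m_windowStart = now;
        }
    }

    float GetFps() const { return m_fps; }

private:
    LARGE_INTEGER m_freq;
    LARGE_INTEGER m_windowStart;
    int           m_frames;
    float         m_fps;
};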

RobMaddison    1151
Quote:
How are you measuring FPS?


Exactly as you suggested, Steve. I've just been reading about timeBeginPeriod (thanks, Adam) and it looks like this could be it. I read on one website that people run Windows Media Player (doing nothing) behind games to increase the frame rate (or at least make it look that way). It appears that some people believe it actually increases the physical frame rate (or decreases the ms/frame) and some don't.

It's definitely worth a try at least. The question is, though: if timeBeginPeriod in my app does give an increased frame rate, which one is the real one? If timeBeginPeriod gives a more accurate (or granular) timer measurement, can I rely on that being the true frame rate?

Evil Steve    2017
Quote:
Original post by RobMaddison
It's definitely worth a try at least. The question is, though: if timeBeginPeriod in my app does give an increased frame rate, which one is the real one? If timeBeginPeriod gives a more accurate (or granular) timer measurement, can I rely on that being the true frame rate?
If you're measuring over a short period, then the timer inaccuracy will be more evident (i.e. measuring the number of frames rendered in 10ms with a timer accurate to 15ms will have a lot of error, but measuring the number of frames rendered in 1000ms won't).

If you measure the number of frames drawn in a second, then the timer function you're using shouldn't matter too much (unless it returns a value in whole seconds, of course [smile]).

timeBeginPeriod() just increases the resolution of some timing functions; it won't make the game actually faster.

ddlox    168
As you can see for yourself in the MSDN:
"...A lower value specifies a higher (more accurate) resolution..."

So that means that IF your calculations are right, you are seeing the correct and true speed of your engine - more precisely measured, but not actually faster (Statement 1).

Try using the high-resolution performance counter (QueryPerformanceFrequency/QueryPerformanceCounter, etc.):
- if these functions return the same values (before and after running the Crysis editor) then YOUR FPS CALCULATIONS ARE RIGHT and Statement 1 is true for your engine;
- if these functions return different values (before and after running the Crysis editor) then YOUR FPS CALCULATIONS ARE WRONG and Statement 1 is false for your engine.

MSDN:
"... Setting a higher resolution does not improve the accuracy of the high-resolution performance counter..."
So that means QPF/QPC should remain unaffected by timeBeginPeriod.
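(One rough way to run that check - a standalone sketch, not the engine's timer code - is to time the same fixed interval with both timeGetTime() and the performance counter, once with the editor closed and once with it open. Both timers measure the same real interval, so a large disagreement between them points at timer granularity rather than the engine genuinely running faster. Link with winmm.lib.)

#include <windows.h>
#include <mmsystem.h>
#include <cstdio>

int main()
{
    LARGE_INTEGER freq, qpcStart, qpcEnd;
    QueryPerformanceFrequency(&freq);

    DWORD tgtStart = timeGetTime();
    QueryPerformanceCounter(&qpcStart);

    Sleep(500);                         // a fixed, known interval to measure

    QueryPerformanceCounter(&qpcEnd);
    DWORD tgtEnd = timeGetTime();

    double qpcMs = 1000.0 * double(qpcEnd.QuadPart - qpcStart.QuadPart) / double(freq.QuadPart);
    printf("timeGetTime: %lu ms, QueryPerformanceCounter: %.3f ms\n",
           (unsigned long)(tgtEnd - tgtStart), qpcMs);
    return 0;
}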

Cheers.

RobMaddison    1151
My sandbox project, which is where I test new functionality like terrain, lighting, etc., is based on the DXUT framework, and my FPS stats come from the DXUTGetFrameStats method (can't remember the exact signature) - all I do is output its string, but I believe it uses QPF/QPC internally.

In order to measure the ms/frame, I use a high performance counter which tracks the elapsed time since the last frame update - this appears to be pretty accurate.

Quote:
timeBeginPeriod() just increases the resolution of some timing functions; it won't make the game actually faster.


I agree - but my engine does appear to run twice as fast in places. My look direction (using the mouse) is not tied in to the frame rate or ms/frame calcs at all and there is definitely a difference when panning around in detailed areas.

In answer to ddlox: I do use QPF/QPC, but running the Crysis editor still drastically changes my frame rate, so I guess my timings are right in one state and wrong in the other. By deduction, this says to me that the Crysis editor is doing something to the GPU or to CPU timing at a lower level.

RobMaddison    1151
Apologies for bringing this up again, but I still haven't found a solution, and prior to contacting EA support I thought I'd ask once more. I've tried different ways of timing my engine and they all seem to go faster when the Crysis Editor is running in the background. As mentioned earlier, my y-axis camera rotation is not tied to framerate or ms/frame, and it does feel smoother when the Crysis Editor is running in the background - this says to me that it isn't a timing issue.

A friend made a suggestion that got me thinking: my DirectX control panel shows that DirectX is in Retail mode. Could it be that my engine somehow flips DirectX into Debug mode, but it stays in Retail mode when the Crysis Editor is running? Is there a way I can tell from my debug output whether DirectX is running in Retail or Debug mode?

(I'm using DirectX 9.0c/VS2005/Unmanaged c++)

Thanks in advance.

jollyjeffers    1570
Could it be something along the lines of the floating-point precision switch? I'm pretty sure Direct3D 9 defaults to 'fast' rather than 'accurate' FPU precision on a CreateDevice() call (unless you pass D3DCREATE_FPU_PRESERVE), so maybe the Crysis engine changes some low-level state like that which somehow leaks into your process? A bit of an unlikely long shot, but it could be worth checking.
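(If you want to see the FPU state directly, here's a rough sketch - assuming a 32-bit x86 build, since the precision-control bits only apply there - that prints the current x87 precision using <float.h>:)

#include <float.h>
#include <cstdio>

// Prints the current x87 precision-control setting. Call it before and after
// device creation (or with/without the editor running) to see if anything changed.
void ReportFpuPrecision(const char* when)
{
    unsigned int control = 0;
    _controlfp_s(&control, 0, 0);       // mask of 0 = read-only query of the control word

    const char* precision = "unknown";
    switch (control & _MCW_PC)
    {
    case _PC_24: precision = "single (24-bit)";   break;
    case _PC_53: precision = "double (53-bit)";   break;
    case _PC_64: precision = "extended (64-bit)"; break;
    }
    printf("FPU precision %s: %s\n", when, precision);
}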

Alternatively, is it possible that Crysis is altering your video card settings? They'll work closely with ATI/Nvidia and may well be using driver entry points to set some sort of performance/compatibility setting (e.g. the MSAA mode or some other performance-vs-quality option) that has a global effect. Maybe the driver is also detecting that Crysis is running and enables a different codepath - similar to the shims that got them all in trouble with benchmarks a couple of years ago. If the driver assumes only one game runs at a time, it may not be 'process safe' and could end up applying the new codepath to your application as well?!

Quote:
Is there a way I can tell from my debug output whether DirectX is running in Retail or Debug mode?
Well you won't get any D3D debug spew if you're running with retail runtimes. Otherwise I suspect you'd have to somehow call into Win32 to see which libraries you've got loaded into your process and check for the 'd' variant of d3d9.dll...
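(Along the lines Jack describes, a quick sketch - not a DXUT facility - that checks which D3D9 DLL is actually loaded in the process; call it after device creation:)

#include <windows.h>
#include <cstdio>

// Reports whether the debug ('d' variant) or retail d3d9 DLL is loaded in this process.
void ReportD3D9Runtime()
{
    if (GetModuleHandleA("d3d9d.dll") != NULL)
        printf("Debug D3D9 runtime (d3d9d.dll) is loaded.\n");
    else if (GetModuleHandleA("d3d9.dll") != NULL)
        printf("Retail D3D9 runtime (d3d9.dll) is loaded.\n");
    else
        printf("No D3D9 runtime loaded yet - call this after CreateDevice().\n");
}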


hth
Jack

