UT2Kx Engine CPU Heavy?

Does anybody have any idea why the UT2k4 engine is so CPU-heavy compared to other games, which seem to be more GPU-dependent? Is this necessary for some game feature, or is it down to poor programming on Epic's part? Thanks.

I would hazard a guess that it is a design feature and not a mistake on Epic's part, simply because that engine has been in development for, like, forever, and during that time it would have been kind of hard for them to miss a problem that big, if they actually considered it a problem.

Although I could be wrong, as I'm just guessing there.

I fail to see how this is a design "problem" or a flaw in any way. Nowadays the video card is often the most expensive part in a computer, so not everyone is willing to upgrade it every three months. A new Athlon XP 3000+ will only cost you $150, whereas a new ATI X800 will cost you $450. A new CPU can also be easier to justify, since it will speed up everything to some extent, not just games.

OK, well, I really find UT2K4, for instance, far too CPU-bottlenecked, or rather, I suspect, memory-bandwidth-bottlenecked. When I'm at ~3 GHz (400 FSB) with a 9800 running at Pro clocks, I'm not satisfied with the way it performs. However, if I overclock my CPU FSB-wise to, say, 460 FSB, even while downclocking the graphics card, there is a vast increase in performance.
CPU performance of this caliber is not required by other FPS games of a similar generation, so what I'm really asking is: does anyone know which features of the UT2k4 engine result in this performance characteristic?

The CPU is the bottleneck in all the engines out there; it's just a question of to what extent.

The sound driver (at least under Mac OS X) appears to drag down the framerate more than anything else. Try turning it off; I'd love to see if sound is stupidly handled entirely on the CPU under NT or Linux.
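
As an illustration of why a purely software audio path costs CPU cycles (a toy sketch of my own, not the engine's actual mixer): every active sound has to be scaled and summed into the output buffer by the CPU each frame.

```cpp
// Hypothetical sketch of a purely software audio mixer, NOT UT2k4's actual code:
// every active voice is summed into the output buffer by the CPU, so dozens of
// simultaneous sounds cost cycles that would otherwise go to the game or renderer.
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

struct Voice {
    const int16_t* samples;   // decoded PCM data
    std::size_t    length;    // number of samples
    std::size_t    cursor;    // current playback position
    float          gain;      // per-voice volume
};

void mixVoices(std::vector<Voice>& voices, float* outBuffer, std::size_t frames) {
    std::fill(outBuffer, outBuffer + frames, 0.0f);
    for (Voice& v : voices) {
        std::size_t n = std::min(frames, v.length - v.cursor);
        for (std::size_t i = 0; i < n; ++i) {          // per-sample work on the CPU
            outBuffer[i] += v.gain * (v.samples[v.cursor + i] / 32768.0f);
        }
        v.cursor += n;
    }
}
```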

OK, so if we look at AI as a cause: in a multiplayer game (I'm talking about online play) there is no AI needed in the conventional bot sense, so it can't be that.

Quote:
The CPU is the bottleneck in all the engines out there; it's just a question of to what extent.
Exactly my point, but the question then is: why is the CPU more of a bottleneck in UT2k4 than in other games?

I'll agree with MadMan on that one. Of course, now you've got me curious. I get a steady framerate well above 30 (I haven't actually measured it in a while, but it's definitely smooth motion) with the graphics turned all the way up on my dual Xeon 2.4 GHz box with a Radeon 9700 (non-Pro). Here's a way to test our hypothesis: do you have the same problem while playing with only human players, or while playing with the AI running on a separate server?

(Edit: and if it's not the AI, then what happens when you turn the graphics all the way down?)

Guest Anonymous Poster
I'd say it's CPU-bottlenecked because pretty much the whole thing runs off Unreal*Script*.
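
For what it's worth, here is a rough illustration (my own toy example, not UnrealScript's actual VM) of why interpreted script code leans on the CPU more than native code: every virtual instruction pays for a fetch and a dispatch before any real work happens.

```cpp
// Illustrative only: a toy bytecode interpreter loop, not UnrealScript's actual VM.
// Each virtual instruction pays for a fetch, a switch dispatch and stack traffic,
// which is why script-heavy gameplay code tends to lean on the CPU.
#include <cstddef>
#include <cstdint>
#include <vector>

enum Op : uint8_t { PUSH, ADD, MUL, HALT };

int run(const std::vector<uint8_t>& code) {
    std::vector<int> stack;
    std::size_t pc = 0;
    for (;;) {
        switch (code[pc++]) {                     // dispatch overhead on every op
            case PUSH: stack.push_back(code[pc++]); break;
            case ADD:  { int b = stack.back(); stack.pop_back(); stack.back() += b; } break;
            case MUL:  { int b = stack.back(); stack.pop_back(); stack.back() *= b; } break;
            case HALT: return stack.back();
        }
    }
}

// Usage: run({PUSH, 6, PUSH, 7, MUL, HALT}) returns 42 after a handful of dispatches,
// where native code would do the same work in a couple of machine instructions.
```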

Well, when the graphics are turned all the way down, there is definitely an increase in performance, though not as big as I would expect for sacrificing that much eye candy. In general I run at 1024x768 with medium textures and pretty much all the other settings set to "off". I do get a lot more than 30 fps, but I'm just surprised that, with the FSB/CPU clock already as high as it is (over 3 GHz with a 400 FSB), a small FSB increase can make such a drastic difference to the game's performance (and that includes multiplayer with no bots involved).

AP: I'm pretty sure that UnrealScript gets compiled to bytecode which is then JIT-compiled to native assembly (which is why the load times for Onslaught maps are so much higher than those for regular deathmatch maps).

FSB and RAM access are tightly bound. Maybe the bottleneck you're seeing is a RAM bottleneck? Unfortunately, I can't think of any good way to test this...
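
One crude way to probe it (my own sketch, unrelated to the engine): time a pass over a buffer far larger than the CPU caches at the stock FSB, and again at the overclocked FSB. If the measured MB/s scales by roughly the same ratio as the game's framerate, memory bandwidth is a likely suspect.

```cpp
// Rough memory-bandwidth probe (a quick sketch, not a rigorous benchmark):
// stream through a buffer far larger than the CPU caches and report MB/s.
// Run it before and after raising the FSB and compare the numbers.
#include <chrono>
#include <cstddef>
#include <cstdio>
#include <vector>

int main() {
    const std::size_t bytes = 256u * 1024u * 1024u;      // 256 MB, well beyond any cache
    std::vector<unsigned char> buf(bytes, 1);

    volatile unsigned long long sink = 0;                // keep the reads from being optimized away
    auto t0 = std::chrono::steady_clock::now();
    unsigned long long sum = 0;
    for (std::size_t i = 0; i < bytes; i += 64)          // one touch per 64-byte cache line
        sum += buf[i];
    sink = sum;
    auto t1 = std::chrono::steady_clock::now();

    double secs = std::chrono::duration<double>(t1 - t0).count();
    std::printf("approx read bandwidth: %.1f MB/s\n", (bytes / (1024.0 * 1024.0)) / secs);
    return static_cast<int>(sink & 1);                   // use sink so the loop isn't elided
}
```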

Yes, sorry; what I should have said instead of merely "CPU heavy" is that it is bottlenecked by the CPU/memory subsystem far more than by the graphics card.

I believe that the engine uses the software renderer to a very large extent. It was targeted at people who have fast CPUs but lower-end GPUs, so a lot of the rendering calculations are handled on the CPU. This would explain why, in benchmarks on a 6800 Ultra, you'll get framerates that hit 75 and stop: CPU-limited.
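
For context, here is a toy illustration (not Pixomatic and not engine code) of why a software rendering path is CPU-bound: every covered pixel has to be shaded and written by the CPU, so the cost scales with resolution and overdraw rather than with GPU speed.

```cpp
// Toy illustration (not Pixomatic, not UT2k4 code): a flat-shaded rectangle fill
// done entirely on the CPU. The work grows with resolution and overdraw, which is
// why software rendering shifts the bottleneck from the GPU to the CPU and memory bus.
#include <cstddef>
#include <cstdint>
#include <vector>

void fillRect(std::vector<uint32_t>& framebuffer, int fbWidth,
              int x0, int y0, int x1, int y1, uint32_t color) {
    for (int y = y0; y < y1; ++y)
        for (int x = x0; x < x1; ++x)
            framebuffer[static_cast<std::size_t>(y) * fbWidth + x] = color;  // one CPU write per pixel
}

// At 1024x768 a single full-screen layer is roughly 786k pixel writes per frame;
// a few layers of overdraw at 60+ fps adds up to well over a hundred million
// writes per second, all handled by the CPU.
```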

Framerates hitting 75 and stopping sounds like a vsync issue. There is also the fact that online play caps framerates at 85 as well, I think. If a large part of the engine does graphical calculations in software, that could explain a lot, but I would have thought Epic would have made the engine more scalable than that, branching into hardware calculation when good enough hardware is present.
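
To make the "hits 75 and stops" observation concrete, here is a generic frame-cap sketch (my own illustration, not the engine's vsync or netcode cap), assuming a hypothetical 75 Hz ceiling: once each frame finishes ahead of the cap interval, the loop just waits, so the reported framerate flatlines at the cap.

```cpp
// Generic frame-cap sketch (my own illustration, not UT2k4's actual cap or vsync code):
// once the work per frame is cheaper than the cap interval, the loop simply sleeps,
// and the measured framerate sits at the cap no matter how fast the hardware is.
#include <chrono>
#include <thread>

void runLoop(double capHz) {
    using clock = std::chrono::steady_clock;
    const auto frameBudget = std::chrono::duration<double>(1.0 / capHz);
    auto next = clock::now();
    for (int frame = 0; frame < 1000; ++frame) {
        // updateAndRender();                          // hypothetical per-frame work
        next += std::chrono::duration_cast<clock::duration>(frameBudget);
        std::this_thread::sleep_until(next);           // wait out the rest of the frame budget
    }
}

// runLoop(75.0) will never report more than ~75 fps, which looks exactly like
// "framerates that hit 75 and stop" in a benchmark.
```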

I find it hard to believe they would choose to do things in software that they can do in hardware, especially on such a high-profile engine. Now, there is a software renderer for at least one of the UT2Kx versions, but that's optional, not forced.
A 3 GHz CPU with a 400 MHz bus sounds like a Celeron (the new P4s are 800 MHz and the old ones 533 MHz), and while the new Celerons aren't as bad as their predecessors, they're in no way a top-of-the-line CPU. The game could very well be bandwidth-starved, seeing as you get a large increase in performance when overclocking the FSB.

Quote:
http://www.unrealtechnology.com/html/technology/ue2.shtml
# Full support for DirectX8 class video cards (including ATI® RADEON™- and NVIDIA® GeForce™-class cards).
# Fallback rendering support for DirectX6 video cards as far back as the NVidia TNT.
# Rendering subsystems include Direct3D, OpenGL, and RAD Game Tools' Pixomatic software renderer for Windows PCs – bundled with Unreal Engine 2 at no additional cost. The inclusion of software rendering guarantees that any PC with a reasonable CPU will be able to run Unreal Engine 2, regardless of 3D card support.

Couldn't you disable occlusion culling and use pure frustum culling instead?
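
For reference, "pure frustum culling" just tests each object's bounds against the six view-frustum planes and skips anything outside; a minimal sketch of my own (not the engine's culling code) is below.

```cpp
// Minimal sphere-vs-frustum test (my own sketch, not the engine's culling code):
// an object is skipped if its bounding sphere lies entirely behind any of the six
// view-frustum planes. Cheap per object, but unlike occlusion culling it cannot
// reject things that are inside the frustum yet hidden behind walls.
struct Plane  { float nx, ny, nz, d; };          // plane: n.p + d = 0, normal points inward
struct Sphere { float cx, cy, cz, radius; };

bool sphereInFrustum(const Sphere& s, const Plane planes[6]) {
    for (int i = 0; i < 6; ++i) {
        float dist = planes[i].nx * s.cx + planes[i].ny * s.cy
                   + planes[i].nz * s.cz + planes[i].d;
        if (dist < -s.radius)                    // completely behind this plane: cull it
            return false;
    }
    return true;                                 // potentially visible: draw it
}
```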

Quote:
I find it hard to believe they would choose to do things in software that they can do in hardware, especially on such a high-profile engine. Now, there is a software renderer for at least one of the UT2Kx versions, but that's optional, not forced.
A 3 GHz CPU with a 400 MHz bus sounds like a Celeron (the new P4s are 800 MHz and the old ones 533 MHz), and while the new Celerons aren't as bad as their predecessors, they're in no way a top-of-the-line CPU. The game could very well be bandwidth-starved, seeing as you get a large increase in performance when overclocking the FSB.

Hi, my mate actually wrote the original post and mis-specified things a little. He has an AMD Barton 3000+, so he was equating it to a 3 GHz P4; it is a 400 FSB though. The rest of the information holds, i.e. small increases in CPU-subsystem speed give the greatest performance boost, even at already high CPU speeds.

Quote:
It's DirectX 7. As in, no shaders. As in, no GPU.
Well, frankly, that is just rubbish :D UT2k3 has a large amount of DX8 support and uses shaders to speed up the processing of a lot of features.

Quote:
Couldn't you disable occlusion culling and use pure frustum culling instead?
Erm... maybe :D Is there an .ini setting to do this? Although wouldn't turning off their own occlusion processing be a bad idea, likely to diminish performance greatly?

Quote:
Original post by BigBadBob
Hi, my mate actually wrote the original post and mis-specified things a little. He has an AMD Barton 3000+, so he was equating it to a 3 GHz P4; it is a 400 FSB though. The rest of the information holds, i.e. small increases in CPU-subsystem speed give the greatest performance boost, even at already high CPU speeds.


Ah, well, the argument still holds, if somewhat less strongly. The Athlon XP 3000+ roughly compares to a 2.8 GHz P4 (AMD got a little over-enthusiastic in the later days of the XP line; that's fixed now, with the Athlon 64 performing better than its PR rating most of the time). Overclocking it from 2.16 GHz to 2.5 GHz gains roughly 16% in clock speed (2.5 / 2.16 ≈ 1.16), which isn't a bad boost, though perhaps not the same boost as the same increase on a P4 system, since the XP line isn't as memory-starved as the P4.
Considering that he runs the game at 1024x768, a resolution where his graphics card should be nowhere near strained (at least not unless AA and AF are enabled), it is to be expected that a boost in CPU speed will have a much greater impact than overclocking the graphics card.
He can probably increase the resolution (or turn on AA and AF) without much performance loss.
