
GPU drivers


9 replies to this topic

#1 dpadam450   Members   -  Reputation: 862

Posted 25 November 2011 - 07:49 PM

I just started Mass Effect 2. It ran at mainly 25 fps, but would jump up to 60 or down to 10 throughout play. The whole game was laggy. I updated the drivers and it's a solid 60 fps now. How is it that drivers are that bad? What exactly do they dictate to get that much more speed? That is a huge performance increase. Shouldn't drivers only need small updates, like new features? I would think these would get carried over to each new generation of cards and be fairly solid, but I'm obviously missing something if they fixed a bug or whatever and increased the card's performance by 250% for this game.


#2 irreversible   Crossbones+   -  Reputation: 1242

Posted 26 November 2011 - 05:10 AM

I just started Mass Effect 2. It ran at mainly 25 fps, but would jump up to 60 or down to 10 throughout play. The whole game was laggy. I updated the drivers and it's a solid 60 fps now. How is it that drivers are that bad? What exactly do they dictate to get that much more speed? That is a huge performance increase. Shouldn't drivers only need small updates, like new features? I would think these would get carried over to each new generation of cards and be fairly solid, but I'm obviously missing something if they fixed a bug or whatever and increased the card's performance by 250% for this game.


For whatever ungodly reason you seem to be assuming that the drivers were written correctly in the first place.

#3 Hodgman   Moderators   -  Reputation: 28613

Posted 26 November 2011 - 06:04 AM

... increased the card's performance by 250% for this game.

I'd assume the 'bug' was resulting in the driver requiring excessive CPU-time, not actual GPU-time. It's even possible that Bioware's QA/programming departments found and reported the performance issues, resulting in better driver-side performance in their game.

#4 Promit   Moderators   -  Reputation: 6340

Posted 26 November 2011 - 11:33 AM

Heh.

So, I spent a little time at NVIDIA a few years ago. An enormous amount of the driver development effort is spent on getting new and recently released games up to full performance (I was there soon after the Vista transition, which made things worse). Any time you buy a new game, you're probably due for a driver update. And for all the major titles out there, the driver is reconfigured in subtle or not so subtle ways to make sure that game runs like it's supposed to. I know you're wondering why.

It's a fairly wide range of things, sometimes the driver's fault and sometimes the game's fault. Remember there's a bunch of possible configurations out there, and the games (like Mass Effect) are frequently brought over from the console world and carry assumptions from there. Other times they expose edges and corners in the driver, oddball code paths that aren't running quite right, or there's even a switch that has to be manually set for the correct fast path. The games also sometimes violate the specifications, and maybe they get away with it because of driver version or vendor or hardware. Then the new GPU comes out and breaks the game, and the consumers are mad.

Some of the things I remember (vaguely and distantly):
* A number of games take DISCARD locks on buffers and blithely expect the data to still be there.
* One game never called BeginScene/EndScene. Not sure how they got away with that one.
* In one instance, a shader generator was in use, and the dev team discovered a bug that was writing extra pointless instructions. Trouble is they discovered it after going gold, so they came to NVIDIA who then modified the compiler to hot-patch out the extra code just for that game.
* SLI situations are an unbelievable headache. I've personally never been convinced the technology was even worth developing for general use, and I would say that fully half of the bugs reported against the driver are SLI-related.
* Sometimes the driver skips normally mandatory steps if possible (for example, discarding a buffer when locking, which was NOT marked for it).
* Threading hassles. The driver is multi-thread aware, and depending on how things go that can be a win or a loss.

#5 hupsilardee   Members   -  Reputation: 486

Posted 26 November 2011 - 10:57 PM

Doesn't the whole situation get even worse when driver writers are hacking together fixes for individual games and stuffing them into their drivers? The driver would become a labyrinth of game-specific special-case handlers.


#ifdef MASS_EFFECT
FIX_MASS_EFFECT()
#endif

#ifdef CRYSIS_2
#endif

/// etc

* One game never called BeginScene/EndScene.


How the flying f...rog did they manage that?
Why don't devs just code properly? How is this fair on vendors, or consumers? "hey nvidia, it's crytek here, can you fix our mistake please? we issued one draw call per triangle"
Probably the marketing departments are to blame as well though, I mean, imagine the following conversation:

"hey is skyrim ready for release yet"
"no it still has more bugs than an anthill"
"but it won't be 11/11/11 again for another 100 years, just stick it on the shelves, I don't care what it's like"


Sorry. I'm up far too late, and felt like having a rant.

#6 Promit   Moderators   -  Reputation: 6340

Posted 26 November 2011 - 11:14 PM

Doesn't the whole situation get even worse when driver writers are hacking together fixes for individual games and stuffing them into their drivers? Drivers would become a labyrinth of game specific special-case handlers.

From what I remember, the NVIDIA driver clocked in at around four million lines of code supporting every GPU they ever made and practically every game worth mentioning. You'd be hard pressed to find any AAA title that large.

#7 frob   Moderators   -  Reputation: 19818

Posted 27 November 2011 - 01:13 AM


Doesn't the whole situation get even worse when driver writers are hacking together fixes for individual games and stuffing them into their drivers? Drivers would become a labyrinth of game specific special-case handlers.

From what I remember, the NVIDIA driver clocked in at around four million lines of code supporting every GPU they ever made and practically every game worth mentioning. You'd be hard pressed to find any AAA title that large.


I've worked on three major titles so far that were far larger than that. Big companies for major games have huge numbers of libraries all written from scratch. A few hundred thousand lines of networking code that runs lobbies and financial transactions and VoIP, a few hundred thousand lines of rendering code that intelligently handles a huge number of cards, etc., and you quickly reach into the millions.

Then add in all those games' tools, servers, toolchains, and such; those are another few million lines.


Check out my personal indie blog at bryanwagstaff.com.

#8 hupsilardee   Members   -  Reputation: 486

Posted 27 November 2011 - 07:43 AM


Doesn't the whole situation get even worse when driver writers are hacking together fixes for individual games and stuffing them into their drivers? Drivers would become a labyrinth of game specific special-case handlers.

From what I remember, the NVIDIA driver clocked in at around four million lines of code supporting every GPU they ever made and practically every game worth mentioning. You'd be hard pressed to find any AAA title that large.


They give every card the same driver? Why not cut out unnecessary code and rebuild for every card, thus saving memory? Or do they use #ifndef and so on to cut out huge swathes of code for different cards?

The whole situation just sounds horrifying and unsustainable to me

#9 Katie   Members   -  Reputation: 1302

Posted 27 November 2011 - 02:38 PM

Because they'd have a maintenance nightmare getting the right version installed. Particularly since they don't necessarily make the cards -- third parties use their chipsets and chip IP in other products. Plus, what's written on the card isn't necessarily enough to tell you what the silicon is; if you have rev5 or above silicon, you don't need patch 687 in a routine, because the problem was fixed in hardware...

It's a bit off expecting end users to know what revision of what chip design is installed in what memory architecture on the third party manufactured card fitted in a computer they might describe, if asked to say what kind they have, as "a white one".

It's much easier to have all the code in one lump, and then build internal structures like function jumptables of lumps of code by querying the card at boot time.

#10 Promit   Moderators   -  Reputation: 6340

Posted 27 November 2011 - 03:30 PM

Yeah, except all the builds aren't approved for all the cards. That's why the NVIDIA driver site makes you pick your exact GPU anyway.
