Let's Blame GPU Makers!

Started by RolandofGilead
19 comments, last by dwarfsoft 17 years, 11 months ago
This may sound stupid by the end, since I'm going to ignore economics. I think PC gaming isn't as popular as it could be because of the stupid GPU makers, and maybe other PC parts manufacturers as well. In another thread, someone described a card as being for the "I like to play games but I'm not a gamer" demographic. You know what I think? Those people are never gonna be gamers, because they will never have the experience that people with better systems do! There should be two cards per generation: one that can play games and one that can play games really, really well. Naturally, generations should last longer as well.
So you want to handicap the industry and limit hardware choices? Isn't that a lot like saying "The only programmers we should allow into the industry are white males ages 18-25 - everyone else gets shot; and to make sure those programmers aren't making things go too fast, we're going to smash their hands with rubber mallets to slow them down."
You're right. From now on, these companies should force developers to only write software that runs on all of their cards and runs really, really well on the top-end cards. Any program that requires too much effort from a high-end card should be blacklisted and not allowed to run. The next time my game won't get decent framerates on my graphics card, instead of rewriting it to be more efficient or less demanding, I'll just place an angry call to ATI and tell them to aim higher!

</sarcasm> (well, obviously)
Quote:Original post by RolandofGilead
At the end of this, it may sound stupid

I just thought I'd highlight this part again.
hardware innovation stifles software stability!
This space for rent.
Quote:Original post by Run_The_Shadows
So you want to handicap the industry and limit hardware choices?

No, I want them and game makers to realize that people use the things they create.
Second thought, yes, there should be good cards and great cards.
I, and presumably others, would play more games if the computer were capable of giving an experience worth my investment of time and energy. For instance, why would someone create an animation for a mod if their computer couldn't play it back fluidly in-game? In other words, PC gaming gets real old, real fast if your framerate sucks.

Quote:Isn't that a lot like saying "The only programmers we should allow into the industry are are white males ages 18-25 - everyone else gets shot; and to make sure those programmers aren't making things go too fast, we're going to smash their hands with rubber mallets to slow them down."

I can only assume you were being sarcastic.

Quote:Original post by gumpy macdrunken
hardware innovation stifles software stability!

Uh, kinda, yeah.
That is what a game console is for: peace of mind that you can just stick in a disk and it will play. This is why the console games market is much bigger than the PC games market.

A PC is all about being next gen and pushing the limits. To play the latest PC games you (usually) have to have the latest hardware. The great majority of people don't want to have to keep up, so they'll use the console for games, and a cheap ass PC for just OTHER STUFF.

IMHO, anyway. :)

- Thomas Cowell | website | journal | engine video

One thing most people don't realize is that quality/speed grades on chips are not just marketing but technical as well.

For example: NVidia comes out with the "geforce 9900" graphics card, which runs at (say) an 800MHz core clock with 64 pixel pipelines. Not all the manufactured chips will be perfect; some will only have 32 or 16 working pipelines, or will only run reliably at 500MHz. Rather than just toss those chips out and force the high-enders to eat the cost, they mark 'em down and repackage them as the "geforce 9200" or "geforce 9600".

That way, there's an option for consumers who don't want to pay a lot, and gamers can buy more high-end cards.
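
Just to make that binning idea concrete, here's a rough sketch of how tested chips might get sorted into SKUs. The product names, pipeline counts, and clock thresholds are all made up for illustration; real binning criteria are obviously far more involved.

```cpp
#include <iostream>
#include <string>

// Hypothetical result of testing one chip off the production line.
struct ChipTestResult {
    int workingPipelines;   // how many pixel pipelines passed validation
    int stableClockMHz;     // highest clock the chip ran at reliably
};

// Illustrative binning: map a tested chip onto a (made-up) product name.
// The thresholds are invented; real criteria are far more involved.
std::string binChip(const ChipTestResult& chip) {
    if (chip.workingPipelines >= 64 && chip.stableClockMHz >= 800)
        return "geforce 9900";   // fully functional, full speed
    if (chip.workingPipelines >= 32 && chip.stableClockMHz >= 650)
        return "geforce 9600";   // some pipelines fused off or clocked lower
    if (chip.workingPipelines >= 16 && chip.stableClockMHz >= 500)
        return "geforce 9200";   // heavily cut down, sold as the budget part
    return "scrap";              // not sellable at all
}

int main() {
    ChipTestResult chips[] = { {64, 820}, {32, 700}, {16, 500}, {8, 400} };
    for (const auto& c : chips)
        std::cout << c.workingPipelines << " pipes @ " << c.stableClockMHz
                  << " MHz -> " << binChip(c) << "\n";
}
```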
etothex: Yes, and sometimes they're crippled intentionally because they can't sell them all as high-end.
cow_in_the_well: Yes indeed.

Figured out why I agreed with
gumpy macdrunken: hardware innovation stifles software stability!

Do you realize just how many render paths there are in a 3D engine?
Also, thanks for bringing up consoles, as this brings me to another point: it takes developers until the end of the console cycle to truly bring out the power of the hardware, and that cycle lasts years, so new generations of cards come out faster than devs can grok them!

Now, does anyone have an opinion on my point that the mainstreamers and budget buyers don't get further into gaming because their experience is fundamentally different?
Quote:
Do you realize just how many render paths there are in a 3D engine?
Also, thanks for bringing up consoles, as this brings me to another point: it takes developers until the end of the console cycle to truly bring out the power of the hardware, and that cycle lasts years, so new generations of cards come out faster than devs can grok them!


A decent 3D engine won't have too many, because each additional path that an object can traverse to appear onscreen increases the complexity of your engine exponentially, which makes maintenance and new development nightmarish.

Consoles are an entirely different matter; they are an entirely different platform that (nowadays) happens to include some hardware elements that are almost identical to their PC counterparts. Consoles also allow you more direct access to things in many cases, which means that ultimately "bringing out the true power" of a console is a much more involved process (getting all of its component parts to operate as effectively and efficiently as possible, possibly by utilizing clever, obscure tricks) than maximizing the power of a single GPU (most of which gets done by the API or driver, and the parts that do not are usually matters of "Oh, does this card support this feature? Well then I can enable this particular eye candy effect.").
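
To put that "does this card support this feature?" bit in concrete terms, here's a little sketch of the pattern on the PC side. The caps struct, thresholds, and effect names are all invented for illustration, not any real API; the point is just that you query once and branch, rather than hand-tuning for each chip the way a console dev would.

```cpp
#include <iostream>

// Hypothetical snapshot of what the driver/API reports about the card.
struct GpuCaps {
    int  shaderModel;       // e.g. 2 or 3
    bool supportsFloatRT;   // floating-point render targets available?
    int  videoMemoryMB;
};

enum class RenderPath { Fixed, ShaderBasic, ShaderFancy };

struct RenderSettings {
    RenderPath path;
    bool hdrBloom;      // eye candy, only turned on if the card can take it
    bool softShadows;
};

// Pick a render path once at startup instead of sprinkling checks everywhere.
RenderSettings chooseSettings(const GpuCaps& caps) {
    RenderSettings s{RenderPath::Fixed, false, false};
    if (caps.shaderModel >= 3 && caps.supportsFloatRT && caps.videoMemoryMB >= 256) {
        s.path = RenderPath::ShaderFancy;
        s.hdrBloom = true;
        s.softShadows = true;
    } else if (caps.shaderModel >= 2) {
        s.path = RenderPath::ShaderBasic;
        s.softShadows = (caps.videoMemoryMB >= 128);
    }
    return s;
}

int main() {
    GpuCaps lowEnd{2, false, 128};
    GpuCaps highEnd{3, true, 512};
    RenderSettings a = chooseSettings(lowEnd);
    RenderSettings b = chooseSettings(highEnd);
    std::cout << "low-end:  bloom=" << a.hdrBloom << " shadows=" << a.softShadows << "\n";
    std::cout << "high-end: bloom=" << b.hdrBloom << " shadows=" << b.softShadows << "\n";
}
```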

There ARE good cards and great cards; you don't have to buy the newest top-end GPU to enjoy many games on the market. You can buy the last-generation models (or older) and still have an enjoyable experience. The type of person who "likes to play games but is not a gamer" is not a gamer... perhaps because they don't want to be (i.e., they fall into the massive "casual gamer" demographic that does not purchase the latest and greatest hardware upgrades the minute they become available). This market is quite large, and professional game developers make every effort to support it, because generally that's where the majority of the profit comes from (the hardcore gamer demographic is minuscule in comparison).

For these people, the upgrade cycle does last longer, because they don't upgrade as often, and when they do it may not be to the latest and greatest card. So for them, the majority, what you want is already true: they have the option of a "good card that plays games" or a "great card that plays games really well," and their upgrade cycle is longer. They often choose the former because they don't see the cost/benefit ratio of the latter as worth it.

So I really don't know where you are coming from, here.

This topic is closed to new replies.
