Is gaming going downhill?



#1 zeybey1   Members   -  Reputation: 464

Posted 26 July 2012 - 10:16 PM

How much does more advanced technology really add to a game? I know Minecraft and other innovative game designs could never run on older gaming consoles, but do better graphics and faster computing really make recent games more fun to play than old-style NES/SNES games? To me, it just makes games harder to make and play because of the higher production cost and tougher system requirements. Maybe it's just nostalgia, but older games feel, both atmospherically and fun-wise, very similar to newer ones, and since it's harder to make this current style of game, game designers take fewer risks and have to use more systematic gameplay (quest systems, campaigns) to make them easier to manage.

What are your thoughts on this?


#2 Densoro   Members   -  Reputation: 196

Posted 26 July 2012 - 10:50 PM

Maybe it's just because I'm colorblind, but I can't see a damn thing in all these games with so-called HD graphics. I've caught myself running straight at an enemy, unable to differentiate them from the wall and wondering where all the hostiles are, only to get knifed in the face. Similarly, super-advanced physics engines only really add something to games built around physics, like Portal or Half-Life 2. In your average shooter, I hardly notice that exploded barrels are rolling more realistically than they used to. They occasionally provide some fun moments, like shooting an airborne petrol tank in Crackdown, watching it fly off, then getting put on a police hitlist five minutes later... because it rained back down and clocked a civilian upside the head. But all in all, rising tech levels just add some sandbox to an otherwise normal game, and there's such a thing as too much of a good thing.

I've always preferred the graphical level of the PS2 and Gamecube, and games with rigid rules and mechanics that you have to learn to abuse xD. Custom Robo comes to mind. The levels are tiny and there's hardly any physics beyond the laughable drop straight down when you get blasted out of the air, but the sheer amount of customization and strategy makes it so much fun. The same can be said of The World Ends With You or Kingdom Hearts: Birth by Sleep. But I wouldn't say gaming is going downhill; those games delivered, and I loved every second of them.

#3 zeybey1   Members   -  Reputation: 464

Posted 26 July 2012 - 11:33 PM

Yeah, I guess it's more about developers tacking on useless additions, when the technology could be used in really cool ways.

#4 Bacterius   Crossbones+   -  Reputation: 8836

Posted 27 July 2012 - 03:31 AM

Maybe it's just because I'm colorblind, but I can't see a damn thing in all these games with so-called HD graphics.

It's not just you. The contrast in those modern games is insane - I'm looking at you, BF3 - and it's difficult to make out anything from the environment. Although I'm not sure I'm at the point where I get knifed in the face just yet! However, you have to give them some credit: in real life you don't see perfectly either, especially when running. So for realistic games, this is progress, in a sense.

The slowsort algorithm is a perfect illustration of the multiply and surrender paradigm, which is perhaps the single most important paradigm in the development of reluctant algorithms. The basic multiply and surrender strategy consists in replacing the problem at hand by two or more subproblems, each slightly simpler than the original, and continue multiplying subproblems and subsubproblems recursively in this fashion as long as possible. At some point the subproblems will all become so simple that their solution can no longer be postponed, and we will have to surrender. Experience shows that, in most cases, by the time this point is reached the total work will be substantially higher than what could have been wasted by a more direct approach.

 

- Pessimal Algorithms and Simplexity Analysis
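For anyone who hasn't read that paper, here is a minimal Python sketch of slowsort as it is usually described (not from this thread, just an illustration of the multiply-and-surrender idea; keep the input tiny, the recursion is the whole joke):

def slowsort(a, i, j):
    # "Sort" a[i..j] in place by multiplying subproblems until we must surrender.
    if i >= j:
        return
    m = (i + j) // 2
    slowsort(a, i, m)            # recursively sort the first half
    slowsort(a, m + 1, j)        # recursively sort the second half
    if a[m] > a[j]:              # move the larger of the two maxima to the end
        a[m], a[j] = a[j], a[m]
    slowsort(a, i, j - 1)        # now sort everything except the last element

data = [5, 1, 4, 2, 3]
slowsort(data, 0, len(data) - 1)
print(data)                      # [1, 2, 3, 4, 5]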


#5 Densoro   Members   -  Reputation: 196

Posted 27 July 2012 - 04:18 AM

Hahah, maybe. That's a scary thought though; I'd really rather not think camouflage is super effective against me IRL xD. But that's the thing: I think graphics have gotten more definition than real life. I can spot people as distinct shapes at a much greater distance out here than I can in Modern Warfare. It feels like they've overanimated every blade of grass and every grain of sand, and managed to make everything look 'extreme closeup with perfect sunset lighting' good from 50 meters, at which point it just becomes a Picasso painting.

#6 0BZEN   Crossbones+   -  Reputation: 2016

Posted 27 July 2012 - 04:22 AM

It's just a fact of life. Games evolve as technology and expectations evolve. The perverse effect is that development teams are far bigger, meaning more expensive, meaning a greater hunt for investment, and investors by nature try to minimise risks and maximise profits.

So yeah, there is something lost in the process, but some innovations make it through, and not necessarily via the indie scene. We don't yet have the Christopher Nolan of the games industry, and no, it's not Peter Molyneux, or Tim Schafer (who is more like a Terry Gilliam character), or John Carmack (who is mostly just interested in tech).

Some Japanese productions are more adventurous.



#7 Icebone1000   Members   -  Reputation: 1084

Posted 27 July 2012 - 10:25 AM

I don't think the problem is the technology; it's more cultural.
In the era of the mighty, unbeatable, incredible, awesome SNES (*bows three times*), we played the same game for 3+ years, and even today we still play those games in emulators.

Today's games are disposable: you play them only once, and if even that takes too long, you consider it a flaw in the game design.

I'd guess the problem is the absurd amount of accessibility due to the internet. There are just too many games, so they aren't very unique (a game will always have at least 3 or more games with the same purpose). So while in old games you'd find "new stuff"* by progressing through the levels, today you find "new stuff" by playing another game.

*By "new stuff" I mean the reward mechanism in game design that keeps the player entertained by presenting them with new elements.

#8 Arthur Souza   Members   -  Reputation: 1418

Posted 27 July 2012 - 10:42 AM

Well, obviously you can't expect that evolving technology will make games more fun. Games are fun because of game design, mechanics and story; those are helped by technology, not dependent on it.




#9 phantom   Moderators   -  Reputation: 7261

Posted 27 July 2012 - 11:02 AM

In short: no.

Let's not get sucked into 'all old games were masterpieces', because they weren't... even something like Super Mario, which soaked up hours of play time for some people, didn't really DO that much. Even when it came to 'new stuff' you got a different tileset, maybe a mild variation on enemies, and that was it... the rest was 'run right, jump on things, avoid other things'.

A couple of months ago I played, and finished, Max Payne 3.
It was, without doubt, the best gaming experience I've had. The game was polished, slick, had a great atmosphere (both audio and visual) and was above all fun. It had a story to tell, it told it, and it didn't drag it out.

In fact, I've probably finished more games in the last couple of years than I ever did during 'the old days' of gaming, which, to me, speaks volumes.

#10 Orymus3   Crossbones+   -  Reputation: 8954

Posted 27 July 2012 - 11:04 AM

I think the question you're asking is missing the point.
The availability of technology in and of itself is a good thing. Aside from a few historical problems (I'm looking at you, A-bomb), having that tech available is good.
Now, systematically using it, and setting production goals around ever-higher tech, may be more of an issue.

If you look at the older games which you (and I) seem to have very fond memories of, you'll see that team size and structure were so limited that the focus was on what game they could actually make, and, let's be honest, it was the Wild West of distribution, so aside from a few bigger titles (hey, Doom) you could always hope for some regional success. I may be distorting this a bit, as I wasn't around (did any veteran here develop during, say, the Atari era?).

The advanced graphics tech also came with the internet, and worldwide distribution that is unparalleled (digital downloads). This created what some believed impossible: an even more competitive environment.

Whether they have ideals or not, most players respond positively to a good marketing campaign, and companies can sell their product much more efficiently than the competition regardless of the gameplay. The reason for this is that it's hard, through marketing alone, to show how your gameplay is really fun. What you can easily do, however, is show how immersive your universe is by showcasing kickass visuals.
It's a lot like what you see other companies advertise. They want to give you a feeling of what your brain should think of when shown a brand, not what the brand actually does. Car companies sell you safety, a sense of risk, strength, etc. They don't give you the slightest clue about how it actually feels when you drive the car.
It would be easy to point the finger at marketing, but since that is beyond the scope of this thread, let's just say: everyone does it, so why wouldn't you? Why would you willingly cut your own sales because you're a 'rebel'? That'd be a tough sell to get past your boss :)

Lately, indies have been on the rise because, much like early-day developers, they come to the table with an idea, not powerful tech, and they build on a budget. They have a niche: simple gameplay mechanics with good production values. If they can pull it off (Minecraft, Grimrock), no one will care that their visuals or animation (respectively) are not up to par with industry standards, because players have had a blast.

With that said, however, I strongly believe that the best and most immersive games still come from the industry, but they're extremely rare. I often cite Skyrim as an example of the best game I've played in a long time. True, it's more visually stunning than Oblivion and Morrowind, but that's not why it convinced me more than those two previous titles did. The developers at Bethesda have iterated in creative ways to get to this point, and I respect the choices they've made to create a TES that's even more fun to play.

So to answer your question simply: I think the main problem is that most companies look up at successful titles and, rather than seeing that success as the product of iterating on gameplay (which is fairly complex to judge, especially if you don't take the time to play those games as a competitor), they see the stunning visuals. Because it is extremely hard to determine whether gameplay is going to work with the audience, but easier to tell whether something is visually stunning, there's a loop where companies trying to lessen their financial risk systematically opt for better visuals. The problem is that this generally leaves very small polish budgets, an aggressive release timeline, or the inability to redesign aspects of the game later in production.
Particularly successful (Blizzard) or dissident (Bethesda) companies have managed to secure sufficient funding to do more polish (and somewhat less visual flash). You rarely get to see a shooter where the recoil feedback is more important than the lush environment lighting, but it happens.

Gaming isn't going downhill, it's just taking shortcuts, but even big companies are noticing how indies are doing it; and if they don't, well, google 'game industry 2012 layoffs'.
Indies that do it right will grow bigger, just as today's large companies were once startups. It's just a phase.

#11 FableFox   Members   -  Reputation: 506

Posted 27 July 2012 - 04:11 PM

I think games these days are missing the forest for the trees. There is an article that talks about how game developers make people want to play games, and the science behind it.

Old games focused on gameplay; today's games focus on a) graphics and b) monetization (buy gold! pay to get it instantly! buy an awesome weapon from RAH!).

But of course, gameplay is as tricky as musical notation. Person A uses musical notes and makes great music; person B uses the same notes, and the result can be bad. A good game is just like good music (a la "We Are Young"). We cannot say "We Are Young" is good because it's note A, then C#, then E, and so on and so forth. It's synergy: the combination of all the gameplay elements is what makes it fun.

Just look at the Final Fantasy series. People keep saying FF7 is the best in the series (and I agree), to the point that it is now being re-released. They haven't taken a remake off the table just yet; they just want to see enough profit to justify it. And the only difference between FF7 and FF8 that fans were quite vocal about (AFAIK) is that you need to draw magic before casting it. That just shows how the fun of gameplay is the sum of its parts.

The problem with today's games is that the graphics are being focused on, not the gameplay. And those purely monetization-driven (and spam-all-your-friends) games are bad, bad, bad.

#12 Joakim1234   Members   -  Reputation: 111

Posted 27 July 2012 - 07:33 PM

It's because of profit expectations. In the old times not everyone had a computer and it wasn't all profit-oriented, so people did it more for the sake of fun, innovation and discovery, and it was more of an art than an industry, I guess. I mean, when I ran Crysis 2 on max on my brother's computer it looked photorealistic and everything, but it got boring after 30 minutes of playing. Serious Sam: The Second Encounter, however, always felt extremely fun and refreshing to play, and the monsters looked very... uhm, defined. In Crysis 2 the enemies are so packed with details, spikes, tentacles, lights, eyes and arms that I sometimes can't tell the different types apart. I hate to admit it, but Blizzard was right in what they said about Diablo 2; it was something along the lines of "We don't focus so much on high-definition graphics, but on a fresh and memorable art style, so we can create graphics that will still be acceptable even 5 years from now."

#13 laztrezort   Members   -  Reputation: 965

Posted 27 July 2012 - 08:50 PM

I think sometimes the "new games don't take risks" and "new games are not innovative like the old days" arguments are overblown. Back in the Atari and NES days, there were plenty of derivative games, and a handful of innovators. The derivatives took elements from other games, added their own flavor, and perhaps updated graphics to newer hardware - basically the same process happening today. Time and memory are kind to the handful of good games; the rest are forgotten.

but do better graphics and faster computing really make recent games more fun to play than old-style NES/SNES games?


My personal answer would be yes: in general a good modern-day game has more appeal to me than a good old one, and technology is one of the reasons (the other probably being the generally higher production values). Nostalgia aside, of course (I still buy the occasional item from GOG to relive my old favorites). There are plenty of crap games with bleeding-edge graphics, but an otherwise well-designed, fun game can only be greatly enhanced by a mixture of good art design and solid modern tech.

#14 kryotech   Members   -  Reputation: 881

Posted 27 July 2012 - 10:18 PM

My biggest issue with many new games today is that they tend to be a tweak on an old formula that is known to sell well. Shooters in particular seem to follow this trend the most. Older games had a certain appeal to them; I can't really say where that appeal came from. That being said, they do not really compare to some of the best new ones. In particular, Portal, Assassin's Creed, MGS4, and Skyrim are some games I would cite that are great because of their great use of new technology. So in short, yes, there are many games today that I believe really are better because of today's tech. The issue is that many devs use tech more for visuals than for anything else. But as the industry matures, and gamers mature, more devs will release games that make good use of tech.

#15 Orymus3   Crossbones+   -  Reputation: 8954

Posted 28 July 2012 - 01:38 PM

I think another point we're missing is that, back when visuals were 'crappy', there was a level of abstraction involved which helped suspension of disbelief. Because a face was symbolic and undetailed, it could be anyone, and the brain filled in the gaps. With the push for realism in newer-generation graphics, we're losing this. You're seeing, through the medium, exactly what the developer had in mind. This makes it a construct of their mind rather than an abstraction, and it's very hard to adapt to that and thus feel involved.
I remember playing Dragon Warrior 2 and making up an entire back story for Prince Cannock in my head, with my hero teaching him things while he caught up in levels from fighting. That made grinding surprisingly fun, just because it was abstract.

#16 phantom   Moderators   -  Reputation: 7261

Posted 28 July 2012 - 02:45 PM

This makes it a construct of their mind rather than an abstraction, and it's very hard to adapt to that and thus feel involved.


Really? Because I find it easy enough to immerse myself in games and feel involved; I'd hesitate to say more so than in the past, but certainly, going back and playing a few games from the past (such as Deus Ex), I couldn't get into them, as the quality of the graphics, compared to modern games, was just so poor as to be jarring and kept me from dropping in.

The point is there are no definitive statements which can be made here; everything depends on the person, the game, heck, even their mood when they play.
(I've had games I've tried to play which one day I've disliked and then returned to a few months later and couldn't put down).

I dare say if you could produce some kind of normalised scoring of games over the years (taken at the time of release), you'd probably find that in general the proportion of 'good' to 'bad' games at the very least remains the same - now, there might well be more in numerical terms than Back In The Day, but that's by the by.

#17 zeybey1   Members   -  Reputation: 464

Posted 29 July 2012 - 12:55 AM


This makes it a construct of their mind rather than an abstraction, and it's very hard to adapt to that and thus feel involved.


Really? Because I find it easy enough to immerse myself in games and feel involved; I'd hesitate to say more so than in the past, but certainly, going back and playing a few games from the past (such as Deus Ex), I couldn't get into them, as the quality of the graphics, compared to modern games, was just so poor as to be jarring and kept me from dropping in.

The point is there are no definitive statements which can be made here; everything depends on the person, the game, heck, even their mood when they play.
(I've had games I've tried to play which one day I've disliked and then returned to a few months later and couldn't put down).

I dare say if you could produce some kind of normalised scoring of games over the years (taken at the time of release), you'd probably find that in general the proportion of 'good' to 'bad' games at the very least remains the same - now, there might well be more in numerical terms than Back In The Day, but that's by the by.


Games today generally use better game design (no more NES LOGIC), and there are lots of very good recently made games. I just feel like the biggest race nowadays is over who has the most technologically advanced game, regardless of fun or immersion, while the technology already out there can create amazing gameplay and immersion.

Along the lines of abstraction, I've always thought it was more about what the story left out, not graphical or sound details. Final Fantasy IX has quite detailed pre-rendered backgrounds, but it is still very abstract because it doesn't explain a lot of story elements (a lot of people hated that about it, but I thought it made it much more fun to fill in the holes yourself).

#18 way2lazy2care   Members   -  Reputation: 782

Posted 29 July 2012 - 11:07 AM

How much does more advanced technology really add to a game? I know Minecraft and other innovative game designs could never run on older gaming consoles, but do better graphics and faster computing really make recent games more fun to play than old-style NES/SNES games?

They are too different, and "fun" is too vague, to really compare all of them with such a blanket statement. It was really difficult to explore drama in a meaningful way on the SNES/NES (not impossible). You could come up with "fun" games, sure, but the breadth and depth we can reach with newer technology allow us to more fully explore and realize our ideas.

To me, it just makes games harder to make and play because of the higher production cost and tougher system requirements.

A lot of newer technology is actually about making games easier to make. Look at Unity, CryEngine, and UE3/4. They're about making high quality games quickly/easily, not just about making high quality games.

#19 Lode   Members   -  Reputation: 982

Posted 30 July 2012 - 01:13 AM

these games with so-called HD graphics


It's so funny if they seriously use that term now.

I was already gaming at resolutions higher than today's "HD" in 1999. UT. On a CRT.

Anyway, games must be fast and snappy for me. I do play games like Skyrim and so on now and then, but then I really dedicate time to it (and need to close everything and specifically boot to Windows, etc.).

If I just need a quick break, I fire up something like Doom II. Nothing beats that one for monster shooting fun (and Doom I does not have the double shotgun).

Edited by Lode, 30 July 2012 - 01:26 AM.


#20 Bacterius   Crossbones+   -  Reputation: 8836

Posted 30 July 2012 - 02:08 AM

If I just need a quick break, I fire up something like Doom II. Nothing beats that one for monster shooting fun (and Doom I does not have the double shotgun).

I agree, I sometimes fire up UT1999 in the evening for a couple of games; with the fast-paced music banging out of the speakers it makes for some mind-blowing fun. Compare this to a modern FPS like BF3, where I need to open up Battlelog, let "Origin" start, pick a server through the web server browser, have the map load for five minutes, and disconnect three times or more, just to join at the end of the round... ugh. Skyrim is actually not bad in this respect, as it only takes ~30 seconds to start, for me anyway (and I haven't checked with my new SSD yet).

A metric could be designed to estimate just how quickly one can jump into the game. Perhaps a DTI (desktop-to-immersion) measure:
Minesweeper: DTI 2 seconds
UT1999: DTI 15 seconds
Skyrim: DTI 30 seconds
GTA IV: DTI 3 minutes
BF3: DTI 8 minutes
etc... of course this doesn't take into account boot time if you have to switch operating systems (but for light games there's always virtualization alternatives)
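
A rough way to put numbers on this yourself (purely a hypothetical sketch, not something from this thread; the game path is just a placeholder): launch the game from a small Python script, press Enter the moment you're actually in control, and print the elapsed time.

import subprocess
import time

GAME_PATH = r"C:\Games\UT1999\System\UnrealTournament.exe"  # placeholder path, point it at any game

start = time.perf_counter()
game = subprocess.Popen([GAME_PATH])  # launch the game
input("Press Enter as soon as you're in control of the game... ")
dti = time.perf_counter() - start
print("DTI: %.1f seconds" % dti)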

I've definitely noticed a trend where games seem to take longer and longer to start up, but this is most likely the result of laziness and poor optimization rather than an actual technical limitation. After all, on my computer, Just Cause 2 has - I kid you not - a 6-second DTI, and it is not exactly an obsolete game. I wish more attention were devoted to simple things like that, because it really makes a difference. How would "casual players" feel if Angry Birds took five minutes to load on their iPhones?? Very good point, Lode.

Edited by Bacterius, 30 July 2012 - 02:15 AM.





