
Future of Gaming: Facts

Boallods


[media] (embedded video 1) [/media]

[media] (embedded video 2) [/media]

http://en.wikipedia.org/wiki/Voxel - Wiki article on voxels.

http://en.wikipedia.org/wiki/Moore's_law - Wiki article on Moore's law.

A few days ago, I wrote an article on the future of gaming. Now I want to talk about the facts that I should have included in the last entry.

Growth of computing power. Five years ago, I had a computer with a 1.6 GHz processor and 512 megabytes of RAM. Back then it was enough for every game. I also had a PS3 with a 3.2 GHz processor and 512 megabytes of RAM, and I was amazed by the graphics and games the PS3 could run. Today, 3.2 GHz is the minimum requirement for an average game, and 4 GB of RAM has become the standard in our homes. Because of that, gaming companies are having problems making games for consoles. Moore's law says that computing power should double every 18 months.

The speed of a human brain is the speed of human thought. By 2020, computers will be as powerful as human brains. Many people have asked me, "The power of the human brain? What the hell is that?" If a computer is as powerful as a human brain, it means it is capable of being as smart as a human.
Today, however, computers can only play chess. Let's do some calculations. If 3.2 GHz of processing power is enough now, how much power would we have in 2020? Doubling every 18 months over nine years is six doublings, so about 204.8 GHz! You don't trust me? Do the calculation yourself!
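To spell that arithmetic out, here is a tiny back-of-the-envelope script. It is a naive extrapolation that assumes clock speed really does double every 18 months (the comments below dispute that assumption); the 2011 starting point and 2020 target are just the figures used in this post.

[code]
# Naive extrapolation of clock speed under an 18-month doubling assumption.
# Purely illustrative -- real desktop clock speeds have been roughly flat since ~2005.
base_ghz = 3.2           # starting clock speed used in this post
years = 2020 - 2011      # projection window
doublings = years / 1.5  # one doubling every 18 months

projected_ghz = base_ghz * 2 ** doublings
print(f"{doublings:.0f} doublings -> {projected_ghz:.1f} GHz")
# prints: 6 doublings -> 204.8 GHz
[/code]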

If a computer is capable of human-like intelligence, it means the computer itself can be a kind of game developer. Now, some people got this wrong: I did not say that every game should, or will, be infinite. Maybe you have heard of Infinity: Quest for Earth. The studio making that game has 11 developers, and yet they want to make a universe that is infinite. That is, their engine will generate infinite worlds. But most quests will be boring and, most of the time, won't make any sense. That would change in the future: the engine (the computer, that is) would make interesting quests.
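To make the "engine generates infinite worlds" idea concrete, here is a minimal sketch of seeded procedural generation. This is not Infinity's actual engine, just the general technique: one seed deterministically expands into as many star systems as you ask for, so nothing has to be stored up front.

[code]
import random

def generate_system(universe_seed: int, system_index: int) -> dict:
    """Deterministically generate one star system from a global seed.

    The same (seed, index) pair always yields the same system, so an
    "infinite" universe never has to be stored -- it is generated on demand.
    """
    rng = random.Random(universe_seed * 1_000_003 + system_index)
    return {
        "star_class": rng.choice(["O", "B", "A", "F", "G", "K", "M"]),
        "planets": rng.randint(0, 12),
        "has_station": rng.random() < 0.1,
    }

# Any system index can be requested at any time; nothing is precomputed.
print(generate_system(42, 1_000_000_000))
[/code]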

And machines are not like humans. Even in WoW, quests eventually get boring. But a computer would think about every aspect of a quest. Is it too long? Too short? Does the player need to travel a lot? How much EXP and gold would be a fair reward? A computer would consider all of that and more. There are too many factors to count, believe me. :)
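As a toy illustration of that kind of reasoning, here is a sketch of a quest-scoring heuristic over the criteria mentioned above (quest length, travel, reward). The function and its thresholds are invented for this example, not taken from any real game.

[code]
def score_quest(length_min: float, travel_km: float, xp: int, gold: int) -> float:
    """Crude heuristic: penalise quests that are too short, too long,
    or too travel-heavy, and expect rewards roughly proportional to time spent."""
    score = 1.0
    if length_min < 5:
        score -= 0.5                        # too short to feel meaningful
    elif length_min > 60:
        score -= 0.3                        # drags on too long
    score -= min(travel_km / 50.0, 0.4)     # heavy travel gets dull
    if xp + gold < length_min * 10:         # rough "fair" payout for the time spent
        score -= 0.2
    return max(score, 0.0)

# Generate lots of candidate quests, keep only the ones that score well.
candidates = [(30, 5, 400, 50), (2, 0, 10, 1), (45, 120, 100, 0)]
kept = [q for q in candidates if score_quest(*q) > 0.6]
print(kept)  # only the first candidate survives
[/code]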

Now, graphics. Some people say computers are already as powerful as they can be. They are wrong. Scientists are, at this moment, working on nanotechnology. Sounds a little bit sci-fi, right? :) Search for it on YouTube and you will see what I am talking about. Nanotechnology is basically the same as the technology we have today, only at the molecular level. That means I could fit my laptop into a single microchip (and you thought the iPhone 4S was cool). :) It also means that future games, because of their graphics, will probably "weigh" about 100 GB. Yep, cheap $hit. :)

Please leave some comments and let me know what you think of this. ;)


8 Comments


Recommended Comments

Michio Kaku said it best. It would be 50 to 100 years before AI can best the human mind (make us dance around in a zoo, as he puts it). So, you're not going to see the video game becoming the video game developer any time soon. He also mentions the collapse of Moore's Law within the next 20 years. You're not going to see 204.8 GHz processors on the current CPU architecture because it's simply impossible, not to mention that CPU speeds haven't increased since about 2005. There may be some small increases yet to come but, otherwise, we're reduced to adding cores for more power. Kaku also talks about quantum computing and this is where the real promise for the future of computing is. At that point, you're no longer talking about gigahertz but well beyond.

Although I love what guys like Kaku and de Grasse are doing, they really are the TV personalities of physics. So, they know their stuff but they dumb it way down for their audience. The "retarded cockroach" is still a bad analogy for computing but, in this context, he's using it to give the audience a perspective on today's computers compared to the quantum computers of the future. Today, we're operating on retarded cockroaches, tomorrow, human brains. Bad analogy, good comparison. However, where you're getting the brain analogy from now makes some sense.

I've been following the development of Infinity since it started years ago (journal is right here on GDNet if you haven't been). Procedural content is one of my main interests and I've been working with agent-based generation in order to achieve varied and interesting gameplay. After working with this for a few years, the limitations have become very, very obvious. It's not so much that we can't create "interesting" content, it's that the human brain is so goddamn hard to fool and "interesting" rarely means "fun". So, fun activities are few and far between and there's oddball things happening all the time.

I'm very enthusiastic about this direction of game development (as you seem to be) but I'm also willing to admit that I may be on a fool's errand. Quantum computing would be a great help (generate TONS of content and have some method of stripping away the silly/boring stuff) but I'll be nearing 70 years old by the time that happens.

It's good that you're excited about this stuff and you should certainly take part in it (we need all the help we can get). But, it's like Kaku said in your second video: "we're the ones that have to build this stuff". It's all so much easier said than done.

[quote name='coderx75' timestamp='1326378294']
Michio Kaku said it best. It would be 50 to 100 years before AI can best the human mind (make us dance around in a zoo, as he puts it). ... Kaku also talks about quantum computing and this is where the real promise for the future of computing is.
[/quote]

Thanks for the comment. But I believe, and so do many scientists, that nanotechnology is going to change, or better said, destroy processors, and I don't believe that quantum computers will grow so fast from 3 x 5 = 15 :)

Then why did you include a video of a physicist directly contradicting that belief? He states that things become erratic at small scales (nanotechnology) and that the calculation of 3 x 5 = 15 was a major step forward in quantum computing. He gives a timeline of Moore's Law breaking down in 15 to 20 years and quantum computers being usable in about 30, leaving a 10 to 15 year period of stagnation. We're a ways from quantum computing, and nanotechnology is only taking us so far. It isn't, in fact, "changing" the processor, only increasing efficiency to a point. Believe what you want, but there will never be 204.8 GHz on a silicon-based chip, with or without nanotechnology.

Look into light-based processing. I saw an article a few years ago about a team that used light, rather than electrons, to build an (at that time) table-sized processor, and they were promising performance in the teraflops. If you want pre-quantum speed, that's the best bet that I know of.

[quote name='coderx75' timestamp='1326378294']
Quantum computing would be a great help (generate TONS of content and have some method of stripping away the silly/boring stuff) but I'll be nearing 70 years old by the time that happens.
[/quote]

It just so happens that I am writing up my doctorate in quantum computing at the moment. I am not fully convinced that what quantum computers are currently good for will have that much of an effect on the games industry. They are able to factor numbers exponentially faster, and they can search for marked elements in an unordered set quadratically faster, but other than that their uses are fairly specialised.
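For a rough sense of what "quadratically faster" means in query counts (just the arithmetic, not a quantum simulation; the figures assume the standard Grover iteration count of about (pi/4)·sqrt(N) against a classical worst case of about N lookups):

[code]
import math

# Rough query counts for unstructured search over N items:
# classical worst case ~N lookups, Grover's algorithm ~(pi/4)*sqrt(N) iterations.
for n in (1_000, 1_000_000, 1_000_000_000):
    grover = math.ceil((math.pi / 4) * math.sqrt(n))
    print(f"N = {n:>13,}   classical ~ {n:,}   Grover ~ {grover:,}")
[/code]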

Personally I think their main use will be simulating quantum mechanics, allowing us to gain a better insight into how the world works. This is all assuming that they are actually able to be built. Last year a friend of mine performed an experiment that factored 21 (=3*7) using a photonic quantum circuit. This was incredibly difficult, and they had to use a number of shortcuts (they used the "compiled" version of Shor's algorithm, for anyone who is interested), as maintaining the complex superpositions of the photons for the length of the computation is very hard. This leads me to believe that it is a very, very long way off, even if it is possible. The machine they built isn't even a universal quantum machine; it just does that one specific task.

As for the rest of the content of this post, I have to say that I think I agree with everything coderx75 has said (apart from the QC bit that I've just discussed).

Nanotechnology may be the way forward in some areas (I especially think that it will help in materials science), but I'm still sceptical about it as the way to get processors up to that many GHz.

Do you mind if I ask whether you are currently a scientist? Or studying physics at university or something like that?

[quote name='coderx75' timestamp='1326386801']
Then why did you include a video of a physicist directly contradicting that belief? ...
[/quote]

I just put that video in to talk about AI. Quantum computers happened to be in the video, so..... hah, damn. ;)


[quote name='neutrix' timestamp='1326387611']
... Do you mind if I ask whether you are currently a scientist? Or studying physics at university or something like that?
[/quote]

I am not a scientist. LOL! :D And even if Moore's law crashes in 20 years... this is 2012, right? We can improve microchips quite a lot before then, believe me. Oh, this is a gaming forum? :P I have followed many physicists and scientists like this one, and I believe that microchips, along with nanotechnology, are going to be more powerful in the future.

[quote name='neutrix' timestamp='1326387611']
This leads me to believe that it is a very very long way off, even if it is possible.
[/quote]
It's good to have someone weigh in with a background in quantum computing (I don't know enough to really give solid arguments), but I think the above statement pretty much wraps up every argument I've made in these two posts. LoreHunter is prophesying about the future of things that we're already working on every day, and I'm trying to give my insight into the things I have experience with, showing that there are drawbacks that keep the technologies mentioned from matching the (mostly his) hype: a long way off, even if possible. Anyway, it's a lost cause. I need to go do things that have a hope of being productive.


[quote name='LoreHunter' timestamp='1326388140']
...believe me.
[/quote]
Seriously, why?

Jason out *blip*

[quote name='coderx75' timestamp='1326378294']
He also mentions the collapse of Moore's Law within the next 20 years. You're not going to see 204.8 Ghz processors on the current CPU architecture because it's simply impossible, not to mention that CPU speeds haven't increased since about 2005.
[/quote]

Remember, Moore's law has nothing to do with processor speed... rather, it's about the number of transistors that can fit in the same size area. The reason CPU speeds aren't increasing is that the area is not being increased; it's being decreased, in favour of efficiency over brute force.
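A minimal sketch of that distinction, assuming the commonly cited two-year doubling period for transistor counts (the exact period varies by formulation):

[code]
# Moore's law is about transistor count per chip, not clock speed.
# Under a two-year doubling period, density keeps climbing even though
# desktop clock speeds have sat around 3-4 GHz since roughly 2005.
def moores_law_growth(years: float, doubling_period_years: float = 2.0) -> float:
    """Multiplicative growth in transistor count over `years`."""
    return 2 ** (years / doubling_period_years)

print(f"2005 -> 2012: transistor count grows ~{moores_law_growth(2012 - 2005):.1f}x")
print("2005 -> 2012: typical desktop clock speed stays ~3-4 GHz")
[/code]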

[quote name='freeworld' timestamp='1326534900']
Remember, Moore's law has nothing to do with processor speed... rather, it's about the number of transistors that can fit in the same size area.
[/quote]

agree :)

