Mari_p

The evolution of computer graphics in 3D games

I have just watched the movie "The Lord of the Rings: The Return of the King"... one word: amazement. How many years will it take for 3D games to reach that level of computer graphics?... if they ever reach it...

Guest Anonymous Poster
Computers double in capacity every 18 months... it's exponential. And new CPU/GPU generations appear regularly. Maybe in 10 years we will have fantastic graphics.
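
A rough back-of-the-envelope sketch of that curve, assuming the 18-month doubling actually holds (it's folklore, not a law of nature):

#include <cmath>
#include <cstdio>

// Capacity multiplier after `years`, assuming one doubling every
// 18 months (1.5 years). Purely illustrative -- real hardware gains
// depend on much more than transistor counts.
double mooreFactor(double years) {
    return std::pow(2.0, years / 1.5);
}

int main() {
    for (int y = 2; y <= 10; y += 2) {
        std::printf("%2d years -> ~%.0fx today's capacity\n",
                    y, mooreFactor(y));
    }
    // 10 years comes out at roughly 100x, which is where the
    // "fantastic graphics in 10 years" hope comes from.
    return 0;
}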

In fact, the only thing that doesn't seem to be increasing exponentially is the amount of time allocated to us developers to create the games.

While processor speed seems to follow an exponential curve, things like RAM speed are going to hold that back quite a bit.

Guest Anonymous Poster
Graphic artists have been taking a larger and larger place in the games world. In the future, with powerful hardware and very high-level engines, artists will be able to develop good games through an interface as easy to work with as 3D Studio Max or Maya... and it won't be necessary to have a programmer close by. Of course, programmers will develop the new tools, but computer graphics will be so advanced that creativity and art quality will be enough by themselves... unlike in the past, when graphic artists could do nothing (no game) without a programmer nearby.

Electronic components can only get so small. Eventually, we will hit a cap on how small and how fast our computers can become. We'll probably have to wait until they invent quantum computers before we can have those kinds of games.

Besides that, though, much more important than the graphics in a game is the gameplay. It is quite unfortunate that, in the computer gaming industry, the QUALITY of the games doesn't double every 18 months...

Contrary to popular belief, quantum computers aren't faster than regular computers at regular computing. Quantum computers work in a fundamentally different way, and can exploit that to be faster at some calculations (like factoring numbers), but for regular computing they are generally slower, because you have to work around the differences instead of utilizing them to your advantage, unless you can come up with a good quantum algorithm for what you want.
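
To make the factoring example concrete: the classical baseline below (trial division) takes time that grows exponentially with the bit length of the number, while Shor's algorithm factors in polynomial time on a quantum computer; no comparable shortcut exists for ordinary sequential work. A hand-rolled sketch for illustration only:

#include <cstdint>
#include <cstdio>

// Trial division: the simplest classical factoring method. It runs in
// roughly sqrt(n) steps, i.e. exponentially many in the bit length of n.
// Shor's algorithm solves the same problem in polynomial time on a
// quantum computer -- but that speedup doesn't carry over to ordinary
// sequential computing.
uint64_t smallestFactor(uint64_t n) {
    for (uint64_t d = 2; d * d <= n; ++d)
        if (n % d == 0) return d;
    return n; // n is prime
}

int main() {
    uint64_t n = 600851475143ULL;
    std::printf("smallest factor of %llu is %llu\n",
                (unsigned long long)n,
                (unsigned long long)smallestFactor(n));
    return 0;
}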

Even if CPU and GPU speed and power became huge, you'd then have to focus on the creation of life-like models and textures. You could argue that as people become more spoilt for quality, they won't want all their enemies looking the same, so models will need to be created for them too.

The next step in realism will be when computers can access your brain directly and allow you to conjure your own imaginative images internally.

quote:
Original post by Tac-Tics
Electronic components can only get so small. Eventually, we will hit a cap on how small and how fast our computers can become. We'll probably have to wait until they invent quantum computers before we can have those kinds of games.

The 0.13-micron process currently used in today's Pentium 4 chips is not far from the absolute physical limit of processor miniaturization. Once you get down below 0.10 micron, you have a problem: the gap in a transistor that is set to off is so small that an electron can jump it, rendering the transistor useless. The proposed solution to this problem is a processor which uses photons (light, basically) as opposed to electrons to do the work. It's predicted that photonic processors will be in use within the next ten years, and my Computer Systems lecturer made a conservative estimate that in twenty years, Moore's Law will cease to apply, as processors will be as small as they can get.

As for the CG aspect of computer games, consider this: when Pixar created Luxo Jr. in 1986, some frames took up to 70 minutes to render on the hardware available. The modern GeForce 3 can run Luxo Jr. in real time, with (it seems) all the options the pre-rendered original had. Within a few years, it's conceivable that we'll be playing games with graphics comparable to Toy Story or Monsters, Inc. Then, as mentioned above, content quality will become an issue. Models, textures and environments will need to be extremely detailed, and Half-Life will look even older than it does now...


Windows 95 - 32 bit extensions and a graphical shell for a 16 bit patch
to an 8 bit operating system originally coded for a 4 bit microprocessor,
written by a 2 bit company that can't stand 1 bit of competition.

it's coming, don't be impatient!

I still remember the days of the 8086 vividly (I still have mine), so to speak; in the past 20 years, computers advanced in leaps and bounds. However, the underlying technology hasn't changed drastically since then, and there is a limit to these things. So until the next 'transistor' breakthrough (quantum machines, photonic transistors, whatever), I think we are going to hit the ceiling pretty quickly where raw performance is concerned.

As for LOTR quality, I don't think it's that far off, actually. I can see the troll creature (from the first LOTR) being achievable in real time on current hardware. But one troll hardly makes a game. We can certainly do better than their sometimes dodgy blue-screen techniques.

Negitivefrags, maybe you haven't paid much attention to Toy Story. I know every detail of that movie, and I am sure: games are still far from Toy Story quality.

> How many years will it take for 3D games to reach
> that level of computer graphics?...

Probably never, if you consider movies and games within the same time frame. Software renderers will always be able to produce more realistic imagery than hardware renderers, because they have plenty of time to generate each frame.

Be happy: we can render in real time the same image quality as Star Trek II: The Wrath of Khan. But that's at least a 22-year lag in engineering... |8-}

-cb


Guest Anonymous Poster
If one day a PC can run Toy Story in real time, it will also be able to run the cinematics from Warcraft III. If a PC can run Monsters, Inc. or Finding Nemo, then it can also run The Lord of the Rings, the latest Star Wars, The Matrix... do you think the Pixar movies use fewer resources and effects? Do you think Pixar isn't capable of producing realistic graphics like LOTR? Pixar keeps a cartoon style, and they have, in my opinion, the best creative team in the world.
Just imagine a PC running the jellyfish ballet from Finding Nemo in real time...

NVIDIA has a nice demo of a semi-clad fairy woman. They can do ONE woman at playable frame rates on their high-end card.

Assuming graphics power doubles every 18 months, in 7 years they'll have roughly 2^(7/1.5) ≈ 25 times the budget, which means they can do an interesting number of characters (say, >20) about 7 years from now.

Then, you need physics, and AI, and an environment to match.

Thus, you'll have fairy-woman quality on high-end systems about 10 years from now, in the form of real games.

Then it'll take another 3-5 years before the regular computer most people buy has the same ability.

Personally, I think LOTR has better graphics than the fairy woman demo, so it'll probably be about 20 years before that level of detail becomes mainstream.

quote:
Original post by Anonymous Poster
If one day a PC can run Toy Story in real time, it will also be able to run the cinematics from Warcraft III. If a PC can run Monsters, Inc. or Finding Nemo, then it can also run The Lord of the Rings, the latest Star Wars, The Matrix... do you think the Pixar movies use fewer resources and effects? Do you think Pixar isn't capable of producing realistic graphics like LOTR? Pixar keeps a cartoon style, and they have, in my opinion, the best creative team in the world.
Just imagine a PC running the jellyfish ballet from Finding Nemo in real time...
What? I hardly understood a word of that. Let me make some things clear. Toy Story was exceptionally expensive in terms of processor time per frame. We are still a long way from seeing those kinds of graphics in real time, perhaps 7 to 10 years. The example I cited, Luxo Jr., had a more basic shader system and a much lower polygon count. Your average modern game character's poly count is around 10,000. Toy Story models had between 100,000 and 250,000 polygons each. Plus advanced multi-pass shaders. Plus the highly detailed environments. Plus lighting and shadowing. You think Warcraft III is as advanced as Toy Story? Think again...
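
For what it's worth, the raw polygon gap alone lines up with that 7-to-10-year guess, if you grant the same 18-month doubling assumed elsewhere in the thread (a rough sketch, not a forecast):

#include <cmath>
#include <cstdio>

// Years of 18-month doublings needed to close a given capability gap.
double yearsToClose(double gap) {
    return std::log2(gap) * 1.5;
}

int main() {
    // Toy Story characters: ~100,000-250,000 polygons;
    // modern game characters: ~10,000.
    std::printf("10x polygon gap: ~%.1f years\n", yearsToClose(10.0)); // ~5.0
    std::printf("25x polygon gap: ~%.1f years\n", yearsToClose(25.0)); // ~7.0
    // Shaders, lighting and environments come on top of that, so the
    // 7-to-10-year guess is in the right ballpark.
    return 0;
}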



Guest Anonymous Poster
iNsAn1tY, I'm sorry about my poor English. I agree with you, but you didn't understand what I said. I just tried to say that, since Toy Story has a cartoon style, some people think it uses fewer resources than Warcraft III, which is not true. So if one day (in the future) PCs can run Toy Story, then the cinematics from Warcraft III will run easily.

quote:
Original post by iNsAn1tY
Then, as mentioned above, content quality will become an issue. Models, textures and environments will need to be extremely detailed, and Half-Life will look even older than it does now...


Why pay artists when the computer can generate content for you? The number of artists in games will plateau, then fall a bit as they are made partially redundant by systems that can generate the mundane stuff: tweening, skinning, morphing, and generating models from profiles. More focus will be put into game design, storylines, etc.
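
Tweening is a fair example of the kind of mundane work that's easy to hand to the machine: the in-between frames are just interpolation between an artist's keyframes. A minimal sketch with made-up values:

#include <cstdio>

// Linear tweening: generate the in-between frames from two keyframe
// values. This is the sort of grunt work a tool can do for the artist.
float lerp(float a, float b, float t) {
    return a + (b - a) * t;
}

int main() {
    const float key0 = 0.0f, key1 = 90.0f; // e.g. a joint angle in degrees
    const int frames = 5;                  // two keyframes plus 4 in-betweens
    for (int i = 0; i <= frames; ++i) {
        float t = float(i) / float(frames);
        std::printf("frame %d: %5.1f degrees\n", i, lerp(key0, key1, t));
    }
    return 0;
}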

I've always wondered: if computers get so powerful that they can render almost-real scenes, will that make us developers lazy? Will we stop scene sorting and the like, and just render everything and let the hardware sort it out?

> Personally, I think LOTR has better graphics than the
> fairy woman demo, so it'll probably be about 20 years
> before that level of detail becomes mainstream.

Typical ILM work uses 80 or so layers of texturing, each a 2K or 4K image depending on its size on screen (for your stats, Stefen Fangmeier reported using 156 layers for the boat in "Speed 2: Cruise Control"). Rendering a frame churns through **terabytes** of data, and it's why RenderMan is used instead of a raytracer: it renders one model at a time into bucket files.
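
The bucketing idea is what keeps that tractable: the frame is processed one small screen tile at a time, so only the data touching the current bucket has to be in memory. A toy sketch of the traversal (my own illustration, not RenderMan's actual implementation):

#include <cstdio>

// Toy illustration of bucket (tile) rendering: the frame is processed one
// small screen rectangle at a time, so memory holds one bucket's working
// set instead of the whole scene. Names and sizes here are made up.
const int kWidth = 2048, kHeight = 1536, kBucket = 32;

void renderBucket(int x0, int y0, int x1, int y1) {
    // A real renderer would load, dice and shade only the primitives
    // overlapping this rectangle, write the pixels, then free the memory.
    (void)x0; (void)y0; (void)x1; (void)y1;
}

int main() {
    int buckets = 0;
    for (int y = 0; y < kHeight; y += kBucket) {
        for (int x = 0; x < kWidth; x += kBucket) {
            int x1 = (x + kBucket < kWidth)  ? x + kBucket : kWidth;
            int y1 = (y + kBucket < kHeight) ? y + kBucket : kHeight;
            renderBucket(x, y, x1, y1);
            ++buckets;
        }
    }
    std::printf("rendered %d buckets of up to %dx%d pixels\n",
                buckets, kBucket, kBucket);
    return 0;
}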

Maybe a more reachable goal in the shorter term would be "Final Fantasy: The Spirits Within" by Square USA. That one uses far less data to render (Square used a raytracer), but you still hear that sound when your lower jaw hits the floor.

-cb

Just because movie FX teams use vast numbers of textures and polys to achieve their effects doesn't mean we'll have to use the same stuff. We game programmers are masters of faking stuff like this: finding fast hacks that look right in the majority of cases. While I don't think we'll be able to do LOTR-quality scenes in the near future, I do think we'll be able to do scenes that most people can't distinguish from LOTR quality in the next 5 years or so.

quote:
Original post by Tac-Tics
Besides that, though, much more important than the graphics in a game is the gameplay. It is quite unfortunate that, in the computer gaming industry, the QUALITY of the games doesn't double every 18 months...


Hehe, I wouldn't want the quality of gameplay to double every 18 months. It would be the doom of civilization as we know it, as everybody would play games instead of working, eating and sleeping.

Sorry, I just could not resist!

I think one of the biggest problems with real-time computer graphics at the moment is that you still see jaggies on all but the fastest and most advanced cards, the ones which can run FSAA in real time. When FSAA is common and you can't see that environments and models are made from polygons, it will be easier for users to forget that what they're watching is actually a 3D simulation of something happening, and not the real thing...
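
For the curious: the brute-force form of FSAA (supersampling) just renders the scene at a higher resolution and averages each block of subpixels down to one screen pixel, and that averaging is what softens the jaggies. A minimal grayscale sketch, assuming a 4x4 sample grid (values and sizes are made up):

#include <cstdio>
#include <vector>

// Box-filter downsample: average each s-by-s block of subpixels into one
// output pixel. This is supersampling FSAA at its crudest; real hardware
// uses smarter sample patterns, but the principle is the same.
std::vector<float> downsample(const std::vector<float>& hi,
                              int w, int h, int s) {
    std::vector<float> lo((w / s) * (h / s));
    for (int y = 0; y < h / s; ++y) {
        for (int x = 0; x < w / s; ++x) {
            float sum = 0.0f;
            for (int sy = 0; sy < s; ++sy)
                for (int sx = 0; sx < s; ++sx)
                    sum += hi[(y * s + sy) * w + (x * s + sx)];
            lo[y * (w / s) + x] = sum / float(s * s);
        }
    }
    return lo;
}

int main() {
    const int w = 8, h = 8, s = 4;          // 8x8 buffer, 4x4 samples/pixel
    std::vector<float> hi(w * h);
    for (int i = 0; i < w * h; ++i)         // hard diagonal edge
        hi[i] = ((i % w) + (i / w) < 4) ? 1.0f : 0.0f;
    std::vector<float> lo = downsample(hi, w, h, s);
    std::printf("top-left output pixel: %.3f\n", lo[0]); // 0.625: edge is blended
    return 0;
}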


