Ageia PhysX To Become Standard?

Do you think it's worth the investment to get an Ageia PhysX chip for a new PC? Will this become a standard type of chip in the game development world? It appears a lot of the major engines have incorporated it, or are planning to. So is it now, or soon to be, a required chip for next-generation games?
Looks nice to me. Looking at the details a little, I think this may truly be the "next big thing" in gaming. How much more realistic can rendered graphics get? In reality, not much. Back in the DOOM era, everything was done on the CPU. Then graphics cards came out with amazing 2D sprite capability, and 3D cards followed. The next logical step is something like this PhysX processor, and it wouldn't surprise me to see it happen. I don't want to invest myself right now, but if you have the money, it isn't too much of a risk. Then again, sometimes things don't turn out the way you want them to, so who knows in the end?


I think that PhysX is pretty much dead in the water. It's too expensive, too obscure, and emerging GPU technologies geared towards general purpose computing are making it unnecessary.
I also found this article (thanks to ravuya) that declares what Promit said to be quite correct.
Quote:Original post by Promit
I think that PhysX is pretty much dead in the water. It's too expensive, too obscure, and emerging GPU technologies geared towards general purpose computing are making it unnecessary.


Very good points, which is why I won't buy one just yet. I think the idea is good and will be what comes next, but it isn't time yet for it to fully emerge. It may end up like graphics cards, with several key players emerging, but right now, if PhysX is the only one, prices will stay too high. The problem is that CPUs CAN do the complex physics, so it isn't necessary the way graphics cards were. CPUs never could do graphics the way our cards can, but physics CAN be done on them, so what's the point? It may catch on eventually, but who knows for now?


Quote:Original post by kburkhart84
Quote:Original post by Promit
I think that PhysX is pretty much dead in the water. It's too expensive, too obscure, and emerging GPU technologies geared towards general purpose computing are making it unnecessary.


Very good points, which is why I won't buy one just yet. I think the idea is good and will be what comes next, but it isn't time yet for it to fully emerge. It may end up like graphics cards, with several key players emerging, but right now, if PhysX is the only one, prices will stay too high. The problem is that CPUs CAN do the complex physics, so it isn't necessary the way graphics cards were. CPUs never could do graphics the way our cards can, but physics CAN be done on them, so what's the point? It may catch on eventually, but who knows for now?


CPUs can't really do the physics that the PhysX card can (in real time); the PhysX chip can do 100-200 times more floating-point operations per second than a high-end single-core CPU.

And if you don't need things in real time, then the CPU can replace the GPU as well.

The real problem with PhysX is that its main selling point is advanced physics effects that have an impact on gameplay.

When you allow advanced physics to have a direct effect on gameplay, it becomes a requirement, and no sane company would release a game that requires a PhysX chip today, because the market is too small. Pure visual effects, on the other hand, can be disabled on lower-end hardware without any problems; they never need their results read back to the CPU, so they are relatively easy to implement using shaders.
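To illustrate the split (a minimal C++ sketch; the capability check and effect classes are my own invention, not Ageia's API): the gameplay-critical simulation always keeps a CPU path, while the eye candy only gets created when acceleration is present.

    #include <vector>

    // Hypothetical capability check -- a real engine would ask the
    // PhysX SDK or the driver whether a PPU is installed.
    bool HasPhysicsAccelerator() { return false; }

    struct Effect { virtual void Simulate(float dt) = 0; virtual ~Effect() {} };

    // Gameplay-critical physics: must run on every machine, so it
    // always keeps a CPU fallback path.
    struct RigidBodySim : Effect { void Simulate(float) {} };

    // Pure eye candy (debris, cloth): safe to drop on low-end hardware,
    // since nothing ever reads its results back into game logic.
    struct DebrisSim : Effect { void Simulate(float) {} };

    void BuildScene(std::vector<Effect*>& effects)
    {
        effects.push_back(new RigidBodySim);   // required everywhere
        if (HasPhysicsAccelerator())
            effects.push_back(new DebrisSim);  // optional extra
    }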

What Ageia needs to do now is promote their product, cut prices, put together attractive bundles, and convince computer manufacturers to ship their chip.

They need to create the market for the games, because the games won't create a market for the chip. Once the games and the market are there, they can start producing better and more expensive PPUs.
[size="1"]I don't suffer from insanity, I'm enjoying every minute of it.
The voices in my head may not be real, but they have some good ideas!
Quote:Original post by SimonForsman
The real problem with PhysX is that its main advantage is that it can do advanced physic effects that have an impact on gameplay.


I think PhysX cards are unnecessary, mainly because a lot of physics calculations can be done straight on GPUs; heck, a GPU crunches the same kind of data that a PPU processes, so what's the point? Graphics card technology has been developing faster and faster, and ATI has claimed that their cards support physics processing.
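For what it's worth, that kind of work really is just bulk floating-point math. A minimal C++ sketch (the names are mine) of an Euler integration step; every element updates independently of the others, which is exactly why the loop body lifts straight into a GPU shader as easily as onto a PPU:

    #include <cstddef>

    struct Particle
    {
        float px, py, pz;   // position
        float vx, vy, vz;   // velocity
    };

    // One Euler integration step over the whole buffer. No element
    // depends on any other, so the work is trivially data-parallel.
    void Integrate(Particle* p, std::size_t count, float dt, float gravity)
    {
        for (std::size_t i = 0; i < count; ++i)
        {
            p[i].vy -= gravity * dt;   // accumulate gravity
            p[i].px += p[i].vx * dt;   // advance position
            p[i].py += p[i].vy * dt;
            p[i].pz += p[i].vz * dt;
        }
    }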

IMO, PhysX could never have taken off, past or future. In the past it would have failed because nobody did much physics; it wasn't thought of. In the future it will fail because CPUs and GPUs will end up doing all the number crunching.
Quote:Original post by kburkhart84
How much more realistic can rendered graphics get? In reality, not much.


Wow. Maybe you should take a look out your window (it's actually kind of nice out there sometimes) and compare it with even the latest and greatest games. Sure, games have come a long way, but there's still plenty of development to be done. There are people researching real-time global illumination right now. If that's not a significant step up, I don't know what is.

I'm afraid I'll have to side with the "dead in the water" crowd. I have this hope that GPUs are heading towards becoming general stream processors, and once they reach that point, physics on GPUs should be practical.
With the way processors are heading toward ultra-multi-core designs with 100+ cores, it wouldn't surprise me to see graphics cards disappear in the next 10 years or so and merge into a more general array of floating-point cores. Throw in a better-designed multi-channel memory bus and things should be great.
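If CPUs do head that way, physics fits them just as naturally. A rough sketch, using nothing but standard C++ threads, of fanning a physics-style update across however many cores the machine reports:

    #include <cstddef>
    #include <thread>
    #include <vector>

    // Stand-in for any per-element physics step.
    void UpdateRange(float* data, std::size_t begin, std::size_t end, float dt)
    {
        for (std::size_t i = begin; i < end; ++i)
            data[i] += dt;   // placeholder work
    }

    // Split the buffer into one chunk per core and run them in parallel.
    void ParallelUpdate(float* data, std::size_t count, float dt)
    {
        unsigned cores = std::thread::hardware_concurrency();
        if (cores == 0) cores = 1;   // the query may report 0
        std::size_t chunk = (count + cores - 1) / cores;
        std::vector<std::thread> workers;
        for (unsigned c = 0; c < cores; ++c)
        {
            std::size_t begin = c * chunk;
            if (begin >= count) break;
            std::size_t end = begin + chunk < count ? begin + chunk : count;
            workers.emplace_back(UpdateRange, data, begin, end, dt);
        }
        for (std::size_t i = 0; i < workers.size(); ++i)
            workers[i].join();
    }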
I think the dedicated PPUs will probably end up finding a home in military simulation and the like, where their cost isn't as noticeable. In the end, I think we are likely to see something like a DirectPhysics added to DirectX that transparently uses the GPU or, if present, a PPU, making a proprietary PPU API sort of a non-issue.
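Something like this, say (a hedged sketch; DirectPhysics is hypothetical, and every name below is invented for illustration):

    // Hypothetical backend-neutral interface. Game code calls
    // CreatePhysicsDevice() once and never cares which backend it got.
    struct IPhysicsDevice
    {
        virtual void StepSimulation(float dt) = 0;
        virtual ~IPhysicsDevice() {}
    };

    struct CpuPhysics : IPhysicsDevice { void StepSimulation(float) {} };
    struct GpuPhysics : IPhysicsDevice { void StepSimulation(float) {} };
    struct PpuPhysics : IPhysicsDevice { void StepSimulation(float) {} };

    // Stubbed detection -- a real layer would query the driver.
    bool PpuPresent()         { return false; }
    bool GpuSupportsCompute() { return true;  }

    IPhysicsDevice* CreatePhysicsDevice()
    {
        if (PpuPresent())         return new PpuPhysics;  // dedicated card
        if (GpuSupportsCompute()) return new GpuPhysics;  // shader path
        return new CpuPhysics;                            // always works
    }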

