
Ageia PhysX To Become Standard?


Do you think it's worth the investment to get an Ageia PhysX chip for a new PC? Will this become a standard type of chip in the game development world? It appears a lot of the major engines have incorporated it, or are planning to incorporate it, into their engines. Is it therefore now, or soon to be, a required chip for next-generation games?

Looks nice to me. Looking at the details a little, I think this may truly be the "next big thing" in gaming. What can we do by now as far as graphics? How much more realistic can we get as far as rendered graphics? In reality, not much. Back in the DOOM era, everything was done on the CPU. Then graphics cards came out with amazing 2D sprite capability, and 3D cards followed. The next logical step is something like this PhysX processor, and it wouldn't surprise me to see it happen. I don't want to invest myself right now, but if you have the money, it isn't too much of a risk. Then again, sometimes things don't turn out the way you want them to, so who knows in the end?

I think that PhysX is pretty much dead in the water. It's too expensive, too obscure, and emerging GPU technologies geared towards general purpose computing are making it unnecessary.

Quote:
Original post by Promit
I think that PhysX is pretty much dead in the water. It's too expensive, too obscure, and emerging GPU technologies geared towards general purpose computing are making it unnecessary.


Very good points, which is why I won't buy one just yet. I think the idea is good and will be what comes next, but at this point it isn't time yet for it to fully emerge. It may end up like graphics cards, with several key players emerging, but right now, if PhysX is the only one, prices will stay too high. The problem is that CPUs CAN do the complex physics, so it isn't necessary the way graphics cards were. CPUs never could do graphics like our cards can, but physics, on the other hand, CAN be done, so what's the point? It may hold eventually, but who knows for now?

Quote:
Original post by kburkhart84
Quote:
Original post by Promit
I think that PhysX is pretty much dead in the water. It's too expensive, too obscure, and emerging GPU technologies geared towards general purpose computing are making it unnecessary.


Very good points, which is why I won't buy one just yet. I think the idea is good and will be what comes next, but at this point it isn't time yet for it to fully emerge. It may end up like graphics cards, with several key players emerging, but right now, if PhysX is the only one, prices will stay too high. The problem is that CPUs CAN do the complex physics, so it isn't necessary the way graphics cards were. CPUs never could do graphics like our cards can, but physics, on the other hand, CAN be done, so what's the point? It may hold eventually, but who knows for now?


CPUs can't really do the physics that the PhysX card can (in real time); the PhysX chip can do 100-200 times more floating-point operations per second than a high-end single-core CPU.
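Taking the post's 100-200x claim at face value, the arithmetic it implies is easy to sketch. The CPU figure below is an assumption for illustration, not a benchmark:

```python
# The post's claim, taken at face value: a PPU delivers 100-200x the
# floating-point throughput of a high-end single-core CPU of the era.
cpu_gflops = 5.0                 # assumed sustained single-core throughput
claim_low, claim_high = 100, 200

ppu_low = cpu_gflops * claim_low
ppu_high = cpu_gflops * claim_high
print(f"implied PPU throughput: {ppu_low:.0f}-{ppu_high:.0f} GFLOPS")
```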

If you don't need things done in real time, then the CPU can replace the GPU as well.

The real problem with PhysX is that its main advantage is doing advanced physics effects that have an impact on gameplay.

When you allow advanced physics to have a direct effect on gameplay, it becomes a requirement, and no sane company would release a game that requires a PhysX chip today, as the market is too small. (Pure visual effects can be disabled on lower-end hardware without any problems and don't require you to read the results back to the CPU, so they are relatively easy to implement using shaders.)
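The split described here (gameplay-critical physics must run everywhere, while pure eye candy can be scaled to the hardware) can be sketched as a simple capability check. All names below are hypothetical, not part of any real SDK:

```python
from dataclasses import dataclass

@dataclass
class PhysicsCaps:
    has_ppu: bool  # dedicated physics hardware present?

def configure_physics(caps: PhysicsCaps) -> dict:
    """Gameplay physics always runs (on the CPU if need be);
    only the cosmetic effects are scaled to the hardware."""
    return {
        "gameplay_rigid_bodies": True,  # required: affects game state
        "debris_particles": 10000 if caps.has_ppu else 500,  # visual only
        "cloth_and_fluids": caps.has_ppu,  # visual only, off without a PPU
    }

print(configure_physics(PhysicsCaps(has_ppu=False)))
```

The key design point is that nothing in the first entry ever depends on the hardware, which is exactly why a PPU-only gameplay feature would shrink the market to PPU owners.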

What Ageia needs to do now is promote their product, cut prices, make attractive bundles, and convince computer manufacturers to use their product.

They need to create the market for the games, because the games won't create a market for the chip. Once the games and the market are there, they can start producing better and more expensive PPUs.

Quote:
Original post by SimonForsman
The real problem with PhysX is that its main advantage is doing advanced physics effects that have an impact on gameplay.


I think PhysX cards are unnecessary, mainly because a lot of physics calculations can be done straight on GPUs; heck, a GPU processes the same type of data a PPU does, so what's the point? Graphics card technology has been developing faster and faster, and ATI has claimed that their cards support physics processing.

IMO, PhysX could never have taken off, past or future. In the past it would have failed because no one really did much physics; it wasn't thought of. In the future it will fail because CPUs and GPUs will end up doing all the number crunching.

Quote:
Original post by kburkhart84
What can we do by now as far as graphics? How much more realistic can we get as far as rendered graphics? In reality, not much.


Wow! Nice one. Maybe you should take a look out your window (it's actually kind of nice out there sometimes) and compare it with even the latest and greatest games. Sure, games have come a long way, but there's still plenty of development to be done. There are people researching real-time global illumination right now. If that's not a significant step up, I don't know what is.

I'm afraid I'll have to side with the "dead in the water" crowd. I have this hope that GPUs are heading towards becoming general stream processors, and once they reach that point, physics on GPUs should be practical.

With the way processors are heading toward ultra-multi-core designs with 100+ cores, it wouldn't surprise me to see graphics cards disappear in the next 10 years or so and merge into more general floating-point cells. Throw in a better-designed multi-memory bus and things should be great.

I think dedicated PPUs will probably end up finding a home in military simulation and the like, where their costs aren't as noticeable. In the end, we are likely to see something like DirectPhysics added to DirectX, transparently using the GPU or, if present, a PPU, making a proprietary PPU API sort of a non-issue.

Quote:
Original post by smitty1276
I think dedicated PPUs will probably end up finding a home in military simulation and the like, where their costs aren't as noticeable. In the end, we are likely to see something like DirectPhysics added to DirectX, transparently using the GPU or, if present, a PPU, making a proprietary PPU API sort of a non-issue.


I was led to believe that the PhysX API does this already.

I haven't looked into it too much, but from comments I have read here, the PhysX API could still be highly viable even without the PhysX cards.

The cards will go, but I think the API will stick around.

Quote:
I was led to believe that the PhysX API does this already.


I think it does, but if I understand correctly, it will only use the Ageia PPU. I'm not very familiar with it, but I don't think the PhysX API will utilize extra horsepower on your GeForce 8800, for example.

I can imagine a scenario, though, where Microsoft releases a DirectPhysics API that uses your Ageia PPU, or another manufacturer's PPU, or the GPU to pick up the slack, in a manner that is transparent to the developer. I think the future holds "DirectX 11.0 compatible" physics cards... or something like that, at least.
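The transparent-fallback idea imagined here amounts to a dispatch layer that picks the best available device at runtime. Everything below is hypothetical; no DirectPhysics API shipped in this form:

```python
def detect_devices():
    """Pretend probe; a real API would query the drivers.
    Tuple order encodes preference: PPU first, then GPU, then CPU."""
    available = {"ppu": False, "gpu": True, "cpu": True}  # assumed machine
    for device in ("ppu", "gpu", "cpu"):
        if available[device]:
            return device

def simulate_step(scene, device=None):
    """Step the same scene description on whichever backend was chosen;
    the caller never needs to know which one ran."""
    device = device or detect_devices()
    return f"stepped {scene} on {device}"

print(simulate_step("demo_scene"))  # falls back to the GPU on this machine
```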

Isn't the problem with GPU-based physics that GPUs are optimized for one-way communication, and thus are best for physics calculations that don't affect gameplay, but merely visuals? If GPUs in the next generation or two address this, then I could see GPUs being used for physics quite heavily. Barring that, I'm guessing that multi-core CPUs have the best future for physics.

I agree with SimonForsman; the advantage the PhysX cards have is significantly improving gameplay mechanics, but it's hard to design a game around gameplay mechanics that need hardware-accelerated physics and expect the game to make any money in the current market. And it's hard to design a game around gameplay mechanics that can make excellent use of hardware-accelerated physics but still plays well without it. If it really makes good use of the acceleration, then gameplay with acceleration is going to be noticeably different from gameplay without it. And if it weren't different enough, then they'd just drop the requirement for acceleration, and few people would care to buy the card.

I believe GPUs don't have enough branching capability to be useful for a number of the intense physics (and AI) operations that a chip like the PhysX can do. The bigger competitor for Ageia is actually quad-core CPUs, and/or the SPUs in the Cell, rather than GPUs, IMO.

Of course, the mindshare is all about GPU physics, even though that's not really the right way to do an end-to-end solution, again IMO.

It would be interesting to have one. Right now I'm building a new desktop as a graphics rig for my XSI software.
