Quote: Original post by hplus0603
Do you know of an extensive study of this to see to what degree chip makers comply with the IEEE standard? This is obviously not the sort of thing you'd risk basing the network model for a large game project on without a great deal of testing.

Quote: Same exe, different machines.
That means it's a problem with the exe. You have to make sure that the internal precision is set to 64 bits (not 80, because only Intel implements the 80-bit extended format), and that the rounding mode is consistent. Furthermore, you have to re-check this after calls into external DLLs, because many of them (Direct3D, printer drivers, sound libraries, etc.) will change the precision or rounding mode without setting it back.
Also, you can't use SSE or SSE2 for floating point, because it's too under-specified to be deterministic.
One might easily imagine a slight bug in a transcendental function on a VIA processor going undetected for a long time. Or, more likely, an emulator skimping a bit on floating-point precision for performance reasons: think of what might have happened if the Itanium had taken off, of PS2 backwards compatibility on the PS3, or of a PPC Mac user running Virtual PC.
But then again, perhaps they test these sorts of things extensively nowadays; at the very least, I don't see Intel forgetting the Pentium FDIV debacle anytime soon.