DaBookshah

Double arithmetic overflow question

Recommended Posts

Hi there. As part of a project of mine (using C#, Visual Studio Pro 2005), I need to be able to detect inaccuracy in double-valued arithmetic. For example, if I calculate a + b, I want to know whether any rounding or information loss occurred in the calculation. Obviously that's the simplest case, but I want to do the same for multiplication, subtraction, and division too. I'm just posting here to check whether there is a solution in C#. Failing that, I probably know enough assembly to use the flags set by the calculation, but that's a bit yucky. Any help appreciated.
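For reference, addition and subtraction admit a simple round-trip test: if the computed sum can reproduce both operands exactly, no rounding occurred (assuming the sum doesn't overflow). A minimal sketch in plain C#; the class and method names are just for illustration:

using System;

static class FloatChecks
{
    // True if a + b was computed with no rounding error.
    // Valid as long as the sum does not overflow to infinity.
    public static bool IsAdditionExact(double a, double b)
    {
        double s = a + b;
        if (double.IsInfinity(s)) return false;
        // If the sum is exact, subtracting either operand from it
        // must give back the other operand exactly.
        return (s - a == b) && (s - b == a);
    }

    static void Main()
    {
        Console.WriteLine(IsAdditionExact(0.5, 0.25)); // True: 0.75 is exactly representable
        Console.WriteLine(IsAdditionExact(0.1, 0.2));  // False: the true sum falls between two doubles
        Console.WriteLine(IsAdditionExact(1e16, 1.0)); // False: the 1.0 is rounded away entirely
    }
}

Multiplication can be checked in a similar spirit with an error-free transformation (Dekker's TwoProduct splitting, or on much newer runtimes a single Math.FusedMultiplyAdd(a, b, -(a * b)), which is zero exactly when a * b needed no rounding, ignoring overflow and underflow corner cases).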

There's no real way to avoid precision loss, and no simple way to determine when it's happened (which is basically all the time).

Instead, the best strategy is to arrange your computations so that you lose as little precision as possible.
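Compensated (Kahan) summation is a classic example of that kind of rearrangement; a rough sketch, with the values and names made up purely for illustration:

using System;

static class KahanDemo
{
    // Compensated (Kahan) summation: carry the rounding error of each
    // addition forward in a separate correction term.
    public static double KahanSum(double[] values)
    {
        double sum = 0.0;
        double c = 0.0;                 // running compensation for lost low-order bits
        foreach (double v in values)
        {
            double y = v - c;           // apply the correction from the previous step
            double t = sum + y;         // big + small: low-order bits of y can be lost here
            c = (t - sum) - y;          // recover what was just lost
            sum = t;
        }
        return sum;
    }

    static void Main()
    {
        double[] data = new double[10000];
        for (int i = 0; i < data.Length; i++) data[i] = 0.1;

        double naive = 0.0;
        foreach (double v in data) naive += v;

        Console.WriteLine(naive);          // drifts noticeably away from 1000
        Console.WriteLine(KahanSum(data)); // much closer to the exact sum
    }
}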

Otherwise, you can of course implement your own datatype to replace double (or use an existing library for it). It'll be slower, but it would let you work with numbers of arbitrary precision.
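One cheap way to go down the replacement-datatype route, if you can move past .NET 2.0, is an exact rational built on System.Numerics.BigInteger (added in .NET 4). A rough sketch, not production code:

using System;
using System.Numerics;   // BigInteger is available from .NET 4 onward

// Minimal exact-rational type: arithmetic on it never rounds, at the
// cost of speed and of numerators/denominators that keep growing.
struct Rational
{
    public readonly BigInteger Num;
    public readonly BigInteger Den;

    public Rational(BigInteger num, BigInteger den)
    {
        if (den.IsZero) throw new DivideByZeroException();
        if (den.Sign < 0) { num = -num; den = -den; }   // keep the denominator positive
        BigInteger g = BigInteger.GreatestCommonDivisor(num, den);
        Num = num / g;
        Den = den / g;
    }

    public static Rational operator +(Rational a, Rational b)
    {
        return new Rational(a.Num * b.Den + b.Num * a.Den, a.Den * b.Den);
    }

    public static Rational operator *(Rational a, Rational b)
    {
        return new Rational(a.Num * b.Num, a.Den * b.Den);
    }

    // Subtraction and division follow the same pattern.

    public override string ToString()
    {
        return Num + "/" + Den;
    }
}

class Program
{
    static void Main()
    {
        Rational a = new Rational(1, 10);   // exactly 1/10
        Rational b = new Rational(2, 10);   // exactly 2/10
        Console.WriteLine(a + b);           // prints 3/10, with no rounding anywhere
    }
}

decimal is another option if you just need more precision rather than exactness, but keep in mind that it still rounds; it just does so in base 10.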
