Sagar_Indurkhya

C# vs. C++ for Scientific Computing

Yeah, so I noticed that a lot of people use FORTRAN, C, and C++ for scientific stuff. I think a big reason is that much of the existing code base is in those languages. But I recently read an article that gives some good reasons for C#'s candidacy: http://msdn.microsoft.com/msdnmag/issues/04/03/ScientificC/ Some of the good reasons are portability, the availability of Decimal (a 128-bit floating-point type), and so on. Also, JIT compilation works better for applications that run for a long time. And unlike Java, C# does let you use pointers and unsafe code.
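
For illustration (this isn't from the linked article), the kind of unsafe pointer loop I mean looks roughly like this; the array and the Sum function are made up for the example, and the file has to be compiled with the /unsafe switch.

using System;

class UnsafeSumDemo
{
    // Sums an array through a raw pointer instead of the usual indexer,
    // which avoids the per-access bounds check inside the loop.
    static unsafe double Sum(double[] data)
    {
        double total = 0.0;
        // "fixed" pins the managed array so the GC cannot move it
        // while we hold a pointer into it.
        fixed (double* p = data)
        {
            for (int i = 0; i < data.Length; i++)
                total += p[i];
        }
        return total;
    }

    static void Main()
    {
        double[] samples = { 1.0, 2.5, 3.25 };
        Console.WriteLine(Sum(samples)); // prints 6.75
    }
}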

Hi there Sagar_Indurkhya,

I'm new to C#, by the way, but from what I've seen of it I'd say it can be used for scientific work if one really wanted to.

I don't know what else is required of a language for it to be considered for scientific use, but as you said, the 128-bit decimal data type that C# has is one good reason, because that kind of precision isn't found in float or double.

As for pointers and unsafe code, I don't know what application they would have in a scientific program; that's not to say they don't, I'm just saying I don't know! Surely a scientific program is one that carries out scientific tasks, so if it can be coded in one language, why not another? It's all about how you design and write the code to solve your specific problem. C# has that one advantage of 128-bit precision for decimal numbers and a host of mathematical functions, but besides that I can't tell you why else it should or shouldn't be used.
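
To make that precision point concrete, here's a tiny made-up example (not from any documentation): double is binary floating point, so 0.1 can't be represented exactly, while decimal works in base ten and keeps the same sum exact.

using System;

class DecimalVsDouble
{
    static void Main()
    {
        double  d = 0.1 + 0.2;     // binary floating point
        decimal m = 0.1m + 0.2m;   // 128-bit decimal type

        Console.WriteLine(d == 0.3);   // False: the double sum carries a rounding error
        Console.WriteLine(m == 0.3m);  // True: the decimal sum is exact
    }
}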

There might be some problems in the language that I don't know of, maybe performance issues that wouldn't be good for scientific applications, but then again I don't know; I'm just guessing :P


DarkStar
UK

I'm fairly certain that in most benchmarks, math was the main area where .NET was significantly slower. Floating point and trigonometry were the worst, IIRC.

Of course, that could be fixed by now.

Guest Anonymous Poster
Essentially, if you can use Java (and take the speed hit), then C# is a perfect candidate.

If you want really optimized code, then use C/C++ or another language with compile-time optimization.

Reason 1: C# is practically non-portable (at the moment, MS only). Powerful scientific machines and clusters run Unixes and Linuxes, not Windows, and have no .NET (and it would take a lot of time to implement good JIT compilers for them).

"Decimal" ,guessing by name, probably should be not 128 bit but some weird and slow kind of BCD, at least if it's properly named. And, in C++ 128-bit precision is not a problem at all. In c++ you have no benefit to have it built-in, custom implementation is as fast. If in c#, custom implementation is slower than built-in, 128-bit numbers is not exactly adwantage, need to have it built-in is opposite.

And of course performance.

Also, how could I separate my integration method (RK4) from my simulator in C#? I guess it's harder than in C++, because there's probably no reliable way to use C++'s memory hacks for it.
All those nice features OOP gives you for modularity often offer very little benefit for scientific programs, because they're sometimes orthogonal to the problem. For example, OOP doesn't make it significantly easier to separate the integrator from the simulator; it only helps a bit in making "packages" out of them and in switching integrators at runtime (something that is probably never really needed in practice).
But science may switch to C# under the pressure of not having enough competent C++ programmers...

In summary: C# was mainly not designed for science. That is, this was the weakest side of C#, and probably still is; optimizing the JIT for math probably isn't a big priority on their "to do" list.
Yes, it's probably not a defect of the language, but there may be defects in the VM design (say, JVM bytecode by design isn't all that good for science).
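
For reference, the interface-based version of that integrator/simulator separation would look something like the rough sketch below; the names (IDerivativeModel, Rk4Integrator, Oscillator) are invented for the example, and whether it buys anything over plain C++ function pointers or templates is exactly the question.

using System;

// The model only exposes its derivative function dy/dt = f(t, y).
interface IDerivativeModel
{
    double[] Derivative(double t, double[] y);
}

// Classic fourth-order Runge-Kutta step; knows nothing about any particular model.
class Rk4Integrator
{
    public double[] Step(IDerivativeModel model, double t, double[] y, double h)
    {
        double[] k1 = model.Derivative(t, y);
        double[] k2 = model.Derivative(t + h / 2, Add(y, Scale(k1, h / 2)));
        double[] k3 = model.Derivative(t + h / 2, Add(y, Scale(k2, h / 2)));
        double[] k4 = model.Derivative(t + h, Add(y, Scale(k3, h)));

        double[] next = new double[y.Length];
        for (int i = 0; i < y.Length; i++)
            next[i] = y[i] + h / 6 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i]);
        return next;
    }

    static double[] Scale(double[] v, double s)
    {
        double[] r = new double[v.Length];
        for (int i = 0; i < v.Length; i++) r[i] = v[i] * s;
        return r;
    }

    static double[] Add(double[] a, double[] b)
    {
        double[] r = new double[a.Length];
        for (int i = 0; i < a.Length; i++) r[i] = a[i] + b[i];
        return r;
    }
}

// Example model: a simple harmonic oscillator, y = (position, velocity).
class Oscillator : IDerivativeModel
{
    public double[] Derivative(double t, double[] y)
    {
        return new double[] { y[1], -y[0] };
    }
}

class Demo
{
    static void Main()
    {
        Rk4Integrator integrator = new Rk4Integrator();
        IDerivativeModel model = new Oscillator();
        double[] state = new double[] { 1.0, 0.0 };
        double h = 0.01;
        for (int step = 0; step < 100; step++)
            state = integrator.Step(model, step * h, state, h);
        // After integrating to t = 1, position should be close to cos(1) ~ 0.5403.
        Console.WriteLine(state[0]);
    }
}

The integrator and the model only meet at the interface, so either can be swapped out without touching the other; that's the packaging benefit, nothing more.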

Quote:
Original post by Dmytry
Reason 1: C# is practically non-portable (at the moment, MS only). Powerful scientific machines and clusters run Unixes and Linuxes, not Windows, and have no .NET (and it would take a lot of time to implement good JIT compilers for them).

Utterly wrong. Before making incorrect claims, please take note of the DotGNU and Mono projects. I have tested C# applications running on various Linux distros as well as FreeBSD. There are even some features implemented in those that Microsoft has yet to get working.

C# is also a standard, along with the CLI. It was submitted to the standards body not only by Microsoft but by other companies as well.

Quote:

"Decimal" ,guessing by name, probably should be not 128 bit but some weird and slow kind of BCD, at least if it's properly named.

Wrong.

Quote:
MSDN
The decimal keyword denotes a 128-bit data type. Compared to floating-point types, the decimal type has a greater precision and a smaller range, which makes it suitable for financial and monetary calculations.

Range: ±1.0 × 10^−28 to ±7.9 × 10^28
Precision: 28-29 significant digits


Quote:

But science may switch to C# under the pressure of not having enough competent C++ programmers...

And this is where you are dead wrong. Higher-level languages actually allow for BETTER scientists, because they can concentrate on the problem at hand and on what they do best, not on how to manage memory, create event systems, and other things unrelated to their field.

I am not saying that C# would be the best for this, just that higher level languages in general will be.

Quote:

In summary: C# was mainly not designed for science.

C# was designed for writing managed applications on top of the CLI/CLR. It is not meant for operating systems, drivers, etc., but is specialized for application building.

Quote:

That is, this was the weakest side of C#, and probably still is; optimizing the JIT for math probably isn't a big priority on their "to do" list.

Do you just make these things up as you go along? Where do you get this info about optimizing the JIT for math? It's odd, because according to their developers and the advancements I've already seen between C# 1.0 and 2.0, you are utterly wrong.

Quote:

Yes, it's probably not a defect of the language, but there may be defects in the VM design (say, JVM bytecode by design isn't all that good for science).

Defects in VM design? You do realize that C# does not run in a VM, right?

Another thing to take note of is that Decimal isn't quite the same as float or double. It was written primarily for currency operations, because it avoids the binary rounding errors of float and double, but it also doesn't allow for large exponents. Here's an excerpt from Microsoft's documentation:

"The binary representation of an instance of Decimal consists of a 1-bit sign, a 96-bit integer number, and a scaling factor used to divide the 96-bit integer and specify what portion of it is a decimal fraction. The scaling factor is implicitly the number 10, raised to an exponent ranging from 0 to 28.

Therefore, the binary representation of a Decimal value is of the form ((−2^96 to 2^96) / 10^(0 to 28)), where −2^96 is equal to MinValue, and 2^96 is equal to MaxValue."

Operations on Decimals are also quite a bit slower than on floats/doubles, so overall the Decimal struct wasn't really meant for scientific math, but more for financial calculations. Just my two bits worth.
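
A tiny illustration of that layout (my own example, not from the docs): Decimal.GetBits hands back four 32-bit integers, three holding the 96-bit mantissa and one holding the sign bit and the base-10 scale.

using System;

class DecimalBitsDemo
{
    static void Main()
    {
        decimal value = 123.45m;
        int[] bits = decimal.GetBits(value);   // { lo, mid, hi, flags }

        int scale = (bits[3] >> 16) & 0xFF;    // power of ten the mantissa is divided by
        bool isNegative = (bits[3] & int.MinValue) != 0;

        // 123.45 is stored as the integer 12345 with a scale of 2.
        Console.WriteLine("mantissa (hi, mid, lo): {0}, {1}, {2}", bits[2], bits[1], bits[0]);
        Console.WriteLine("scale: {0}, negative: {1}", scale, isNegative);
    }
}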

EDIT: Sorry, I guess Saruman already kinda touched on this.

Quote:
Original post by psamty10
This probably explains why C++ is emerging over C#, since it's easy to integrate with Python.


Actually, Python would push C#, not C++.

Currently, Python.NET is being worked on by a couple of groups and is at a fairly advanced stage. Any application written in Python.NET will integrate seamlessly with any C# application.

So I don't see that as one of the reasons C++ has a larger user base than C#. Note that MANY places, including governments, scientific research labs, and so on, will not use a language or platform until it has fully proven itself; look at how long it took even game developers to start using C++ over C. THAT alone is the single biggest reason a lot of major development shops will not be using C# right now.
