Archived

This topic is now archived and is closed to further replies.

ALGOL->BCPL->C->C++->D?

www.digitalmars.com/d/ I'd click here instead

I'd say it is a bit like the empty restaurant. No one goes in while it is still empty. C++ works just fine for most people, so why are they going to switch when there is no demand? The only reason might be if other people are using it, which they aren't. Having said that, I haven't actually checked it out yet. Mob mentality rules

Trying is the first step towards failure.

Sorry about the link, but...

I read it in DDJ and it seemed to have a few changes in it. For example, they use 'modules' instead of .h files.

ragonastick has a point there.. There's no use mastering a new language in case you never get to use it.. I'll stick to C++.. At least until D becomes the industry standard, which I doubt will ever happen.. But that's just me...


Edited by - Rickmeister on February 2, 2002 1:18:49 AM

quote:
Original post by Andrew Nguyen
Maybe if it adds some groundbreaking thing or technology into it.

It's not though - it's like Java or C#: it's taking groundbreaking features away, in the interest of making the compiler easier to write.

Maybe it's trying to follow the tradition of the languages!
BCPL was a stripped-down version of ALGOL, B was a stripped-down version of BCPL, and C was a stripped-down version of B; C++ just wasn't following the tradition of stripping languages down to their core! So we have made D!

From now on, we will code with the 3 basic boolean gates, AND OR and NOT!

I find it extremely interesting to read Andrew Nguyen's posts. It seems that everything he posts is somehow related to the utter evilness of C++ and the perfections of other languages. He's created threads advocating Python, some sort of LISP dialect, and now D. The fact is, Andrew is not fluent in C++ by any means, which has been more than adequately demonstrated by the occasional C++ code snippets he provides us with (I remember seeing one in which he used a pointer to store the result of some boolean operations, in a thread that was shortly after deleted).

What I want to know is how Andrew is ever going to be a competent programmer when he spends more time looking for a "better" language than actually learning one. I think his time would be better spent learning to implement some of the things he would like to have instead of whining that there isn't a language there to spoon-feed him.

Andrew, I have no desire to upset you with this post. However, I would like to suggest that you be more productive with your time. I think many of us are a little tired of your "new language for every day of the week" posts.

Now, for some of my own feelings. I've really had enough of these new, "revolutionary" languages that claim to have all the answers, and that claim to solve all the problems that plague programmers. We don't need any new languages to help us program smarter. What we need is smarter programmers.

Let me repeat that: you can all quote me if you want.

"We don't need any new languages to help us program smarter. What we need is smarter programmers." --Brad Fish

As quoted from my friend Thomas Neil,

"This isn't even about bashin', it's about your thoughts on it!"

and

"How am I bashing anything?"

And, from me: how am I bashing C++? I was just wondering what your thoughts on D were!

I'm not pointing this at you, Andrew, since I have never really looked at your code, but here's my view. Frequent new languages act as a kind of Darwin award for stupid programmers, who spend so much of their time finding which bad language isn't bad that they never enter the real programming world. At least that's how I comfort myself. As for 'D', I don't see any reason to use it. C++ is standard, supported, ultra-documented, and overall a safer choice than D. That's my opinion, and that's all it is.

Alex Broadwin
A-Tronic Software & Design
-----
"If you fail in life, you were destined to fail. If you succeed in life, call me."
"The answer is out there."
"Please help, I'm using Windows!"

Point out where I said I loved it. All I did was read about it. I asked for thoughts on it. I did not say that I use it. (NOTE: I don't use D.) Now if you can post in that mindset, I would be grateful.

I gave you my answer. I don't see any reason to use it, and I gave the reasons I see not to use it. But I will post them again for your viewing pleasure.

REASONS TO STICK WITH C++
* C++ is standard
* C++ is widely supported
* C++ is ultra-documented
* C++ is overall a safer choice than D

For comparison, here's D:
* D is not standard
* D is not widely supported
* D is not ultra-documented
* D is not overall a safer choice than D (er, duh)

In case my view is not clear enough, I WILL NOT USE D. There you go.

Alex Broadwin
A-Tronic Software & Design
-----
"If you fail in life, you were destined to fail. If you succeed in life, call me."
"The answer is out there."
"Please help, I'm using Windows!"

Maybe a bit off topic, but are you the Nguyen I saw in a TV interview about the future of Silicon Valley?

Yesterday we still stood at the verge of the abyss,
today we're a step onward!

ATronic: "REASONS TO STICK WITH C++."

While those are valid reasons to use "C/C++" over "D", one should always consider what the best language for the job is.

To the vast majority of mankind, nothing is more agreeable than to escape the need for mental exertion... To most people, nothing is more troublesome than the effort of thinking.

I won't use D until it becomes a standard of the industry!

But how can D become a standard if no one gives it a chance?

Yesterday we still stood at the verge of the abyss,
today we're a step onward!

I think we should just code with NAND gates instead. Or maybe program for a Turing machine VM... it has great cross-platform compatibility.

Seriously though, I think you need to look at why some languages become standards and why some don't. I'm quite sure that while quality is important, there are other factors: mainly scratching an itch, marketing, and ease of use.

My guess is that if C++ had been developed with a syntax different from C (I don't see why or how, since it is called C++, which kind of indicates that there is something to do with C in there), so that C coders couldn't immediately transfer all their skills over, then there would have been a problem.

With D, there isn't much scratching of an itch. I'm not sure what the current itches are, but you don't see people complaining about C++ very much (unless you look at Andrew's posts).

With ease of use comes familiarity too. As an example, I use VB primarily, and as such I'm a half-decent programmer (if I don't say so myself). But I frequently see people who are used to programming in C++ who can't adapt to it; "I can't work without pointers" comes to mind. For me, it is fine (partially because I know how to work with pointers in VB): I taught myself to program, and I taught myself without pointers, so when solving problems there are my own unwritten personal "design patterns" which I use. Fortunately for me, I do understand pointers (sometimes, it seems, even better than C++ists), so when I am in a different language, I can solve problems differently. If the makers of D had decided that since VB gets by without pointers (not strictly true, but you get the idea) pointers are useless and got rid of them, then even if the language were superior, the mindset people use to program with would have to change a lot, which doesn't foster growth.

Trying is the first step towards failure.

Jesus Christ.

The so-called "language" sucks for all manner of reasons.

Mostly for the things it leaves out of C++.

e.g.:
quote:
The C preprocessor.

Whilst D folds some things from the CPP into the language itself (though the only notable addition over C++ is its import declaration), it doesn't have facilities to do everything the CPP can do.
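To make that concrete, here is a minimal sketch (macro and variable names are mine) of two CPP facilities, stringizing and token pasting, that a plain import mechanism does not replace:

```cpp
#include <cassert>
#include <string>

// Stringizing (#) and token pasting (##): textual operations the C
// preprocessor can do that a module/import system does not attempt.
#define STRINGIZE(x) #x
#define CONCAT(a, b) a##b

int CONCAT(my, Counter) = 41;  // pastes tokens: declares int myCounter = 41
```

For example, `STRINGIZE(hello)` yields the string literal `"hello"` without any quoting at the call site.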

quote:
Multiple inheritance

Just because MI is there doesn't mean people have to use it. It is sometimes the best solution to a problem, and interfaces are not a suitable alternative. Interfaces, by their very nature, prevent the sharing of implementation amongst classes. This necessarily requires copy-and-paste programming, which is one of the things OO is designed to avoid.
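A minimal sketch of that point (class names are mine, not from the D spec): with MI, two bases can each contribute working code, where Java/D-style interfaces would only contribute signatures:

```cpp
#include <string>

// Each base carries a real implementation, not just a signature.
class Serializable {
public:
    virtual std::string name() const = 0;
    std::string serialize() const { return "<" + name() + ">"; }  // shared code
    virtual ~Serializable() = default;
};

class Printable {
public:
    virtual std::string name() const = 0;
    std::string print() const { return "[" + name() + "]"; }      // shared code
    virtual ~Printable() = default;
};

// Multiple inheritance lets one class reuse both implementations;
// with interfaces, serialize() and print() would be copy-pasted.
class Widget : public Serializable, public Printable {
public:
    std::string name() const override { return "widget"; }
};
```

One override of `name()` in `Widget` satisfies both bases, and each base's shared method dispatches to it.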

quote:
Creating object instances on the stack.

The stack has some important benefits over the heap and the free store -- notably, it's significantly faster to allocate memory on the stack than it is anywhere else in the system. Some bits of code need that performance. There is no reason to take away the ability to make that decision.
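For illustration (function names are mine), the same computation with a stack object versus a free-store object -- the stack version needs no allocator call at all:

```cpp
struct Point { int x, y; };

int sum_stack() {
    Point p{3, 4};  // stack allocation: just a stack-pointer adjustment,
                    // released automatically on return
    return p.x + p.y;
}

int sum_heap() {
    Point* p = new Point{3, 4};  // free-store allocation: a call into the
    int s = p->x + p->y;         // allocator, plus an explicit (or
    delete p;                    // GC-driven) release later
    return s;
}
```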

It then says:
quote:
This eliminates the need for copy constructors,

Which is wrong. It doesn't. Even heap/free-store objects might be copies of existing objects. For instance, imagine I had a mutable string class and I wanted a function to modify a copy of it. Even with free-store objects and references:
  
void myFunction(string& s)
{
    string* t = new string(s);  // copy-constructs *t from s
    // do something to *t

    // assume it gets garbage collected
}

I need a copy constructor -- to construct the object referred to by t such that it matches the object referred to by s.

quote:
assignment operators, complex destructor semantics,

Granted, there are issues about what happens when you throw from a constructor, but they exist in D anyway. The normal case -- stack objects are destroyed as they pass out of scope in the reverse order to their creation -- is simple, and beats the pants off of non-deterministic finalization any day of the week.
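A small sketch of that deterministic behaviour (the Tracer class and trace string are mine): destructors run in the exact reverse of construction order, at a known point:

```cpp
#include <string>

static std::string trace;

struct Tracer {
    char id;
    explicit Tracer(char c) : id(c) { trace += id; }
    ~Tracer() { trace += char(id - 'a' + 'A'); }  // uppercase marks destruction
};

void scope() {
    Tracer a('a'), b('b');  // constructed in order: a, then b
}                           // destroyed at the closing brace: b, then a
```

After calling scope(), trace reads "abBA" -- construction in declaration order, destruction in reverse, with no finalizer queue involved.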

quote:
and interactions with exception handling stack unwinding

Who cares? That's something for the compiler-writer to deal with. Crippling the language just to make writing the compiler a little easier (it's a problem that's been solved by many people on many occasions, so it can't be that hard) is asinine.

quote:
Operator overloading

Operator overloading lets my classes work like native types. That is an excellent feature. It means that people can use my classes without having to spend a long time learning how they work. No one is forced to use overloaded operators, and there is nothing to be gained from removing them.
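For instance (a toy fraction class of my own devising), overloading + and == lets the type read like arithmetic on a built-in:

```cpp
// A tiny rational-number value that behaves like a native numeric type.
struct Frac {
    int num, den;
    Frac operator+(const Frac& o) const {
        return {num * o.den + o.num * den, den * o.den};
    }
    bool operator==(const Frac& o) const {
        return num * o.den == o.num * den;  // cross-multiply comparison
    }
};
```

Client code can then write `Frac{1,2} + Frac{1,3}` instead of something like `add(half, third)`, with nothing new to learn.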

quote:
Non-virtual member functions

Whilst non-virtual member functions are a mixed blessing, there is genuine value in allowing them. Simply put, v-tables aren't free. So don't make me use them if I don't want them.
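A sketch of the cost (struct names are mine): on common implementations, adding a single virtual function grows every object by a vtable pointer:

```cpp
struct Plain {                 // no virtual functions: no vtable pointer
    int x;
    int get() const { return x; }
};

struct Virtual {               // one virtual function forces a vptr per object
    int x;
    virtual int get() const { return x; }
    virtual ~Virtual() = default;
};
```

On a typical 64-bit ABI, Plain is 4 bytes while Virtual is 16 (vptr plus padding) -- a real per-object tax when you didn't need dynamic dispatch.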

quote:
Bit fields of arbitrary size. Bit fields are a complex, inefficient feature rarely used.

They're rarely used, but they're no less efficient than the widespread practice of manually masking off bits, and they're certainly easier to use. They are not a bad feature, and they are useful.
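To illustrate (the field layout of bit fields is implementation-defined, and these names are mine): the compiler emits the same mask-and-shift code you would otherwise write by hand:

```cpp
#include <cstdint>

// Bit fields: the compiler does the masking and shifting for you.
struct Flags {
    std::uint8_t mode  : 2;   // values 0..3
    std::uint8_t ready : 1;
    std::uint8_t count : 5;   // values 0..31
};

// The manual equivalent packs the same data by hand; exact bit positions
// here are my choice, not necessarily how Flags is laid out.
inline std::uint8_t pack(unsigned mode, unsigned ready, unsigned count) {
    return static_cast<std::uint8_t>((mode & 0x3) | ((ready & 0x1) << 2)
                                     | ((count & 0x1F) << 3));
}
```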

And some of the reasoning is gibberish.

For instance, trigraphs and digraphs are omitted because:
quote:
Wide char is the modern solution to international character sets.

Which is pig ignorant.

Trigraphs and digraphs are a solution to the problem of certain terminal keyboards not having certain characters on them; for instance, []{}#|.

Trigraphs provide a moderately convenient mechanism for using those characters if they do not exist on the keyboard. They are not used -- or even usable (as they get translated by the preprocessor) -- as a mechanism for solving internationalization issues. Just because we have wide characters does not mean that every keyboard can type all the characters the language requires.
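For reference, C++ retains digraphs for exactly this purpose (trigraphs were later removed from the language in C++17). A sketch -- hardly anyone writes these deliberately, but they compile:

```cpp
// Digraphs: alternative spellings for keyboards missing [ ] { } #.
// <: :> stand for [ ], <% %> for { }, and %: for #.
int first(const int arr<:3:>) <%
    return arr<:0:>;  // i.e. return arr[0];
%>
```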

The language does have some nice aspects. It has try/catch/finally, which removes the need to create dummy objects to get finally semantics. It has design-by-contract features (whilst I write my precondition/postcondition/invariant blocks by hand, it might be useful to have some kind of compiler support -- though it doesn't do anything that can't easily enough be done with a macro). And the unittest feature is an interesting idea that might be useful, though again, it doesn't require any special language support and could easily enough be mimicked in C++.




The language has few new features, many of those can be done reasonably well in C++ anyway, and the omissions are unforgivable.

So, all in all, it doesn''t look particularly nice.

A REALLY stupid thing is how it changes this
#define VALUE 5
to this
int VALUE=5;

As we all know, this will only waste space, make you create stack space when you don't need it and, more importantly, slow the program down with additional memory calls.

There are lots of flaws in this thing that I can see. It would really degrade performance.

Another thing it does is claim that compilers don't do things which in reality they do.
E.g. it says that D will automatically handle inlining and register allocation, but C compilers don't. Guess what Andrew - go and look in the project options in Borland (what you use), and it even says in there that it does. It has done this for many years, since before D was even started.



Beer - the love catalyst
good ol' homepage

quote:
Original post by Dredge-Master
a REALLY stupid thing is how it changes this
#define VALUE 5
to this
int VALUE=5;

as we all know, this will only waste space, make you create stack space when you don't need it and more importantly, slow the program down with additional memory calls.

No it doesn't.

For a start, you should be doing this in C++ already.
Second, it should be const int, not just int.
Third, using #defines doesn't necessarily eliminate stack usage, as people often use them as parameters and things, where they may well get copied around and such.
Fourth, your compiler will almost certainly eliminate the variable, and use the instruction forms that take a number directly.

Requiring people to use const int is no bad thing at all.
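To make the fourth point concrete (the function name is mine): a const int is folded into the generated code just like the macro, so nothing gets slower:

```cpp
#define VALUE_MACRO 5        // textual substitution: no type, no scope
const int VALUE = 5;         // typed and scoped; any reasonable compiler
                             // folds this into an immediate operand, so it
                             // costs no extra memory traffic at run time

int area(int w) { return w * VALUE; }
```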

quote:
There are lots of flaws in this thing that I can see. It would really degrade performance.

Other things it does is say that compilers don't do things, which in reality they do.
eg; it says that D will automatically set inline and register calls, but C compilers don't. Guess what Andrew - go and look in the project options in Borland (what you use), and it even says in there that it does. It has done this for many years, before D was even started.

They don't have to. I think the real point is to get rid of the register and inline keywords (which arguably do not belong in the C++ language anyway -- if you really want to hint to the compiler, you should use a #pragma directive or similar) and state that the compiler *must* optimize these things for itself (rather than the current C++ situation, whereby the standard says you can specify but most compilers will ignore you -- certainly for register, less so for inline).

On the other hand, he then goes and inserts something into the language that C++ has the sense to leave out. Rather than using a #pragma to specify alignment of struct members, he introduces a keyword.

This kind of thing should not be in the language.

If not with some kind of pre-processor-style directives (just because they're called #pragmas doesn't mean the compiler itself can't process them) then with .NET-style attributes.
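For comparison, the de facto (non-standard, but widely supported) #pragma approach in C++ -- the struct names are mine:

```cpp
// #pragma pack is the common C++ way to control struct member alignment;
// it is a vendor extension, not part of the standard, which is the point:
// D instead promotes "align" to a language keyword.
#pragma pack(push, 1)
struct Packed {                // no padding between members
    char tag;
    int  value;
};
#pragma pack(pop)

struct Unpacked {              // default alignment pads after tag
    char tag;
    int  value;
};
```

On mainstream compilers sizeof(Packed) is sizeof(char) + sizeof(int), while Unpacked carries alignment padding.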

DrPizza - I mostly agree with you there, but on "Non-virtual member functions", the intention was that the compiler would choose whether each function should be virtual or not.

I suggest you post your arguments on their newsgroup.
