# C or C++ or C#

## Recommended Posts

Isn't this getting a bit too far off the original topic? The OP wanted to know which language to start learning: C, C++ or C#.
He didn't ask which was closer to the operating system or which would teach the most about computer architecture.
C# is good enough for a beginner. If later down the line you want to change to C or C++, you'll be able to pick those languages up very quickly.
Most things that you can do in C you can also do in C#, apart from the really low-level stuff, which you are less and less likely to need as operating systems become more and more abstracted away from the hardware.

##### Share on other sites
OP: to learn about coding, learn all three. And learn them correctly.

My ideal sequence of learning would be to start with a simple functional language such as Caml Light to familiarize yourself with elementary algorithms without all the clutter of real-world industry languages. Once you learn how to do things in the small, tight and controlled environment of a very simple language, you can move on to learn more detailed concepts. The next step would be object-oriented development, as part of C# (Java is fine, but I find it more frustrating due to the lack of first-class functions). Then, learning C, Scheme and Prolog for the sheer cultural importance. And finally, once you have the maturity, learning C++—you need to be a lawyer to understand how it works.

Quote:
 Original post by Hodgman: C pointers are numbers, Java pointers are numbers in bubble-wrap. Hence, to learn low-level computing concepts, C is probably more appropriate than Java.

In the C language, Pointers. Are. Not. Numbers. This is possibly the greatest misconception about the C memory model that could be imagined.

A C pointer is either null, or a combination of a buffer B and an offset O, with the constraint that 0 &lt;= O &lt;= sizeof(B); it also has a type T. Notice that pointers do not necessarily contain a number (a null pointer contains no numbers at all), and that non-null pointers also reference the buffer within which they exist, in addition to their offset within that buffer. So the ANSI sayeth.

This is not merely theoretical rambling. In real life, on a segmented memory model (such as the one in Joe User's 2007 x86 processor), pointers are represented as a 48-bit segment + offset combination, stored separately in a 32-bit offset register and a 16-bit segment register. If you consider C pointers to be just numbers (which they aren't), then you are actually sweeping low-level concepts under the rug, instead of learning about them.

In short, some hardware uses numbers to represent memory addresses (the GeForce 8800 uses uint32_t for this), some hardware uses a segmented memory model, and most operating systems layer virtual memory, paging, swap, and memory-mapped devices and files on top, which muddles things further.

This, at least to me, shows that the C memory model and the Java memory model are both woefully unrepresentative of how the low-level memory models (yes, there are several, at least on the x86) actually work. If anything, the C memory model is a weak approximation of the x86 segmented mode, and the Java memory model is a weak approximation of the x86 protected mode, which hardly qualifies either Java or C as more representative of lower-level concepts than the other.

Ironically, I could even argue that Java tells us more about how typical memory models work than C ever could. This is because almost every modern operating system in existence provides its users with virtual memory. The system allocates memory pages when the program needs them, and may reclaim them when the program doesn't need them anymore, or may move them around in actual physical memory, for instance swapping them to disk to free memory for other processes, or even saving them to disk for hibernation and restoring them later on. All of this "memory moves around and is garbage-collected" mumbo-jumbo is neatly and cleanly abstracted away by both the Java and the C memory models, which both say "here's a pointer to something; you can always access that something through this pointer". However, by making garbage collection of unused memory, and the movement of memory buffers during garbage collection, an explicit part of the Java way of life, Java gives a little insight into how virtual memory works, something that C could never hope to explain.

##### Share on other sites
Sorry everyone, I didn't mean for this to happen. I did read through all your posts and learned a few things. Thank you everyone for posting; I believe I have received my answer from various posts.

I'll be sure to post at GD again soon with another problem, probably after I have progressed a bit. Thanks again.

##### Share on other sites
Since this thread is already derailed beyond any imaginable chance of getting back on target, I have a simple question...

Many of you who have contributed to completely derailing this thread are the same people who consistently complain about the dilemma of people derailing threads. wt?

- Ricky

##### Share on other sites
Quote:
 Original post by PaulCesar: Since this thread is already derailed beyond any imaginable chance of getting back on target, I have a simple question... Many of you who have contributed to completely derailing this thread are the same people who consistently complain about the dilemma of people derailing threads. wt? - Ricky
For what it's worth, I don't consider this thread derailed at all.

##### Share on other sites
As an all-around programmer, learn them all.

As for which is best to use:

There are no silver bullets in programming. Each problem you encounter has its own "best" solution. After learning all of them, you will be able to choose which language to use for what.

So before you choose a language from your tool kit you should ask, "What exactly is it that I want?".

How much control over memory do you want? How large is the project? Will there be extensions to said project in the future? All of these questions and more should go into deciding which language you wish to use.

So, in short: learn them all for the sake of having more tools. After doing this you will probably know which tool best fits your problem.

Chris

##### Share on other sites
It appears I've sparked a bit of controversy. In hindsight, I regret the phrase about "Learning how the computer works as a machine." -- In reality, I meant that C exposes a lower-level, logical model of the machine (not a physical one, as the word machine would imply).

As has been noted, everything above the microcode layer is an abstraction, technically speaking. So I think the question is really this: At what level of abstraction do you find the optimal position for a beginner to enter into the world of programming?

For people of the bottom-up mindset, I believe that position to be C (or another small, procedural or even functional language) for 95% of them, with assembly picking up the remainder. The top-down crowd would be more at home with C# or Java, or even C++ (though again, I don't recommend it as a first language).

Why do I believe that C is the optimal beginner language for so many people? Because it introduces programming concepts (storage, conditionals, loops, etc.) at a sufficiently high level to be friendly, but doesn't introduce a lot of high-level concepts of its own in the way that Java, C# or even C++ does. Null-terminated strings are the only abstraction that comes to mind (and I do agree that null termination was a bad design choice in the long term, probably driven by the memory constraints of the time). At the same time, it exposes the low-level workings of the computer (again, as a logical model, not the physical machine) and lets programmers create their own higher-level abstractions.

C was, after all, explicitly designed as "a portable assembly language" -- the lowest convenient abstraction above disparate architectures. And while I also believe that a modern reinterpretation of C might benefit by including concepts such as multiple processors and the memory hierarchy, I'm glad C itself does not -- whether by design or glorious accident -- because it could not have served its position as least common denominator if it had.

##### Share on other sites
Quote:
 Original post by ravyne2001: It appears I've sparked a bit of controversy. In hindsight, I regret the phrase about "Learning how the computer works as a machine." -- In reality, I meant that C exposes a lower-level, logical model of the machine (not a physical one, as the word machine would imply).

Given the extremely complex set of failure cases in C's model, I'd argue that higher-level languages (by virtue of having sane, defined behavior for such things) actually expose a simpler, saner, lower-level logical model than C does!

In fact, this is the exact reason I don't recommend C! I think a beginner should be able to give their full, undivided attention to learning programming, based on a simple, sane, dependable logical level, rather than learning the morass of complexities involved with that genuine-imitation-real-fake-computer-"logic"-programming-model that layers atop that concept to implement it.

I'd argue one should take it bottom-up from the logical model, rather than from the hacked-up, legacy, once-upon-a-time-physical* model. And C's model is anything but logical. C's model is all about the statistical analysis of mapping failure cases to probable causes, to such an extent that it's really more closely related to lawyers and the court system than it is to mathematicians.

Modern logic, modern hardware, or a legacy model that's a terrible substitute for either. That's how I see the choices.

(*I remember a time when we used near and far pointers, and when compilers paid heed to the register keyword.)

##### Share on other sites
This is getting out of hand; let's return to the origin of this off-topic argument:
Quote:
 Original post by ravyne2001: On the other hand, C is a great language to learn how the computer works as a machine, because it provides no high-level abstractions of its own and essentially frees the programmer to do whatever they wish (hardware/OS restrictions aside). You can, for example, implement features in C that are commonly found in OO languages like C++ or C#, such as polymorphism. C can give you a window into the workings of things that you might take for granted in another language. Another plus is that C is a much smaller language than, for example, C++ (in both the number of keywords and the scope of its built-in features), so you can get your head around it more quickly.

Ravyne's statement that "C is a great language to learn how the computer works as a machine" is the genesis of this off-topic debate.

This statement can be interpreted a few ways, such as:
A) Learning C will teach you how a computer works as a machine.
B) If you are learning how a computer works as a machine, then C will aid you in your studies.

Anyone with a degree in computing will know that statement (A) is false, and seeing as Ravyne isn't a newb, it's pretty safe to assume that he intended this statement to be interpreted as something closer to (B), especially when reading the surrounding context of the statement.
However, cshowie assumed that Ravyne actually meant (A) and decided to start a debate about whether this misinterpretation of Ravyne's argument is false or not.

Any debate about statement (A) is useless, because noone actually made that statement. Debate based on this controversial statement should be aimed at whether C is a useful aid for students of computer science, not whether C is a useful teacher of computer science!

Nathan Baum then joined the debate, backing up cshowie on his irrelevant argument against the non-existent statement (A). As I already mentioned, these additions to the debate are useless, because there is only one side to this debate (cshowie's view VS cshowie's misinterpretation of ravyne's view).

Baum's comments aimed at me were not actually against statement (B), they were against statement (A). Seeing as I was putting forth an argument in support of statement (B) (that C is lower level than C#, therefore suitable for implementing lower level concepts), and that we all already know that statement (A) is not only wrong, but irrelevant, I found his comments to be of a condescending tone (Just as if during an argument about how to best slice a pizza, your opponent started explaining to you how to tie your shoes as if you were a child).

Ok? So if you're arguing that learning C will not teach you computer science, then you can stop now, because noone has or is supporting that statement.

Quote:
 Original post by Crypter: They are C concepts, not low level concepts. Strings don't need any null termination. Pointers are used a lot in low level programming, but only for dereferencing addresses and address arithmetic.

Arrays, pointers and null-termination are not just C concepts. The "null terminated string" may be used by C, but null termination is just a general computing concept that any student will learn at some point.
Therefore if you are a student learning about the general concept of null termination, or arrays, or pointers, then C will be a useful tool to aid you in your studies of these concepts.

Quote:
 Original post by Crypter: *list of facts*

Noone is disputing this.

Quote:
Original post by MaulingMonkey
Quote:
Quote:
Original post by Nathan Baum
Quote:
 e.g. #1 Learning how to deal with strings correctly in C will teach you a lot more about "how the computer works as a machine" compared to using a C++/C#/Java/etc string class.

No, it'll teach you about how C-style strings work. Given that they work the same on all implementations of C, even when running on machines with wildly differing architectures, that should be a clue that string handling doesn't reveal the secrets of "how the computer works as a machine".

Again, you're not arguing against my point - I didn't say that it would "reveal the secrets", I was trying to illustrate that using lower level constructs teaches you more about the machine than using higher level constructs.

No, it teaches you more about just those "low level" concepts. Are those concepts useful to programming, or do they actually teach you anything about how computers work? No. They'll show you a sliver of how they worked on a 286, but modern hardware merely emulates that model for backwards compatibility.

Hell, even x86 is a bit of a sham now. Modern CPUs merely decode it into their own internal microformats for actual execution.

Noone ever said that learning C will teach you how a computer actually works.
What I was trying to say is that if you are learning how a computer actually works, then a lower level language, like C, is a more appropriate choice for practising these concepts than a higher level language.

Computers still use arrays, computers still use pointers, computers still use null termination. If you're a computer scientist you will know these basic concepts, to know them you must learn them, to learn them you must use them. An appropriate way to use them is to learn C.

Quote:
Original post by MaulingMonkey
Quote:
Quote:
Original post by Nathan Baum
Quote:
 e.g. #2 You can write a program in C that accepts an arbitrary hexadecimal value from the user and tries to read 17 bytes from that memory address (which may well cause a seg-fault etc...). Higher languages such as C#/Java won't let you do silly low-level things like this because they are *more abstracted*.

C# will let you do silly things like that, not that it matters. The ability to crash your program by probing random memory addresses does not make a language "closer to the metal". BASIC allowed you to probe random memory addresses, yet it was, in many ways, "higher level" than C.

Ok, I'm not that experienced with C# so scratch it from my quote. My point still stands though - AFAIK, Java's version of 'pointers' ('references') can't be assigned to arbitrary memory addresses - the language's abstraction of memory is "more abstract" than C's abstraction of memory.

I thought we were talking about C#, not Java. I mean, it's just in the freakin' thread title. Yes, there are HLLs that will forcibly abstract things even higher, we generally don't have much love for Java around here.

In the context of that quote, I was talking about HLLs in general (such as Java/C#) in an attempt to illustrate that they are more abstracted than lower level languages. It's a fairly simple concept (that higher level == more abstract), but for some reason argumentative people keep trying to disprove it...

Quote:
 Original post by MaulingMonkey: Forcibly deabstracting things like C does, however, is also the wrong answer. With flexibility comes the ability to do whatever you want whenever it's appropriate. Telling a beginner to use a language that's unsafe by default in its very design, before they've mastered even the most basic concepts of programming -- even the simplest things, such as the flow of execution -- is not appropriate.

You're arguing against points that haven't been made! I didn't recommend C to beginners, I just supported ravyne's view that low-level languages are ok to use when learning low-level concepts.

Quote:
 Original post by ToohrVyk: Lots of info on pointers

Sorry ToohrVyk, when I compared C pointers to numbers and HLL pointers to numbers in bubble wrap, I didn't intend you to read that far into it. What I wanted you to focus on was the bubble-wrap part of the analogy (such as garbage collectors, etc...).

##### Share on other sites
I know that you're from Australia, but do you realize that "no one" is, in fact, two words?

##### Share on other sites
Quote:
Original post by Hodgman
Quote:
 Original post by MaulingMonkey: No, it teaches you more about just those "low level" concepts. Are those concepts useful to programming, or do they actually teach you anything about how computers work? No. They'll show you a sliver of how they worked on a 286, but modern hardware merely emulates that model for backwards compatibility. Hell, even x86 is a bit of a sham now. Modern CPUs merely decode it into their own internal microformats for actual execution.

Noone ever said that learning C will teach you how a computer actually works.

Quote:
 Original post by Hodgman, again, earlier, emphasis mine: C is *closer* to machine level than higher level languages like C# (i.e. it has raw pointers, can be freely mixed with assembly code, etc...). Everyone (including ravyne) knows C isn't machine level, but it's C L O S E R. e.g. #1: Learning how to deal with strings correctly in C will teach you a lot more about "how the computer works as a machine" compared to using a C++/C#/Java/etc string class.

Okay, so maybe you didn't mean it like that. Hell, ravyne already addressed this point and moved on, although still leaving things horribly muddled (seemingly, to me,) which I've tried to address. My basic point is this:

A) Closer to the physical machine
B) Closer to the logic
C) Closer to the virtual machine

Pick one. Okay, I hear you, you're not trying to pick both choice A and choice B. You really mean choice C, not choice A. That's still two, distinct, incompatible choices. I've already elaborated on the difference between B and C, and which one the language C is.

Quote:
 Computers still use arrays, computers still use pointers, computers still use null termination. If you're a computer scientist you will know these basic concepts, to know them you must learn them, to learn them you must use them. An appropriate way to use them is to learn C.

C# still uses arrays, pointers, and yes, null termination if you need to support that ungodly legacy hack of a string format. You can do it, just like you could in C. If you're a computer scientist you will know such basic concepts such as garbage collection. To know them you must learn them, to learn them you must use them. An appropriate way to use them is to learn C#.

Quote:
Quote:
 Original post by MaulingMonkeyI thought we were talking about C#, not Java. I mean, it's just in the freakin' thread title. Yes, there are HLLs that will forcibly abstract things even higher, we generally don't have much love for Java around here.

In the context of that quote, I was talking about HLLs in general (such as Java/C#) in an attempt to illustrate that they are more abstracted than lower level languages. It's a fairly simple concept (that higher level == more abstract), but for some reason argumentative people keep trying to disprove it...

The fairly simple concept that I'm trying to explain to you is that "totally lacking anything higher level whatsoever" isn't a necessary prerequisite to having low level access. C# is a prime counterexample that this isn't the case at all. It's a fairly simple concept (that you can have the best of both worlds at the same time, sometimes), but for some reason argumentative people keep trying to disprove it... as if low level was only the realm of C.

Quote:
Quote:
 Original post by MaulingMonkey: Forcibly deabstracting things like C does, however, is also the wrong answer. With flexibility comes the ability to do whatever you want whenever it's appropriate. Telling a beginner to use a language that's unsafe by default in its very design, before they've mastered even the most basic concepts of programming -- even the simplest things, such as the flow of execution -- is not appropriate.

You're arguing against points that haven't been made!

I'm elaborating further on the cons of your suggestion -- unless you consider espousing C's supposed general pros, in a For Beginners thread whose author is clearly language hunting, to carry no implicit suggestion whatsoever. Not that I was talking only to you -- others here have quite clearly and explicitly suggested C, and in the very first post the author made it quite clear that he is, in fact, looking for the pros and cons of the various suggestions, making my comments clearly relevant to this thread.

Quote:
 Original post by ToohrVyk: Lots of info on pointers

Sorry ToohrVyk, when I compared C pointers to numbers and HLL pointers to numbers in bubble wrap, I didn't intend you to read that far into it. What I wanted you to focus on was the bubble-wrap part of the analogy (such as garbage collectors, etc...).

1959: GC
1978: 8086 (segmented 16-bit)
1985: 386 (32-bit)
Now:
• CPU & GPU: still distinct address spaces.
• At home: virtualized memory still bubble-wrapping over cache, RAM, the pagefile, and mapped files.
• At work: unified addressing across multiple nodes of a cluster is still extremely uncommon (although not unheard of).
• GC: still operating on the same principle of unreferenced data being collectible garbage.

Active memory as a single integer-addressed whole is the bubble-wrap. Garbage collection, however, is clearly a fundamental concept to programming logic. Not one every project needs to use (at least as an automated process), but one nevertheless.

##### Share on other sites
Quote:
 Original post by Hodgman: Ok? So if you're arguing that learning C will not teach you computer science, then you can stop now, because noone has or is supporting that statement.

Most of the points made in favour of C in this thread focused on the "features" of C, such as null-terminated strings, pointers, and similar concepts. Most of the counter-points explained that none of these concepts actually provided any relevant information about how the computer works as a machine. These features may well teach you about some implementation details of some high-level languages (though most HLLs use some tricks that cannot be explained using C concepts, such as tail recursion).

Quote:
 Arrays, pointers and null-termination are not just C concepts. The "null terminated string" may be used by C, but null termination is just a general computing concept that any student will learn at some point.

I would argue that null-termination is a narrow and rarely used feature (outside of C) that deserves less focus than the more important concepts of type-safety, polymorphism, or concurrency, that C++ iterators are a much better way of learning about pointers, and that arrays are an almost universal feature of programming languages.

Quote:
 Therefore if you are a student learning about the general concept of null termination, or arrays, or pointers, then C will be a useful tool to aid you in your studies of these concepts.

Quote:
 Original post by Crypter: *list of facts*

Noone is disputing this.

Quote:
 e.g. #1 Learning how to deal with strings correctly in C will teach you a lot more about "how the computer works as a machine" compared to using a C++/C#/Java/etc string class.

Quote:
 Noone ever said that learning C will teach you how a computer actually works.

Quote:
 What I was trying to say is that if you are learning how a computer actually works, then a lower level language, like C, is a more appropriate choice for practising these concepts than a higher level language.

This, we know. However, you have so far only provided a list of C features that do not actually teach, or help in learning, anything useful about the inner workings of the computer. You have mentioned C-strings and pointers, for example, none of which have any relevance or bearing on the way the computer works.

The correct statement, on which everyone would agree, is that C teaches you how low-level programming works. But both high-level programming and low-level programming have become completely disconnected from the actual way the computer works.

Quote:
 Computers still use arrays, computers still use pointers, computers still use null termination.

Actually, programs use these. Computers have no notion of array, pointer or null termination: these are merely programming language concepts, not hardware concepts.

Quote:
 In the context of that quote, I was talking about HLLs in general (such as Java/C#) in an attempt to illustrate that they are more abstracted than lower level languages. It's a fairly simple concept (that higher level == more abstract), but for some reason argumentative people keep trying to disprove it...

Actually, everyone agrees with this. What we don't agree with is that "lower level == closer to the machine", which isn't the case. See below.

Quote:
 Sorry ToohrVyk, when I compared C pointers to numbers and HLL pointers to numbers in bubble wrap, I didn't intend you to read that far into it. What I wanted you to focus on was the bubble-wrap part of the analogy (such as garbage collectors, etc...).

So, let's focus on the bubble-wrap part. Java provides garbage collection while C doesn't. Therefore, the abstraction of the Java memory model is actually closer to the machine than the abstraction of the C memory model, by virtue of the machine using a garbage-collection-like scheme to allocate, deallocate and move objects around. C pointers are in fact implemented as Java-style references behind the scenes, but C programmers never know it. I could even go as far as saying that a compacting garbage collector like that of Java or C# teaches important things about the importance of cache coherency, a subject the C language entirely sidesteps, as previously noted in this thread.

In short:
• The C language has a very simplistic view of the world.
• Higher-level languages have a more complex view of the world.
• The computer, as a machine, is extremely complex.

Some of the complex concepts seen in higher-level languages are bound to represent and explain the hardware-level concepts.

##### Share on other sites
Quote:
 Original post by SLaYaH: I am stuck trying to decide; originally I was only deciding between C or C++, but now there's C# and I'm not sure what to do. If at all possible I'd like to know the pros and cons of each, and which you would suggest for learning to code as an ALL AROUND programmer, not just for gaming but maybe partly for gaming. Thanks in advance.

Oh god, not another one of these threads? Do we never learn, people?

Here are some tips: search the site for C# vs. C++, C# vs. Java, C# vs. Godzilla and so on. There are a billion of these threads and they all end in tears. Promit did a nice catalogue of them a while back.

My bit of advice since you seem to be new:
Pick one, try it, don't like it? Move on until you find something you're comfortable with. Believe me, there's plenty of information about each language floating around, and enough people here proficient in all three to give you advice should you need it. In all honesty, it doesn't hurt to know as many languages as you can; it'll give you a broader perspective. At the end of the day you should eventually try to learn all three, and many more besides, if you have the time and patience for it. Just don't bite off more than you can chew.

Good luck.

##### Share on other sites
Quote:
 Original post by Hodgman: Noone ever said that learning C will teach you how a computer actually works... e.g. #1: Learning how to deal with strings correctly in C will teach you a lot more about "how the computer works as a machine" compared to using a C++/C#/Java/etc string class.

By "how the computer works as a machine" I meant low level computing concepts in general (as I've already said, multiple times - stop it with the straw men already please!), using the example of all the crap you have to do manually in C compared to a language with proper string support.

In one approach you just use a few function calls; in the other you have to apply basic computing concepts manually. Therefore, the manual approach requires more knowledge about how computers work at a lower level.

Quote:
 I'm elaborating further on the cons of your suggestion

That's ok, I just wanted to point out that I never actually suggested that a beginner should choose C over something else (you implied that I did), just that it can be a useful learning tool.

Quote:
 Active memory as a single integer-addressed whole is **the** bubble-wrap. Garbage collection, however, is clearly a fundamental concept to programming logic. Not one every project needs to use (at least as an automated process), but one nevertheless.

It may be bubble-wrap, but it wasn't THE bubble wrap I was talking about.
It's already been said many times that C is an abstraction (hence a C pointer is going to have bubble-wrap compared to what's really happening), I guess I should have explicitly said that *managed* pointers have *more* bubble wrap than *raw* pointers.

Quote:
 The fairly simple concept that I'm trying to explain to you is that "totally lacking anything higher level whatsoever" isn't a necessary prerequisite to having low level access.

Ok, I've never disagreed with that.
There's nothing wrong with making your own string class in C# or your own linked list in Java - you can still learn these low level concepts in a HLL. As the above quoted example shows, I was saying that doing them in C (i.e. manually) will teach you more about these low-level concepts than using a string API.
If you're doing them manually in C and manually in C# then there's not much difference (the main one being that you don't have to do it manually in C#).

Quote:
 C# still uses arrays, pointers, and yes, null termination if you need to support that ungodly legacy hack of a string format. You can do it, just like you could in C. If you're a computer scientist you will know such basic concepts such as garbage collection. To know them you must learn them, to learn them you must use them. An appropriate way to use them is to learn C#.

Yeah, I agree.

So, seeing as C# can do all the (unsafe) things C can do, you could use it to learn all of these old C techniques. However, I (personally) still think it's appropriate to learn them using C for the same reason that you don't conduct survival training in a hotel resort - you do it in the desert.
(On that note, it's probably important for programming students to use some assembly code at some point too, to understand even lower-level concepts, like how a function call is implemented -- but don't take this as a suggestion that you should build apps in ASM over C#!!!)

Quote:
 Original post by ToohrVyk: Most of the points made in favour of C in this thread focused on the "features" of C, such as null-terminated strings, pointers, and similar concepts.

As I've already said, I pointed out these "features" as part of an example of the amount of manual work required in C compared to a language with a good standard library.
I.e. the important part is that you do it manually, not that you use a particular "feature".

Quote:
 Original post by ToohrVyk: You have so far only provided a list of C features that do not actually teach, or help learning, anything useful about the inner workings of the computer. You have mentioned C-strings and pointers, for example, none of which have any relevance or bearing to the way the computer works.

Again, I've already said that I wasn't talking about the "inner workings of the computer", but the "low level workings of the computer".

Quote:
 Original post by ToohrVyk: Java provides garbage collection while C doesn't. Therefore, the abstraction of the Java memory model is actually closer to the machine than the abstraction of the C memory model, by virtue of the machine using a garbage collection scheme to allocate, deallocate and move objects around. C pointers are in fact implemented as Java[-style] references behind-the-scenes, but C programmers never know it. I could even go as far as saying that a compacting garbage collector like that of Java or C# teaches important things about the importance of cache coherency, a subject totally eluded by the C language, as previously noted in this thread.

Now this is a counter to the argument about LLLs being closer to the machine than HLLs.
However, my point about doing things manually being a better learning activity than using an API (and C being a good language for this exercise) still stands.

[Edited by - Hodgman on August 30, 2007 2:29:24 AM]

##### Share on other sites
I just want to point out that null terminated strings are a C concept. They have nothing to do with the low level workings, or inner workings, or any of that bullshit. Most languages don't even use null terminated strings; C++, Java, C#, and Pascal/Delphi all store explicit lengths.

##### Share on other sites
I'm not nearly the whiz with the technical side of the machine that others here might be, so I can't really say, but I can say this: I learned VB6 as the first real language I programmed in, and I never quite felt confident as a programmer; I felt that too much was hidden from me. But when I learned C++ I felt like I had a better concept of how things worked. A light went off when I learned that a string was characters in an array. A light went off when I learned about pointers. A light went off when I learned about the stack and the heap (which was explained in every C++ tutorial I've read, but not once in any Java or C# tutorial; could be coincidence). Now, maybe C#, Java, VB, etc. can teach all of that too, but it's not intuitive the way it is in C++. And maybe they treat all these features completely differently from C++, but what I can say is that I feel a lot more confident after learning C++.

If I can make a poor analogy, it's like baking a cake. With VB (and I'd argue C# and Java are the same), I felt like I was grabbing a cake mix, adding water, and making a cake. It was fast, it was good, and it worked, but I wouldn't get the same feeling of understanding a cake until I mixed the flour, chocolate, etc. by hand. One could say that, truly, the cake is in the chemical compounds, and neither approach will teach me that; but having to do everything by hand made me feel like I understood it all better.

I don't know if any of that makes sense, but I definitely see good value in learning C++, maybe not right away but soon after. And of course assembly.

There's all this debate about whether it teaches logic, the machine, the virtual machine, and what not. Place whatever I said wherever you want in that debate. I'm just saying C++ helped a lot; I feel the natural structure of the language made me look a little bit deeper.

##### Share on other sites
I think you've missed the point entirely.

You're claiming C++'s "natural structure" made you "look deeper," and that you suddenly felt like you had a better concept of how things worked after you switched to C++. This is wrong. You may have a better understanding of the concepts in C++, but that doesn't scale to the general case; it doesn't scale to the level of being remotely connected to the actual metal running the actual software... But this is what the majority of misinformation about C and C++ leads one to believe: that in learning C++ they learn what's going on. The whole point of the recent discussion here has been to illustrate how, while the models chosen by C or C++ to represent some things (strings, pointers) might have a lower-level nature than what you may have experienced before, they're not the be-all-end-all of how the machine works. A string is not necessarily characters in a null-terminated array, pointers are not necessarily numbers, the stack and the heap are not universal concepts, and they are not even necessarily the same thing across programming languages.

This isn't to say that C++ isn't worth learning, and it isn't to say that the C++ model for the aforementioned things is wrong. What is wrong is to assume that the C++ model for those things is universal. Like strings, "pointers" or references, and the stack and the heap in just about every other language and platform, all of those things are abstractions.

##### Share on other sites
I think C can be useful for understanding how some features of other high-level languages work, because those features are implemented in C. But asm is the only language that can teach you something about the inner workings of the system (even if assembly is itself an abstraction).
Neither is the best-suited language for beginners. They can be useful, but at the beginning you should learn the higher-level stuff. You should learn how to arrive at a solution. Everything else is secondary. And those things are harder to learn in C than in other languages like C#.

##### Share on other sites
As a fan of C, I'm a little agitated by the number of insults flying around against the language, but I'm also a little surprised that people are trying to defend C as being better for learning about how your computer works on the inside. While I can certainly see some limited examples of that being true (most of which have already come up), that's really not what you should be learning the language for.

In my eyes, none of the three languages really has a whole lot over the rest. Anything you can do in C you can do in C++ and C# (with the exception of some of the picky cases you probably ought not be messing with anyway). The decision of which to learn comes down to which one you find easiest to work in. I use C because I'm not a real big fan of OOP; just a personal preference. I haven't worked with C# yet (actually installing VC# right now), but I think I would probably recommend it over C++. C++ is the weakest of the three languages, imo; it's just plain uglier than C or C#, in my experience with it. But again, I can't stress enough that it's almost entirely personal preference. Whatever language enables you to solve your problems fastest is the language you should use.

Toward that end, I recommend you at least dabble in all 3, then decide which one you like best.

##### Share on other sites
Quote:
 Original post by Promit: I just want to point out that null terminated strings are a C concept. They have nothing to do with the low level workings, or inner workings, or any of that bullshit. Most languages don't even use null terminated strings; C++, Java, C#, and Pascal/Delphi all store explicit lengths.

Ok, but no-one said null-terminated strings were anything more than a C concept.
(I did say that null termination is a LL concept that C will expose you to)

##### Share on other sites
You don't seem to have any comprehension of the difference between C concepts and "low level" concepts.

##### Share on other sites
Ok, seeing that I can't comprehend this, can you at least explain why the general concept of null-termination is specific to the C language (instead of just insulting me)? [edit: It is funny, though, that you're insulting my comprehension just after launching a rebuttal to an argument that wasn't actually made.]

The way I see it: null termination is a general low-level computing concept (albeit not a very useful one) that transcends use by a single language. I was taught about null termination (or 9999-termination, etc...) back in school, in a module about general computing algorithms, before I had ever learned C.

##### Share on other sites
This topic is now closed to further replies.
