Could use a hand with void**

Started by
28 comments, last by MaulingMonkey 17 years, 6 months ago
Quote:Original post by Zahlman
Quote:Original post by Anonymous Poster
Quote:Original post by Zahlman
0) For something like this, there is very little justification I could accept for "I have to use C" - except possibly "I'm doing this for school and my prof is stuck in the 80s". Although a prof stuck in the 80s wouldn't speak of ZeroMemory() but instead memset().

1) In particular, if you can't figure out these kinds of things for yourself, you have no business trying to do them in C. It's not as if there's any significant amount of fat *available* to trim out of a standard C++ (using the standard library) implementation, and there are much better ways to learn how low-level algorithmic ideas.


</holierthanthou>

"If you can't figure out these things for yourself, you have no business blah blah...."

If you can't figure out these things for yourself you have no business learning them? Hello? He's obviously taking some kind of course here...


It is my considered opinion that very few people in the world have any business doing anything in plain C these days, at all. In a great majority of cases, the extra work done and mental burden taken on (versus using C++ and taking advantage of the full standard library) are ridiculous for no performance benefit whatsoever; in a great majority of the remaining cases, the performance benefit is barely measurable and the extra work is still ridiculous; in almost every remaining case after that, the performance benefit is not enough to matter while the burden would still be eased significantly by things like built-in-to-the-language RAII. C++ can be treated, to a large extent, like a preprocessor for C which is orders of magnitude more powerful than the built-in one, and that's still not harnessing the full power of the language. And if you're one of those oddballs using C99 explicitly for the few minor changes that actually let you do very specific optimizations that C++ won't trust you to do... then you'd probably have more luck writing it in FORTRAN.

Seriously, the only real excuses for using C that I can consider are "a C++ compiler does not exist for the target platform" or (which usually goes hand-in-hand) "the target platform requires the final executable to be smaller than what is realizable if I include <iostream>".

Lots of profs - colleges - programming curricula in general - have some *very* strange ideas about how to teach "a deep understanding of the machine". (At least, I hope that's what they're after. I shudder to think that they expect to teach people *how to program* like this! Absolutely disgusting.) It's better done via instruction in what they call "computer organization", which might (but needn't) get as high-level as an introduction to some form of assembly language (the one chosen doesn't matter).

Quote:
The code is all mine EXCEPT for the function protos. It HAS TO BE, NO QUESTION ASKED void**


See? Pure idiocy. Nothing is gained by throwing away the idea of *data types*. In fact, this idea is *fundamental* to programming. And there's no reason the function can't accept a StackNode** instead, because that's the only kind of pointer it will *ever* receive. Nothing else is valid. Notice that the dereferenced parameter is cast to StackNode* *unconditionally*.

Please, PM me contact information for your prof (and/or the department of computer science, if appropriate). I am volunteering - with discretion assured, and at no cost and assuming full legal responsibility - to tell him/her/them off.


For someone who only recently learned what actually happens if you use the & to take an address of an array you sure sound high and mighty up there.
Been watching me, have you? :P

Forget it. MM is arguing the point better than I can, at least for the time being.
So uhmm...I have to ask:

What low level thing does this actually show? It's nice & fun to tout that you're learning low-level concepts, but all I see is a large number of poorly designed type casts and some crummy memory management.

I mean, I'm for sure not a high & mighty expert on the topic, but it seems that if you want to demonstrate some of the concepts here, there are a lot of better ways to do so than horrid code.

But like I said, I could just be "missing the big picture" but somehow I doubt it. The whole thing about being coddled by a GUI Debugger made me LOL.

Quote:If you understand the low level layer of something then you can very easily master the higher level.


By your logic, Assembly is a better 1st language than VB/Pascal/Python.
Quote:Original post by _Sigma

The code is all mine EXCEPT for the function protos. It HAS TO BE, NO QUESTION ASKED void**, etc.



How is this?

#define void Node
int Pop( void** node );
int Push( void* node );


Beautiful :-)
deathkrush
PS3/Xbox360 Graphics Programmer, Mass Media. Completed Projects: Stuntman Ignition (PS3), Saints Row 2 (PS3), Darksiders (PS3, 360)
Quote:Original post by ShadowWolf
So uhmm...I have to ask:

What low level thing does this actually show? It's nice & fun to tout that you're learning low-level concepts, but all I see is a large number of poorly designed type casts and some crummy memory management.

I mean, I'm for sure not a high & mighty expert on the topic, but it seems that if you want to demonstrate some of the concepts here, there are a lot of better ways to do so than horrid code.


Horrid code is a great lesson. There will be plenty of it in the real world. Better learn how to deal with it as soon as possible. Those who have not seen horrid code do not know good code.

Some people here throw a hissy fit as soon as they see something that isn't theoretically perfect C++ with the standard library and Boost. Don't become one of them - learn bad code, because it is a fact of life that will not end any time soon.

Quote:Original post by Vermeulen
Hey, I am in your class also.
I completely understand why he is using Unix/C, and I don't think it's about using the right tool for the job; he is just trying to teach the very low level. If you're not interested in the very low-level workings of a computer, go into engineering and more application-based study.

Alright, I reverse engineer for fun, so I think I have an interest in, and an understanding of, low-level code. That's not a problem.

Regarding the OP code. I think you'll have an easier time with it if you remove the allocation of memory for the data pointer from the stack code. Just use the stack as a mechanism for managing a chain of nodes each of which points to some block of memory unrelated to the stack. In addition, a stack chain doesn't need both a next and a prev pointer. One or the other will do and the semantics are easier to grasp with a next pointer.

The following might help focus on just what a stack is about (LIFO) by contrasting it with a queue (FIFO):

A stack using an array

datatype stack[MAX];
int index = 0;

Push:
ASSERT(index < MAX);
stack[index++] = temp;

Pop:
ASSERT(index > 0);
temp = stack[--index];

A queue using an array

datatype queue[MAX];
int head = 0; int tail = 0;

Put:
ASSERT(tail < MAX);
queue[tail++] = temp;

Get:
ASSERT(head < tail);
temp = queue[head++];

"I thought what I'd do was, I'd pretend I was one of those deaf-mutes." - the Laughing Man
As mentioned -- you can expose more than one interface.

If you have to expose a bad interface (Pop(void**)), what you should do is first write a good C interface, then have a short stub function with the required signature call into it.

There is bad code out there. Sometimes you have to deal with it.

You don't deal with it by writing bad code. Learn how to write good code and deal with bad code.
Quote:Original post by NotAYakk
There is bad code out there. Sometimes you have to deal with it.

You don't deal with it by writing bad code. Learn how to write good code and deal with bad code.


QFE.

If you get a proper grounding in the *theory*, and first learn to write good code, you are sure to find that dealing with bad code is actually pretty trivial. Either you are calling it (and invoking stuff is trivial; if you need to make an adapter for that interface, that's a skill you'll already have picked up in order to write good code) or you are fixing it (and refactoring is another skill you need anyway if you're learning to write good code).

OTOH, if you immerse yourself in an environment of bad code at the start, learning to write good code is going to be practically impossible. (Heck, it's hard enough as it is). I really do feel that getting off to this kind of start poisons the mind. There's so much stuff I wish I'd learned about C++ back when I was actually taking classes in it. (In my teachers' defense, the '98 standard was recent and not well adopted at that point - but still...)
Quote:Original post by Zahlman
OTOH, if you immerse yourself in an environment of bad code at the start, learning to write good code is going to be practically impossible.


I'm going to disagree with this. Or rather, I agree with you that it's going to be practically impossible, unless you have an extremely open mind.

I grew up on C and printf and carried those traits over into C++. I learned from bad - and often downright wrong - books. Combined with listening to some of the advantages listed by C++ standard library proponents - combined later with "Oh gee, this exists in the standard library? Nifty! Time to stop reinventing the wheel!" - I learned to set aside bad, older habits, and to stop treating C++ as C with classes.

This was probably aided by a few attempts to teach my peers programming. Let's just say a lot of time was wasted just helping them with minor junk like debugging their misshapen printf statements (typically, mismatched format specifiers and types). This helped underscore some of the disadvantages of my old methods.

Later, I learned of all sorts of wonderful things. assert(), unit testing, const correctness, patterns, antipatterns, separating interface from implementation... the list goes on.

A lot of times, I see arguments against moving to newer technology - arguments of "It's too slow", etc., seem to dominate. I shudder to think of all the crappy code I'd be writing if I'd been closed-minded like that.

