C# and typedef

It is a pity that this discourse has (like so many others) deteriorated into a language war. I program mainly in C++ professionally and C# at home (though C# is starting to be used at work lately), and I like both languages for different reasons. I too have wanted a typedef in C# before now. I accept that there is no absolute need for it, but I would have found it a nice-to-have (maybe simply because it is what I am used to).

What I don't quite understand is why people in this thread are slagging off typedef in C++. There have been a number of claims that it is flawed, broken, etc. What is the problem with it? I suppose it can be used badly, but if that is its only flaw then all languages are inherently flawed by the fact that there are incompetent programmers about. My main use of typedef is for templates, where it is very useful.

Actually, I would be interested in seeing the syntax of C# generics. Does anyone know of a site where I can see the planned syntax?
quote:Original post by MindGames
It is a pity that this discourse has (like so many others) deteriorated into a language war.
Oh really? See most of the latest posts; they're still all about the usefulness of creating type aliases the way typedef does. The wrapper stuff, and whether typedef is more useful in C++ than C# because of other language features, is relevant to the discussion. There are just a few isolated off-topic posts, and you say this is a language war..
quote:Original post by BillPo
davepermen: At first I was offended that you were telling me to behave, because as far as I could tell I was behaving. But then I looked at my last post and noticed it looks like I was attacking desertcube. I did not mean that. It was a joke to the people who understand that typedef is a language flaw, and a small prod at those who think C++ is without flaw.


now you won't get banned. but taken together, your earlier posts made it look like you're just an **s. sometimes people don't realise that. count me in with those people.

anyways. yes, c++ typedef has a big flaw: it is essentially the same as a #define. it does not create a new, unique type, but really is just a text replacement for the original one.

in D, this is not true. in D, typedef float Float; results in two types: float and Float. both have the same features, but they are NOT the same. you can cast between them manually, but you have to do it explicitly.

this makes code like

typedef float Degree;
typedef float Radian;
typedef float Grad; // i think that exists, too?

and

typedef uint Hour;
typedef uint Minute;
typedef uint Second;

rather useful for rapid development.

void glRotatef(Degree angle, float x, float y, float z);

and

void glRotatef(Radian angle, float x, float y, float z);

as well as

float sin(Degree angle);
and
float sin(Radian angle);

would become possible.

this would allow, in a very simple way, the ability to help coding. a number can be anything. a Radian can not.

you could of course write wrapper classes (and in c++, with a good optimising compiler they compile away and are free), but a simple typedef would be much simpler, if all you need is a number which only exists in a specific domain.
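for illustration, a minimal sketch of such a wrapper in c++ (the class names and the sine overloads are made up, not from any real library):

#include <cmath>

// a thin wrapper: a distinct type with the same in-memory representation as float
class Radian {
public:
    explicit Radian(float v) : value_(v) {}
    float value() const { return value_; }
private:
    float value_;
};

class Degree {
public:
    explicit Degree(float v) : value_(v) {}
    float value() const { return value_; }
private:
    float value_;
};

// overloading on the wrappers works, which a plain c++ typedef cannot offer
inline float sine(Radian a) { return std::sin(a.value()); }
inline float sine(Degree a) { return std::sin(a.value() * 3.14159265f / 180.0f); }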



oh, and std::size_t and std::ptrdiff_t (or however it is spelled) are useful types which always have the right size on every platform.

what type is guaranteed to be the same size as a pointer? it depends on the platform. void* cannot do math, so you would need a char*, for example.

having it return some sort of std::ptrdiff_t instead helps hide how it actually gets implemented; it abstracts that knowledge away.
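a minimal sketch of that use (the function name is made up):

#include <cstddef>

// std::ptrdiff_t is defined to hold the result of subtracting two pointers,
// whatever the pointer size happens to be on the current platform
std::ptrdiff_t bytesBetween(const char* first, const char* last) {
    return last - first;
}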

but yes, c++ IS flawed in not creating new types with typedef.

as i said, it creates "references". there is no way to create new types. that's bad.



If that's not the help you're after then you're going to have to explain the problem better than what you have. - joanusdmentia

My Page davepermen.net | My Music on Bandcamp and on Soundcloud

quote:Original post by MindGames
What I don't quite understand is why people in this thread are slagging off typedef in C++. There have been a number of claims that it is flawed, broken, etc. What is the problem with it?

It doesn't define a type, it just aliases an existing type. That can be a problem, depending on the scenario. At the very least, it makes the name typedef somewhat misleading.
quote:
I suppose it can be used badly, but if that is its only flaw then all languages are inherently flawed by the fact that there are incompetent programmers about.

This line of argument gets spouted all the time, and seems fallacious to me. There are obviously going to be some language features that are more problematic than others. Typedef does have its uses, but is still a misleading language feature.
quote:
My main use of typedef is for templates, where it is very useful.

This use is actually a crude simulation of metatypes, or perhaps type inference. When you are simulating some feature that another language supports, it is often worth considering that other language for the task you are carrying out. Obviously, the more features you are simulating that are present in another language, the stronger the case becomes. Using Python to illustrate what you are simulating...
class C:
    def __init__(self, T1):
        self._x = T1()

    def increment_x(self):
        self._x += 1

    def get_x(self):
        return self._x

    x = property(get_x)

>>> c1 = C(int)
>>> print c1.x
0
>>> c1.increment_x()
>>> print c1.x
1
>>> c2 = C(str)
>>> print c2.x

>>> c2.increment_x()
Traceback (most recent call last):
  File "<interactive input>", line 1, in ?
  File "<interactive input>", line 5, in increment_x
TypeError: cannot concatenate 'str' and 'int' objects

In Python, we can pass types around like objects, and so we can generalise a class in that manner. However, you wouldn't actually do things that way in Python, which is already type-generic by virtue of its dynamic typing. The point here is that different languages do things differently; the uses of typedef in C++ might not transport well to another language which offers different features for dealing with the same needs.
I find typedef somewhat valid in D, as davepermen has pointed out that it creates actual types. Well, aliases that participate in type checking.

I'm somewhat averse to using typedef to create types representing units. Minutes, Hours, Days, Radians, Gradians, Millimeters, etc. are units, but they are not really types. They are [usually] from the real number system and thus typed as some kind of floating-point type.

The problem I have with this is that in your code you're going to end up with a lot of variable names that include their "type" in their name, most notably for time measurements. For example:

Minute minutes;
Second secondsSinceMidnight;
Day numDaysRunning;

The Day type is quite confusing in itself, because the programmer cannot determine from the type whether it is an enumeration of the days of the week or a count of days in some duration, nor whether that count is an integer or a real number.

Types for units of physical measurement don't have this problem, because the variable names would be things like "length" or "theta". I still do not see any real advantage of Radians theta over float theta, or, if you like, float thetaInRadians. You might suggest that having the unit as the type documents which unit the variable uses, but what if the variable is declared in a class in a header file? Then either the programmer has to look up this declaration [where a comment about the unit of measurement could be kept just as well] OR the programmer has memorized the unit.

So there is no real advantage to using a typedef in this way. And for richer types like Dollars, it would be more useful to have a class that provides some methods for working on that type. A type like Dollars does not need to be lightning fast and can stand a little class wrapping.
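For illustration, a minimal sketch of the kind of Dollars class I mean (the name and members are hypothetical):

// a small money class: stores whole cents to avoid floating-point rounding
class Dollars {
public:
    explicit Dollars(long cents = 0) : cents_(cents) {}
    long cents() const { return cents_; }
    Dollars operator+(const Dollars& other) const { return Dollars(cents_ + other.cents_); }
    bool operator<(const Dollars& other) const { return cents_ < other.cents_; }
private:
    long cents_;
};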


One of the original uses of typedef was to do away with the nasty struct keyword in variable declarations. Remember way back in the old days [possibly before I was born] when C required you to state the struct keyword when declaring a structure variable? There is now a lot of old legacy C code, and much new C++ code written by legacy C programmers, including John Carmack and many of my coworkers, that looks like this:

typedef struct _DBGPARAM {
    CHAR   lpszName[32];
    CHAR   rglpszZones[16][32];
    ULONG  ulZoneMask;
} DBGPARAM, *LPDBGPARAM;

I cannot understand why people still do this, except possibly that no one does anymore, but all their old libraries are written this way and no one will ever rewrite them. Regardless, how is this any better than

struct DBGPARAM { ... };

and then using it as

DBGPARAM params;
DBGPARAM *ptrToParams;

I think the only real use for typedef these days is to alias types, whether they are simple types getting a new name, a math type that needs to compile as either float or double, or really large and ugly template types.
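For example, a minimal sketch of the template case (the alias name is made up):

#include <map>
#include <string>
#include <vector>

// one alias instead of repeating the full template type everywhere
typedef std::map<std::string, std::vector<std::pair<int, float> > > ScoreTable;

ScoreTable scores;
ScoreTable::iterator it = scores.begin();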

I can see that these uses of typedef have their validity, but they are not really too significant. Although, the more I think about it now, the more I feel that it would actually be nice to have typedef in C# just for those rare occasions when it really is needed, most notably in a math or physics library that wants to be compiled in either single or double floating-point precision.
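Something like this is what I have in mind (MATH_USE_DOUBLE is a hypothetical build flag, not a standard macro):

// compile the whole library in either precision by flipping one flag
#ifdef MATH_USE_DOUBLE
typedef double real;
#else
typedef float real;
#endif

real dot3(const real* a, const real* b) {
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}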

-out

EDIT: The source tags produce boxes that are way too big. I wish they would size themselves to fit the contents within.

[edited by - Billpo on January 16, 2004 2:59:13 PM]
Davepermen: I had missed the point of your example overloading the trig functions on both Degrees and Radians. Very valid point. I like it. Typedef in D is more what typedef should be.

Of course, I, being a mathematician, know that we should always use radians and nothing else.

-out

[edited by - Billpo on January 16, 2004 3:04:16 PM]
of course



If that's not the help you're after then you're going to have to explain the problem better than what you have. - joanusdmentia

My Page davepermen.net | My Music on Bandcamp and on Soundcloud

quote:Original post by SabreMan
It doesn't define a type, it just aliases an existing type. That can be a problem, depending on the scenario. At the very least, it makes the name typedef somewhat misleading.


True, I hadn't really considered that. Thanks for the reply. In so many of these sorts of topics, people make statements without giving any sort of reasons.

quote:
quote:
I suppose it can be used badly, but if that is its only flaw then all languages are inherently flawed by the fact that there are incompetent programmers about.

This line of argument gets spouted all the time, and seems fallacious to me. There are obviously going to be some language features that are more problematic than others. Typedef does have its uses, but is still a misleading language feature.


Yes, I realised when I made that statement that it can be used as a defence for just about anything. Unfortunately, diehard fans of one language or another can and will use it to justify even the most obvious language flaw. There is, however, an element of truth in it. In this particular case, I would say that while typedef could perhaps be better implemented, it is quite useful, and I can't say that I have seen it abused all that often. Should it (or some variant) be put in C#? I don't particularly care. Once I got over my initial 'what happened to my typedef?', I have managed quite OK without it. I have programmed in many languages in the last 20 years or so, and all languages have differing features and paradigms that take a bit of getting used to.
OK, I want to start with one thing: D typedefs are superior to C typedefs in nearly every way.

With that out of the way, though, I don't understand why people talk about a feature's desirability or usefulness from only one point of view, rather than all of its connected aspects. A real typedef is more costly than a fake typedef in terms of compiler complexity and other subtleties; the D solution seems to be the best balance of cost to benefit I have yet seen (and I felt that way when I read about it over a year ago, long before I ever started using C# and hating the missing typedef).

Yes, being able to overload based on a typedef'd type could be useful, and in fact I have often wanted that ability in C++. But I want you to understand that, to have the perfect C++, BOTH features would need to be included in the language, so that I, as the developer, can choose either to easily create first-class types, forcing manual casts and such, or to use the int type with nothing but a more descriptive (to me) name.

As for the people who suggested that you would have many more situations like this:

Minute minute;
Hour hour;

you are completely missing the point. In a case where you have a simple temp value you might use such naming, and that is only in contrast to situations where you would have done this:

int minute;
int hour;

but what about these situations:

// 1 of these
float shipLength;
float shipLengthInMeters;
DistanceInMeters shipLength;

// 1 of these
float averageHeight;
float averageHeightInInches;
DistanceInInches averageHeight;

// 1 of these
float currentTemp;
float currentTempInCelsius;
DegreeCelsius currentTemp;

Now I would add that these are most useful if a compiler/IDE with IntelliSense abilities shows the alias, not the root type, by default; in such cases, READING the code makes the most sense with the least number of typed characters.

Once again I admit it is not perfect: not perfectly safe, not perfectly correct, not perfectly anything, except a perfect compromise, in my opinion, between compiler complexity cost, run-time cost, correctness of modelling, and ease/speed of use. It wins on run-time cost (zero); it comes in second on compile-time complexity, requiring only basic symbol table entries and a root-type evaluation; it is not a correct model, but is close for the amount of work required; and it is pretty damn fast and easy to use.

And BillPo, typedef was NOT created for struct naming; if that had been the case, they would have just added automatic name promotion of structs like C++ has now. It was added to the language before it applied to structs (the struct behaviour was a free side effect of the original implementation: a symbol table alias), for exactly the thing I use it for: a way to create new DOMAIN-LEVEL type systems, in a time when compilers were not enormous yet were hard to write even so, and run-time costs of any sort were unacceptable (to the people who made languages like C, at least).

And yes, typedef is like a #define, a simple alias, except that there is no chance of malicious macro side effects, and it has scope. So it is as much better than a #define as using consts is, because of its first-class language status.
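For a quick sketch of that difference (the names are made up):

#define CHARPTR_MACRO char*
typedef char* CharPtr;

CHARPTR_MACRO a1, b1;  // text expansion gives "char* a1, b1;" -- b1 is a plain char!
CharPtr a2, b2;        // a2 and b2 are both char*, as intended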

In C# I would argue that D-like typedefs would be ideal for that language, particularly because of the distinction between value and reference types (the typedef would necessarily be of the same category as its source type). And here another weakness of C# is identified: refactoring a value type to a reference type, and vice versa, is a non-trivial issue once the type is in use in code, and code which looks the same in C# behaves very differently depending on the value/reference category of its operations/parameters.

Is C lying to you when you do this:

typedef int DatabaseID;
DatabaseID id;

Maybe. And yet the only real-world case where this lie ever bites anyone is in the operator/function overloading arena (the lack of first-class status also prevents useful casting, but that is a missing ability, not a misleading and incorrect lie). And the compiler tells you when the lie has struck, and that is the point at which you refactor the alias into a real type, if you need to.
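A minimal sketch of the one place the lie does bite (the function names are made up):

typedef int DatabaseID;

void load(int index)     { /* ... */ }
void load(DatabaseID id) { /* ... */ }  // error: redefinition of load(int),
                                        // because DatabaseID and int are the same type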

And more importantly, typedef has been the means by which most cross-platform code has been written (well, typedef and #define, actually). Types like:

int8, uint8, int16, uint16, etc. These can be immensely useful for cross-platform development. Tell me how to write a proper cross-platform encryption or checksum algorithm without them, without many extra lines of code. You must have access to a set of types meeting requirements above and beyond the actual C specification in order to accomplish certain tasks correctly or efficiently, such as knowing whether the signed integral type is one's or two's complement (if either), and being able to know exactly what the shift operators will do and what overflow behaviour will apply in certain cases. Even a good rand algorithm cannot exist without these useful typedefs (because it counts on size and overflow).
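A minimal sketch of such per-platform aliases (these particular choices assume a typical 32-bit compiler; modern C gets them from <stdint.h>):

// chosen per platform so each alias has a guaranteed width
typedef signed char    int8;
typedef unsigned char  uint8;
typedef short          int16;
typedef unsigned short uint16;
typedef unsigned int   uint32;

// code that depends on exact widths and wraparound can now be written once
uint32 checksum(const uint8* data, uint32 length) {
    uint32 sum = 0;
    for (uint32 i = 0; i < length; ++i)
        sum = (sum << 1) + data[i];  // unsigned arithmetic wraps predictably
    return sum;
}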

Sure, a system like C# or Java, which defines the "platform", can work just fine, but at a definite cost to those platforms which do not match the exact assumptions of the language. It is guaranteed that code written in C# or Java today, and thought of for a 32-bit architecture, will nearly always run poorly on a 16-bit platform, and will usually not benefit from a 64-bit one either (in fact, according to the spec, it could run slower, due to the language requiring a certain size that does not necessarily match that of the platform).

And as for team management and following rules: what size teams/companies do you think I work for, and doing what?

SabreMan, you are correct; my statement "better in every way" should read "equal to or better in every way".

But also, your statement "That's the short-term view. Those benefits may not scale as a project grows larger over time." is true in its second half, by virtue of using the word "may", and yet is valueless in that I have yet to be on a project for which it did not scale well. I will admit the teams I have been on usually vary from 3-9 programmers, and that is not a large number, but I see no reason to assume that the junior programmers who are told to work on my system cannot be trusted to look up the type in our help file or wiki and see what the SPECIFIED rules of the type are. How is this any different from real types which have functions with certain illegal cases, such as: an exception will be thrown if the ReadXXX function is called before EstablishConnection has been invoked? Most programmers never experience this exception; they call the functions in the order they were told to.

And for nearly the best of all worlds: if the compiler were just a tiny bit smarter, it would be possible to write a class that would be used in the debug version and a typedef that would be used in the release version, so that only code using the correct subset of operators would compile, only code never setting out-of-range values would avoid throwing debugging exceptions, and yet the release version would emit exactly the code necessary to accomplish the processing task at hand: no more, no less.
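Something close to that is already possible by hand with the preprocessor; a rough sketch (the Degree type and its range check are made up for illustration):

#ifdef NDEBUG
typedef float Degree;              // release: zero overhead, plain float
#else
#include <cassert>
class Degree {                     // debug: range-checked wrapper
public:
    Degree(float v) : v_(v) { assert(v >= 0.0f && v < 360.0f); }
    operator float() const { return v_; }
private:
    float v_;
};
#endif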
quote:Original post by Xai
SabreMan, you are correct; my statement "better in every way" should read "equal to or better in every way".

And worse in some ways.
quote:
"That''s the short-term view. Those benefits may not scale as a project grows larger over time." Is true in it''s second half, by virtue of using the word "may" and yet is valueless in that

That doesn't render it valueless. It just means I'm not prepared to commit that it holds for every single case.
quote:
I have yet to be on a project for which it did not scale well.

I've been on many projects where a typedef would have been better implemented as a fully realised type. You're concerning yourself with rapid development, seeming to believe typedef gives you something for nothing. The more use you make of typedef, the less type-safety you have, and as a project grows larger, the lack of type-safety starts to bite hard.
quote:
I will admit the teams I have been on usually vary from 3-9 programmers, and that is not a large number, but I see no reason to assume that the junior programmers who are told to work on my system cannot be trusted to look up the type in our help file or wiki and see what the SPECIFIED rules of the type are.

I'm not making an assumption; I'm relating actual real-world experience.
quote:
How is this any different from real types which have functions with certain illegal cases, such as: an exception will be thrown if the ReadXXX function is called before EstablishConnection has been invoked? Most programmers never experience this exception; they call the functions in the order they were told to.

You can't see the difference between your own two hypothetical scenarios? In your own words: "an exception will be thrown if the ReadXXX function is called before EstablishConnection has been invoked". You're conducting argument by false analogy.
quote:
And for nearly the best of all worlds: if the compiler were just a tiny bit smarter, it would be possible to write a class that would be used in the debug version and a typedef that would be used in the release version, so that only code using the correct subset of operators would compile, only code never setting out-of-range values would avoid throwing debugging exceptions, and yet the release version would emit exactly the code necessary to accomplish the processing task at hand: no more, no less.

That would require the ability to perform full static analysis of all code paths within the program, to prove that replacing a full type with a typedef would not change the meaning of the program (I imagine it wouldn't take long with any input program to prove the equivalence false). For that to be true, you wouldn't be able to use any of the facilities of full types that are unavailable to typedefs. You know, little things like type discrimination.

