# Eight, Nine, Ten...

## Recommended Posts

MilchoPenchev    1178

edit:
Oh gosh, I didn't even realize the TEN issue... That's horrible.

That edit made my day :D - that is exactly why this is so scary.

##### Share on other sites

an explanation for the more noobish of us? what's wrong with that definition? >_<;

##### Share on other sites
Rattrap    3385

an explanation for the more noobish of us? what's wrong with that definition? >_<;

0x10 is 16 in hex.

##### Share on other sites

Could be binary coded decimal... and it makes outputting a text value easier as well ;)

Manic Miner on the ZX Spectrum didn't even store the game score or high score in a variable... it just used ASCII text on the screen (the screen area where the score was displayed was never cleared).

The algorithm only added 1, 10, or 100 to your score at a time IIRC. It just increased the ASCII value at the screen position by 1, and if that took the digit past '9' it reset it to '0' and added 1 to the next digit along (and looped if that one was bumped past '9' as well).

It then just copied the score to the high score location if the score was larger than the current high score (check was done with another ASCII comparison).

##### Share on other sites
MilchoPenchev    1178

Could be binary coded decimal... [snip]

That sounds so convoluted it requires its own thread.

Unfortunately that wasn't the case here - this code, according to the colleague who originally fixed it, caused a fatal crash in the software.

##### Share on other sites

Obviously, the coder has 8 fingers on each hand

Edited by Tournicoti

##### Share on other sites
cr88192    1570

Obviously, the coder has 8 fingers on each hand

... maybe the use of named constants is what was at fault here ...

...
#define FOURTY_TWO (042)
#define NINETEEN_EIGHTY_FOUR (NINETEEN, EIGHTY, FOUR)
...

##### Share on other sites
ApochPiQ    23003

#define FOURTY_TWO (042)

Eurgh. I have many horror stories of octal gone awry.

##### Share on other sites
mhagain    13430

#define FOURTY_TWO (042)

That's even worse!!!

##### Share on other sites
Bacterius    13165

That's even worse!!!

This used to happen to me when I was trying to align constants properly. Nowadays I just put spaces instead - I've been burned too often by this damn octal notation "feature", which I'm guessing nobody actually uses except the odd raw socket hacker (if even). Permission bits are another case, but constants for those are already defined anyway.

##### Share on other sites
cr88192    1570

That's even worse!!!

This used to happen to me when I was trying to align constants properly. Nowadays I just put spaces instead - I've been burned too often by this damn octal notation "feature", which I'm guessing nobody actually uses except the odd raw socket hacker (if even). Permission bits are another case, but constants for those are already defined anyway.

at one point I made things more orthogonal (for my script language) by adding several number notations:
0b... //binary
0c... //octal
0d... //decimal

then I got into a big mental debate over whether to keep '0' by itself as an octal prefix, make it decimal, or just deprecate it and issue a warning...

note, '_' is also a spacer, so:
0d999_999_999
is the same as:
999999999

likewise:
0x5CD13A42_3B9AC9FFL
or:
999_999_999_999_999_999L
or:
0x0DE0_B6B3__A763_FFFFL

##### Share on other sites

Octal was big back in the day when C was being made, maybe even more so than hexadecimal. These days nobody really uses it, since it doesn't align nicely with 8-bit bytes (some computers back then had word sizes that were multiples of 3 bits, so octal probably made a lot more sense).

##### Share on other sites
Cornstalks    7030

#define ONE     +1
#define TWENTY  +20
#define HUNDRED +100

// Here's a little example
#include <iostream>

int main()
{
    int x = ONE;
    std::cout << x << std::endl;

    x = ONE HUNDRED;
    std::cout << x << std::endl;

    x = ONE HUNDRED TWENTY;
    std::cout << x << std::endl;

    x = ONE HUNDRED TWENTY ONE;
    std::cout << x << std::endl;
}


Or even worse:


#define ONE     +0x001
#define TWENTY  +0x020
#define HUNDRED +0x100

Edited by Cornstalks

##### Share on other sites
slicer4ever    6760

....snip...

just...just....o god my eyes....

##### Share on other sites
Trienco    2555

Still trying to figure out the reasoning behind that. Is that what happens when your code analysis tool complains about "magic numbers" and somebody just decides to "fix it"?

Because having one such tool complain about '0' being a "magic number" really made me question the worth of that tool and what the creators' code would look like.

##### Share on other sites
Bacterius    13165

#define ONE     +1
#define TWENTY  +20
#define HUNDRED +100

// Here's a little example
#include <iostream>

int main()
{
    int x = ONE;
    std::cout << x << std::endl;

    x = ONE HUNDRED;
    std::cout << x << std::endl;

    x = ONE HUNDRED TWENTY;
    std::cout << x << std::endl;

    x = ONE HUNDRED TWENTY ONE;
    std::cout << x << std::endl;
}


Or even worse:


#define ONE     +0x001
#define TWENTY  +0x020
#define HUNDRED +0x100


TWO HUNDRED == 102


But don't worry - we can fix it! We just need to be "clever"...

#define AND +0
#define ONE +1
#define TWO +2
#define TWENTY +20
#define HUNDRED *100

// Here's a little example
#include <iostream>

int main()
{
    int x = TWO HUNDRED AND TWENTY ONE;
    std::cout << x << std::endl;
    // prints 221
}

This way you even get to write grammatically correct numbers.

Edited by Bacterius

##### Share on other sites
Khatharr    8812

I think we should pool our efforts and make a programming language where this kind of thing is sane.

##### Share on other sites
Cornstalks    7030

*snip*

Ah, good catch!

I think we should pool our efforts and make a programming language where this kind of thing is sane.

I was hoping for a more concise solution, but here we go (so now you can use commas and hyphens in your numbers!):
#include <type_traits>

template <typename T>
struct Number
{
    mutable T n;
    Number(T n) : n(n) {}

    const Number<T> operator+() const
    {
        return *this;
    }

    template <typename U>
    const Number<typename std::common_type<T, U>::type> operator, (const Number<U>& num) const
    {
        return n + num.n;
    }

    // Note: '-' deliberately adds too, so TWENTY-THREE reads as 20 then 3 -> 23
    template <typename U>
    const Number<typename std::common_type<T, U>::type> operator- (const Number<U>& num) const
    {
        return n + num.n;
    }

    template <typename U>
    const Number<typename std::common_type<T, U>::type> operator+ (const Number<U>& num) const
    {
        return n + num.n;
    }

    template <typename U>
    const Number<typename std::common_type<T, U>::type> operator* (const Number<U>& num) const
    {
        return n * num.n;
    }

    operator T() const
    {
        return n;
    }
};

Number<unsigned long long> operator"" _n(unsigned long long n)
{
    return n;
}

#define ONE      +1_n
#define TWO      +2_n
#define THREE    +3_n
#define FOUR     +4_n
// ... add more if you wish
#define TWENTY   +20_n
#define HUNDRED  *100_n
#define THOUSAND *1000_n
#define AND

#define TIMES *
#define MINUS -

#include <iostream>

int main()
{
    // The only real drawback is that you have to surround them with parentheses if you use ,
    int x = (FOUR THOUSAND, ONE HUNDRED AND TWENTY-THREE);
    int y = (ONE HUNDRED AND ONE);
    int result = x TIMES y;
    std::cout << x << " * " << y << " = " << result << std::endl;

    result = x MINUS y;
    std::cout << x << " - " << y << " = " << result << std::endl;
}

Edited by Cornstalks

##### Share on other sites
blutzeit    1650

Good thing they put parentheses around those literals!

edit:

Oh gosh, I didn't even realize the TEN issue... That's horrible.

I don't understand - is the only problem that it's not complete? Here's an excerpt from the full definition:

#define EIGHT  (0x08)
#define NINE   (0x09)
#define AYE    (0x0A)
#define BEE    (0x0B)
#define CEE    (0x0C)
#define DEE    (0x0D)
#define EE     (0x0E)
#define EFF    (0x0F)
#define TEN    (0x10)
...
#define NINETEEN (0x19)
#define AYETEEN  (0x1A)
#define BEETEEN  (0x1B)
...
#define NINETYEFF (0x9F)
...
...
#define EFFTYEFF  (0xFF)


##### Share on other sites
mhagain    13430

I think we should pool our efforts and make a programming language where this kind of thing is sane.

It's called "Cobol"...

##### Share on other sites
...#define EFFTYEFF (0xFF)...

HAHA! Classic!

##### Share on other sites
Vortez    2714

Why anyone (sane) would use #define for numbers is beyond me...