Odd casting issue?

4 comments, last by MARS_999 20 years, 1 month ago
Maybe I am high! I guess I've never seen a book talk about it, but I assumed that one could type cast to unsigned?

float x = 110.1f;
unsigned char c = '0';

c = unsigned char(x);
or

float x = 110.1f;
unsigned char c = '0';

c = unsigned int(x);
I get errors on this. Why? Shouldn't I be able to type cast to any primitive type?
Are you sure that's the exact same code? I just checked that with .NET (which is more ANSI anal than VC6) and I have no compiler warnings at all, let alone errors.
You need to type:
c = (unsigned int)(x);
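
For example, this compiles cleanly -- a minimal sketch built around your snippet, where the main wrapper and the printf are additions of mine just to make it runnable:

#include <cstdio>

int main()
{
    float x = 110.1f;
    unsigned char c = '0';

    c = (unsigned int)(x);        /* C-style cast: the () go around the type */
    printf("%u\n", (unsigned)c);  /* prints 110 -- the .1 is truncated */
    return 0;
}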
quote:Original post by Anonymous Poster
You need to type:
c = (unsigned int)(x);

Coolbeans, AP. That works. Why is that? Why do I have to put () around the type I want to typecast to? Is that the ANSI-compliant way to do unsigned casts? Thanks
It probably isn't recognizing the full unsigned char type. A function-style cast like type(x) only takes a single-word type name, so the parser grabs the unsigned, expects a ( right after it, and then the char that follows blows its mind. That's also why the parenthesized form works: with () around the whole type name, multi-word types are fine. And yes, that's standard behavior, not a compiler quirk.

At least it sounds cool.
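
Concretely, the workarounds all come down to getting the type name into a single token, or putting it somewhere the parser can handle a multi-word name -- a quick sketch, where the typedef name uchar is made up purely for illustration:

#include <cstdio>

typedef unsigned char uchar; /* single-token alias; the name is just for illustration */

int main()
{
    float x = 110.1f;
    unsigned char c;

    c = (unsigned char)x;              /* C-style cast: () around the type allows multi-word names */
    c = static_cast<unsigned char>(x); /* C++ cast: the type sits inside <>, so any type name works */
    c = uchar(x);                      /* function-style cast is legal once the type is one token */

    printf("%u\n", (unsigned)c);       /* prints 110 */
    return 0;
}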
"oh a char cast. nice... unsigned?! ahr! wtf! noooo!" gmrblmgllbrr!!! stupid compiler. Yeh I had this problem too once
Emil Johansen - SMMOG AI designer - http://smmog.com

This topic is closed to new replies.
