dawidjoubert

byte, char exactly 8 bits?


Recommended Posts

How can I make sure the variable I use for a byte is exactly 8 bits? At the moment I am using a char, but at

char aa = 254;

I get this warning:

warning C4309: 'default argument' : truncation of constant value
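For context, a minimal sketch reproducing the warning, assuming a compiler like the poster's MSVC where plain char is signed:

// plain char is signed here, so its range is -128..127 and 254 doesn't fit
char aa = 254;           // warning C4309: truncation of constant value
unsigned char bb = 254;  // fine: unsigned char holds 0..255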

Guest Anonymous Poster
The char is signed; make it unsigned.

Dave Hunt
The trouble here is that aa is a char, but the constant 254 is an int, so the compiler is warning that the int is being truncated to a char. Since 254 still fits in eight bits, the bit pattern is preserved and there's no real problem, but the warnings are annoying. You can just cast the 254 to a char to get rid of the warning:

char aa = (char)254;

To be safe when going back and forth between chars and ints, you should use unsigned types to prevent sign extension.
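A short sketch of the sign-extension issue that last sentence warns about (assuming plain char is signed, as it is on the poster's compiler):

#include <stdio.h>

int main(void)
{
    char s = (char)254;      /* bit pattern 0xFE; as a signed char this is -2 */
    unsigned char u = 254;   /* same bit pattern, but reads back as 254       */

    int si = s;              /* sign-extended: si == -2  */
    int ui = u;              /* zero-extended: ui == 254 */

    printf("%d %d\n", si, ui);   /* prints: -2 254 */
    return 0;
}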

Just to reiterate, the real problem with your code is as the Anonymous Poster said: a char will only hold values from -128 to +127, while an unsigned char will hold values from 0 to 255. Thus, in this instance, you should be using an unsigned char.

You should also explicitly type-cast as Dave Hunt suggested.

Pipo DeClown
I think the 'truncation' in the warning means the value can't be stored because it's too big. That suggests the compiler treats the constant as an int, so I'd say Dave is right.

And I read somewhere that most compilers have 'unsigned' on chars by default.
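For what it's worth, the signedness of plain char is implementation-defined, so it varies by compiler and platform. A quick sketch to check what a given compiler does:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* CHAR_MIN is 0 when plain char is unsigned, negative when it's signed */
    printf("char is %s and %d bits wide\n",
           (CHAR_MIN < 0) ? "signed" : "unsigned",
           CHAR_BIT);
    return 0;
}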

Guest Anonymous Poster
typedef char byte;

Fixed a typo.

Guest Anonymous Poster
Sorry, that should be

typedef unsigned char byte;

of course...
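With that typedef the original line compiles without the warning; a quick sketch:

typedef unsigned char byte;

byte aa = 254;   // no warning: 254 fits in the 0..255 range of unsigned char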

Quote:
Original post by Pipo DeClown
And I read somewhere that most compilers have 'unsigned' on chars by default.


I wouldn't count on it though [wink]

And yeah, use a typedef - they were made for this kind of thing.
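To nail down the original "exactly 8 bits" requirement, the typedef can be paired with a compile-time check on CHAR_BIT. Here's a sketch using the classic negative-array-size trick (on a C99-capable toolchain you could instead include <stdint.h> and use uint8_t directly):

#include <limits.h>

typedef unsigned char byte;

/* fails to compile (array of negative size) if char is not exactly 8 bits */
typedef int byte_must_be_8_bits[(CHAR_BIT == 8) ? 1 : -1];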
