zyrolasting

Enum bit depths...? (C++)



Search engines aren't really helping me here. I've seen enums that, according to the comments, force their values to 32 bits; the trick seems to be an enumerator defined with the value 0x7fffffff. What does this accomplish, exactly? How is it used differently from a normal enum with just the values you tossed in?
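
For reference, the pattern looks something like this (the names here are invented, but real headers such as the Direct3D SDK's do the same thing, e.g. D3DFMT_FORCE_DWORD):

enum TextureFormat
{
    TEXFMT_RGB8  = 0,
    TEXFMT_RGBA8 = 1,
    TEXFMT_FORCE_DWORD = 0x7fffffff  // the comments say this forces a 32-bit size
};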

Some compilers attempt to save space by making the enum as small as possible. If your enum contains only the values 0-128, for instance, then you only need a single byte to represent the values, rather than an entire 4-byte integer.

Adding a value which requires a full 32-bit integer to represent forces the compiler to use (at least) a 32-bit underlying type. In general, this is only an issue during binary serialisation, or when padding structures to match alignment restrictions/cache lines.
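
Here's a minimal sketch of the effect, assuming a compiler that packs enums (e.g. GCC with -fshort-enums); with default settings most compilers pick int for both and print 4 twice:

#include <iostream>

// Every value fits in one byte, so a packing compiler may choose a
// 1-byte underlying type.
enum Small  { SMALL_A = 0, SMALL_B = 127 };

// 0x7fffffff cannot be represented in fewer than 32 bits, so the
// underlying type must be at least a 32-bit integer.
enum Forced { FORCED_A = 0, FORCED_B = 127, FORCED_FORCE_DWORD = 0x7fffffff };

int main()
{
    std::cout << "sizeof(Small)  = " << sizeof(Small)  << '\n';  // may print 1
    std::cout << "sizeof(Forced) = " << sizeof(Forced) << '\n';  // prints at least 4
}

(In C++11 and later you can state the underlying type directly, e.g. enum Forced : std::int32_t { ... }; with <cstdint>, which expresses the same intent without a dummy enumerator.)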
