
Why define Zero? Why? Why? Why?


Recommended Posts

Why is zero defined as a named constant in certain languages, and where is it common practice to use it? Also, could you explain why?

farmersckn
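To make the question concrete, here is the kind of zero-valued definition it is asking about. This is a minimal sketch: the macro spellings below are typical of C headers of that era, not quotes from any particular compiler (the real NULL comes from <stddef.h>; MYNULL is a hypothetical stand-in used here to avoid redefining it).

    #include <stdio.h>

    /* Representative zero-valued definitions; typical forms, not quotes
       from any real header. MYNULL is a hypothetical stand-in for NULL. */
    #define FALSE  0            /* boolean false; C89 had no built-in bool */
    #define MYNULL ((void *)0)  /* how NULL is often defined for pointers */

    int main(void)
    {
        int   done = FALSE;   /* reads as intent, not as a magic number */
        char *name = MYNULL;  /* clearly "no pointer", not the number 0 */

        if (!done && name == MYNULL)
            printf("a named zero documents which kind of zero you mean\n");
        return 0;
    }

The point of defining zero this way is documentation: FALSE, NULL, and friends all evaluate to zero, but each name tells the reader which role that particular zero is playing.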

In C it's important that the right number of bits get set to 0.

A BYTE set to 0 is not exactly the same as a WORD set to 0, nor a DWORD set to 0: the types have different widths, so each one zeroes a different number of bits. That's why we have NULL, TRUE, and FALSE as named constants, which say more than a bare 0, true, or false.
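A minimal sketch of that width argument, assuming Windows-style typedefs for BYTE, WORD, and DWORD (the real ones live in <windows.h>; the sizes shown are the usual ones, not guaranteed by the C standard):

    #include <stdio.h>
    #include <stddef.h>  /* the standard home of NULL */
    #include <limits.h>  /* CHAR_BIT: bits per byte */

    /* Assumed Windows-style typedefs; exact widths are platform-dependent. */
    typedef unsigned char  BYTE;   /* usually 8 bits  */
    typedef unsigned short WORD;   /* usually 16 bits */
    typedef unsigned int   DWORD;  /* usually 32 bits */

    int main(void)
    {
        BYTE  b = 0;
        WORD  w = 0;
        DWORD d = 0;

        /* All three hold the value 0, but a different number of bits
           gets cleared in each case because the widths differ. */
        printf("BYTE  = 0 clears %u bits\n", (unsigned)(sizeof b * CHAR_BIT));
        printf("WORD  = 0 clears %u bits\n", (unsigned)(sizeof w * CHAR_BIT));
        printf("DWORD = 0 clears %u bits\n", (unsigned)(sizeof d * CHAR_BIT));

        /* NULL is the zero for pointer contexts; whether it expands to 0
           or ((void *)0), the name states the intent. */
        int *p = NULL;
        if (p == NULL)
            printf("p holds the null pointer, not the integer 0\n");
        return 0;
    }

On a typical 32-bit target this prints 8, 16, and 32 bits respectively; the values all compare equal to 0, it is only the storage width that differs.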
