C++ Console - Char - Binary / Binary - Char
I just want a command that will take a single char variable, let's say 'a', convert it to its ASCII binary form, and store it as a string or short; either is fine. Then I need to be able to convert it back.
Uh, really, the monkey code is extremely silly.
First snippet: read bit 7 -> 0
Second snippet: read bit 0 -> 7
*gasp* they give reversed results!!!!
edit: scrapped an unimportant bit having mostly to do with different views on how you read your bits.
Um, also, the lowest bit is commonly said to be the zeroth; that is, x & 1 reads the lowest bit, bit 0. So technically speaking, MM is by that convention printing the bit values backwards.
edit:
ok, I give in.
Nice article
The basic problem seems to be one of convention mixup.
x86 people like myself consider the "correct" bit order to be 76543210, since that's the convention for our platform; big-endian people by convention number their bits 01234567. It follows that, when printed, numbers will have the same textual binary representation independent of platform.
Basically, what that means is that at the assembly level (but never before that, even when doing bit work) you have to know the convention used on your platform and adhere to it. MoulingMonkey's code gives the result it does mainly because he applies the big-endian convention when the platform mandates the 7->0 ordering of bits.
Basically, if you're only writing C you can't make it bit-order sensitive, since you build bitmasks either by declaring them or with bitwise operators.
Anyone have a real world scenario where bit order issues have arisen? I can only dream up really wonky mixed interfaces to shared memory.
This topic is closed to new replies.