why!? why!? WHY!?!? bitwise hell

Started by
28 comments, last by JasonBlochowiak 17 years, 6 months ago
I understand the operator precedence for mathematical operators such as +, -, etc., but the operators we were referring to, such as & or !=, seem to be more abstract than your everyday mathematical operations. Logical operations such as these don't borrow rules from mathematics (except possibly from discrete mathematics, but those again seem to me just as arbitrary).

Thank you for clarifying that, though. If someone could shed some light on why the others (i.e., non-arithmetic operators such as &, |, etc.) are given the precedence they are, that would be cool.

Thanks
The choice (like many in software/computers/etc.) was probably based on an existing system when C was developed. Another possibility is that they didn't want to separate logical AND/OR from bitwise AND/OR, hence both have lower precedence than comparison.
Quote:Original post by Illumini
The choice (like many in software/computers/etc.) was probably based on an existing system when C was developed. Another possibility is that they didn't want to separate logical AND/OR from bitwise AND/OR, hence both have lower precedence than comparison.


Logical operators are evaluated much later than bitwise operators.

If I were to venture a guess, it would stem from assembly and pre-assembly programming, in which bitwise operators are about as "base" as you can get. Then again, that's just a guess.


For people a bit blurry on operator precedence, this might be helpful:
http://www.cppreference.com/operator_precedence.html
I guess you're open to your own definition of "much later", but the page you just linked shows them at the next lowest precedence (i.e., directly after; there are no other operators in between them).
Quote:Original post by Serapth
It just so happens that since bitwise math is about as basic as computer math gets, bitwise operators have the highest precedence (are evaluated first), unless overridden by ( ) markers.


But that's sort of the point... they AREN'T evaluated first. The != is.

Quote:Original post by Morpheus011
I understand the operator precedence for mathematical operators such as +, -, etc., but the operators we were referring to, such as & or !=, seem to be more abstract than your everyday mathematical operations. Logical operations such as these don't borrow rules from mathematics (except possibly from discrete mathematics, but those again seem to me just as arbitrary).

Thank you for clarifying that, though. If someone could shed some light on why the others (i.e., non-arithmetic operators such as &, |, etc.) are given the precedence they are, that would be cool.

Thanks


Actually, comparing whether two things are the same is even more basic than adding or subtracting them. Also, AND and OR and all that stuff is everyday mathematics, or something even more basic: logic. Bitwise is more basic than logical since it is simply a boolean operation on the hardware representation of the bits themselves (not very abstract at all), while logical operators are more high-level and akin to the everyday notion of boolean logic.

Anyway, to wrap up: &&, ||, and ! all 'borrow' rules from a 2000-year-old subject called logic.
Quote:Original post by Illumini
I guess you're open to your own definition of "much later", but the page you just linked shows them at the next lowest precedence (i.e., directly after; there are no other operators in between them).


Yes, I suppose much might be a bit much :)
Quote:Original post by Daerax

Actually, comparing whether two things are the same is even more basic than adding or subtracting them. Also, AND and OR and all that stuff is everyday mathematics, or something even more basic: logic. Bitwise is more basic than logical since it is simply a boolean operation on the hardware representation of the bits themselves (not very abstract at all), while logical operators are more high-level and akin to the everyday notion of boolean logic.

Anyway, to wrap up: &&, ||, and ! all 'borrow' rules from a 2000-year-old subject called logic.


Right, that's why I referenced them as being borrowed from discrete mathematics. Claiming that performing boolean operations on hardware representations of bits isn't an abstract concept (imagine explaining that to someone who doesn't program) seems only to validate, in my mind, that it is all somewhat arbitrary, i.e., up to the discretion of the observer (or, in this case, the developer). I understand these are rules that govern all human and mathematical logic, but IMO these things could just as easily be flipped around (perhaps in an alternate universe the logic works completely differently than in ours :D)

Although I see that that's a pointless thing to ponder, as it _is_ this way, and for good reason. Hell, we can't just go and defy the core logic of our universe, can we?

Quote:Original post by Morpheus011
Quote:Original post by Daerax

Actually, comparing whether two things are the same is even more basic than adding or subtracting them. Also, AND and OR and all that stuff is everyday mathematics, or something even more basic: logic. Bitwise is more basic than logical since it is simply a boolean operation on the hardware representation of the bits themselves (not very abstract at all), while logical operators are more high-level and akin to the everyday notion of boolean logic.

Anyway, to wrap up: &&, ||, and ! all 'borrow' rules from a 2000-year-old subject called logic.


Right, that's why I referenced them as being borrowed from discrete mathematics. Claiming that performing boolean operations on hardware representations of bits isn't an abstract concept (imagine explaining that to someone who doesn't program) seems only to validate, in my mind, that it is all somewhat arbitrary, i.e., up to the discretion of the observer (or, in this case, the developer). I understand these are rules that govern all human and mathematical logic, but IMO these things could just as easily be flipped around (perhaps in an alternate universe the logic works completely differently than in ours :D)

Although I see that that's a pointless thing to ponder, as it _is_ this way, and for good reason. Hell, we can't just go and defy the core logic of our universe, can we?


Ah, philosophy, my favourite topic. But very much off topic (I can't help but say this much: what you describe has been and is studied, but nonetheless there is a class of necessary truths that hold across all universes of discourse and could not be otherwise). Just a few corrections. Perhaps it is because it hits close to home, but I must emphasize that a distinction be made between logic and discrete mathematics; just because discrete math uses some techniques from logic does not mean that it is logic. That is like saying chemistry or economics is the same as mathematics. I am sure the mathematician would be offended :).

As for abstract, here it means having no direct physical representation; the bits of a computer certainly do have a physical existence, and there is nothing abstract about flipping their state, surely much less abstract than the logical operations. Be assured that the concept of switching from on to off, and its relation to the logical notions of AND and OR, is easily explained, and to children as well. Do not underestimate the general public.
Lesson for the day: make sure you enable level 4 warnings (/W4, under C/C++ -> General in the project properties).

This error would have been picked up immediately if you had done so, via the compiler warning: "warning C4554: '&' : check operator precedence for possible error; use parentheses to clarify precedence".

