Dear Andreas and all,
1) This line causes the error "Implicit conversion changed sign of value":
uint64 ui64b = 4294967295;
But this won't:
uint64 ui64b = 4294967296;
2) If I declare a global application function such as FnTestUint64(uint64 val) and pass the ui64b variable to it, there are two different behaviours:
uint64 ui64b = 4294967295; FnTestUint64(ui64b); // C++ debugger will show incorrect value 0xffffffffffffffff
uint64 ui64b = 4294967296; FnTestUint64(ui64b); // C++ debugger will show correct value 0x0000000100000000
0xffffffffffffffff is -1 as int64 and 18446744073709551615 as uint64.
It seems the compiler incorrectly converts 4294967295 to int as -1, then sign-extends that -1 to int64 (represented as 0xffffffffffffffff), and finally converts it to uint64.
Is this a known bug? Is it possible to fix it?
Edited by Apmyp, 24 September 2013 - 05:05 AM.