### Apmyp

Posted 24 September 2013 - 05:05 AM

Dear Andreas and all,

1) This code will cause an error "Implicit conversion changed sign of value":

uint64 ui64b = 4294967295;

But this won't:

uint64 ui64b = 4294967296;

2) If I declare a global application function like FnTestUint64(uint64 val) and pass the ui64b variable to this function, there are two different behaviours:

uint64 ui64b = 4294967295; FnTestUint64(ui64b); // C++ debugger will show incorrect value 0xffffffffffffffff

uint64 ui64b = 4294967296; FnTestUint64(ui64b); // C++ debugger will show correct value 0x0000000100000000

0xffffffffffffffff is -1 in int64 and 18446744073709551615 in uint64.

It seems that the compiler incorrectly converts 4294967295 to int as -1, then sign-extends this -1 to int64 (giving 0xffffffffffffffff), and finally converts that value to uint64.

Maybe this is a well-known bug? Is it possible to fix it?

Thank you.
