In the following code I modified the isLittleEndian() branch to eliminate the bitmask (`& 0xff`). I understand that the purpose of the bitmask is to zero out any bits in the resulting unsigned int beyond the original byte. But the strange thing is that if I leave the bitmask out, the output contains three extra bytes of 0xff in front of some of the byte values printed to the stringstream.

This leads to my question: when static_cast<>()ing to a larger type, does C++ (or GCC) fill the extra bits with 1s? I don't know why that would be the case, but it seems to be.
#include <iostream>
#include <sstream>
#include <iomanip>
#include <string>

bool isLittleEndian() ;  // defined elsewhere

template <class T>
std::string type_to_hex( const T arg )
{
    std::ostringstream hexstr ;
    const char* addr = reinterpret_cast<const char*>(&arg) ;

    hexstr << "0x" ;
    hexstr << std::setw(2) << std::setfill('0') << std::hex ;

    if( isLittleEndian() )
    {
        // Walk the bytes from most to least significant.
        for( int b = sizeof(arg) - 1; b >= 0; b-- )
        {
            // Note: no "& 0xff" mask here.
            hexstr << static_cast<unsigned>(*(addr+b)) ;
            std::cout << hexstr.str() << std::endl;  // debug: dump the stream so far
        }
    }
    else
    {
        for( std::size_t b = 0; b < sizeof(arg); b++ )
        {
            hexstr << static_cast<unsigned>(*(addr+b) & 0xff) ;
        }
    }
    return hexstr.str() ;
}

int main()
{
    std::cout << type_to_hex((float)16/9) << std::endl;
}
int main()
{
std::cout << type_to_hex((float)16/9) << std::endl;
}
Output:
0x3f
0x3fffffffe3
0x3fffffffe3ffffff8e
0x3fffffffe3ffffff8e39
0x3fffffffe3ffffff8e39