[C++] Put a one-byte value (255) in a file [SOLVED]

Started by Decrius, 6 comments, last by ToohrVyk 16 years ago
Hi, Whenever I do:
ofstream texture("gui.tm", ios::binary | ios::trunc | ios::out);
if (texture.is_open())
{
    texture.put((char) 255);
}

Or even without the cast, it takes up 2 characters in the file (4 hex digits, 16 bits) instead of 1 character (2 hex digits, 8 bits). Since 255 is the maximum value of a single byte, it should fit in one. How can I write a value in the range 0-255 to a file so that it takes up one byte rather than two? [Edited by - Decrius on April 3, 2008 6:47:32 AM]
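For reference, a self-contained sketch of the same call with a read-back size check added for illustration (the check is not part of the original post); on an 8-bit-char platform it should report a one-byte file:

#include <fstream>
#include <iostream>

int main()
{
    std::ofstream texture("gui.tm", std::ios::binary | std::ios::trunc | std::ios::out);
    if (texture.is_open())
    {
        // put() writes exactly one character; (char)255 stores the byte 0xFF
        // on the usual 8-bit, two's complement platforms
        texture.put(static_cast<char>(255));
    }
    texture.close();

    // read the file back and report its size as a sanity check
    std::ifstream in("gui.tm", std::ios::binary | std::ios::ate);
    std::cout << "file size: " << in.tellg() << " byte(s)" << std::endl;
}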
When you open the resulting file in a hex editor, what do you see?
Quote:
$ cat test.cpp
#include <iostream>
#include <fstream>
using std::ios;
using std::ofstream;
int main() {
  ofstream texture("gui.tm", ios::binary | ios::trunc | ios::out);
  if (texture.is_open())
    {
      texture.put(255);
    }
}
$ g++ test.cpp -o test.out
$ ./test.out
$ wc -c gui.tm
1 gui.tm


I'm afraid it only takes one character here...
Quote:Original post by ToohrVyk
Quote:
$ cat test.cpp
#include <iostream>
#include <fstream>
using std::ios;
using std::ofstream;
int main() {
  ofstream texture("gui.tm", ios::binary | ios::trunc | ios::out);
  if (texture.is_open())
    {
      texture.put(255);
    }
}
$ g++ test.cpp -o test.out
$ ./test.out
$ wc -c gui.tm
1 gui.tm


I'm afraid it only takes one character here...


Hmm true...weird.

EDIT:
Fixed the problem, thanks for helping.
char is a value that goes from -128 to 127, so 255 won't work.

Use unsigned char to place 255 as a single value.
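A minimal sketch of that suggestion, assuming ostream::write is an acceptable alternative to put(): write() takes a const char*, so the unsigned char has to be cast back at the call site.

#include <fstream>

int main()
{
    // unsigned char can hold 0..255 regardless of whether plain char is signed
    unsigned char byte = 255;

    std::ofstream out("gui.tm", std::ios::binary | std::ios::trunc);
    // ostream::write() expects const char*, hence the cast back
    out.write(reinterpret_cast<const char*>(&byte), 1);
}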
Quote:Original post by nuno_silva_pt
char is a value that goes from -128 to 127, so 255 won't work.

Use unsigned char to place 255 as a single value.


The standard leaves it up to the implementation to decide whether a char is signed or unsigned. It also makes char a distinct type from both signed char and unsigned char, regardless of which way the compiler chooses to implement character sign.

Example:
int main()
{
    char *string = "string";
    unsigned char *us = string;
    signed char *ss = string;
}


Compiler errors:
Quote:
invalid conversion from `char*' to `unsigned char*'
invalid conversion from `char*' to `signed char*'
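On a modern compiler the same distinction can be checked directly with std::is_same from <type_traits> (a later addition to the language, used here purely for illustration):

#include <iostream>
#include <type_traits>

int main()
{
    // char is a distinct type from both explicitly-signed variants,
    // so both of these print "false"
    std::cout << std::boolalpha
              << std::is_same<char, signed char>::value << '\n'
              << std::is_same<char, unsigned char>::value << '\n';
}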
Quote:Original post by rip-off
Compiler errors:
Quote:
invalid conversion from `char*' to `unsigned char*'
invalid conversion from `char*' to `signed char*'


That's because you didn't typecast the pointer.

int main()
{
    char *string = "string";
    unsigned char *us = (unsigned char*)string;
    signed char *ss = (signed char*)string;
}
Quote:Original post by nuno_silva_pt
That's because you didnt typecast the pointer.


The actual question is, why is a typecast required?

typedef signed char my_char;
my_char *data = 0;
signed char *cast = data;


If char were the same type as signed char, then no typecast would be required, as the example above demonstrates: a typedef only introduces an alias, not a new type. Since a typecast is required, char and signed char are two different types.

In fact, whether char is signed or unsigned is an implementation choice. And so (char) 255, while not portable, is not incorrect.
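A small sketch using std::numeric_limits shows which choice a given implementation made:

#include <iostream>
#include <limits>

int main()
{
    // implementation-defined: commonly true on x86, false on many ARM ABIs
    std::cout << "char is signed: " << std::boolalpha
              << std::numeric_limits<char>::is_signed << '\n';

    // typically prints -1 on a signed-char platform, 255 on an unsigned one
    std::cout << "(char)255 as int: "
              << static_cast<int>(static_cast<char>(255)) << '\n';
}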

