
[C++] Put a byte value (0-255) to a file [SOLVED]



Hi, Whenever I do:
ofstream texture("gui.tm", ios::binary | ios::trunc | ios::out);
if (texture.is_open())
{
    texture.put((char) 255);
}

Or even without the cast, it takes up two characters in the file (four hex digits, i.e. 16 bits) instead of one character (two hex digits, 8 bits). Since 255 is the maximum value that fits in one byte, this should be possible. How can I write a value in the 0-255 range to a file so that it takes up exactly one byte? [Edited by - Decrius on April 3, 2008 6:47:32 AM]

Quote:
$ cat test.cpp
#include <iostream>
#include <fstream>

using std::ios;
using std::ofstream;

int main() {

ofstream texture("gui.tm", ios::binary | ios::trunc | ios::out );
if (texture.is_open())
{
texture.put(255);
}
}

$ g++ test.cpp -o test.out
$ ./test.out
$ wc -c gui.tm
1 gui.tm


I'm afraid it only takes one character here...

Quote:
Original post by ToohrVyk
I'm afraid it only takes one character here...


Hmm true...weird.

EDIT:
Fixed the problem, thanks for helping.

Quote:
Original post by nuno_silva_pt
char is a value that goes from -128 to 127, so 255 won't work.

Use unsigned char to place 255 as a single value.


The standard leaves it up to the implementation to decide whether a plain char is signed or unsigned. Furthermore, char is a distinct type from both signed char and unsigned char, regardless of which signedness the compiler chooses for it.

Example:

int main()
{
    char *string = "string";
    unsigned char *us = string;
    signed char *ss = string;
}



Compiler errors:
Quote:

invalid conversion from `char*' to `unsigned char*'
invalid conversion from `char*' to `signed char*'

Quote:
Original post by rip-off
Compiler errors:
Quote:

invalid conversion from `char*' to `unsigned char*'
invalid conversion from `char*' to `signed char*'


That's because you didn't typecast the pointer.

int main()
{
    char *string = "string";
    unsigned char *us = (unsigned char*)string;
    signed char *ss = (signed char*)string;
}


Quote:
Original post by nuno_silva_pt
That's because you didn't typecast the pointer.


The actual question is, why is a typecast required?

typedef signed char my_char;

my_char *data = 0;
signed char *cast = data;


If char were the same thing as signed char, then no typecast would be required, as the example above demonstrates (a typedef introduces an alias, not a new type). Since a cast is required, char and signed char are two different types.

As a matter of fact, whether char is signed or unsigned is an implementation choice. And so (char) 255, while unportable, is not incorrect.
