
Convert a char to an integer

You don't need to convert a char to an integer - it's an implicit promotion. Do you mean something else?

What that guy was doing is typecasting, but I don't understand what the first guy was doing.
Is that STL?


quote:
Original post by Pipo DeClown
What that guy was doing is typecasting, but I don't understand what the first guy was doing.


That is typecasting as well.
quote:

is that STL?


No, it's standard C++.

Go IndirectX!


Guest Anonymous Poster
The guy was probably asking how to parse an int from a char* used as a string.

If it is just a single char that is a digit and you want an int equal to that value:

char c = '5';
int num = c - '0';

num is now equal to 5.


"I pity the fool, thug, or soul who tries to take over the world, then goes home crying to his momma."
- Mr. T

There is always the somewhat convoluted C++ way:

const char *c = "1234";
std::stringstream input;
input << c;
int number;
input >> number;
std::cout << number;

Guest Anonymous Poster
quote:
Original post by invective
There is always the somewhat convoluted C++ way:

const char *c = "1234";
std::stringstream input;
input << c;
int number;
input >> number;
std::cout << number;


Removing one step:

const char* c = "1234";
std::istringstream input(c);
int number;
input >> number;

No; well, it depends on what's in your stack, or what's in your heap if c is global. Very dangerous.

You might want to read up on how the atoi function works.

char s[] = "1234";
int i = atoi(s); // i = 1234


To the vast majority of mankind, nothing is more agreeable than to escape the need for mental exertion... To most people, nothing is more troublesome than the effort of thinking.

[edited by - jenova on June 28, 2002 4:42:34 PM]

Guest Anonymous Poster
Don't forget boost::lexical_cast<>!

quote:
Original post by KalvinB
48 is the character code for 0.

So why not use '0' rather than the ASCII code?

Because it has to be converted to 48 at some point anyway.

Since I'm converting to numbers, I use numbers. I don't see str as a series of characters; I see it as a series of numbers. Therefore it makes no sense to subtract a letter.

char is just an 8-bit number. It's not a character until you output it to the screen.

'0' is purely cosmetic and unnecessary unless you're unfamiliar with the ASCII codes, which you shouldn't be if you've been coding for any amount of time.

Ben


IcarusIndie.com [ The Rabbit Hole | The Labyrinth | DevZone | Gang Wars | The Wall | Hosting ]

quote:
Original post by KalvinB
Because it has to be converted to 48 at some point anyway.

Since I'm converting to numbers, I use numbers. I don't see str as a series of characters; I see it as a series of numbers. Therefore it makes no sense to subtract a letter.

char is just an 8-bit number. It's not a character until you output it to the screen.

'0' is purely cosmetic and unnecessary unless you're unfamiliar with the ASCII codes, which you shouldn't be if you've been coding for any amount of time.

Ben


Ah, but the code for 0 on some other platforms may be different. Your program is not portable. Every book I've read has told me to avoid raw ASCII codes (non-portable) and to prefer actual character literals (which work the same and are portable).

For example, some IBM platforms use the EBCDIC standard instead of ASCII. The character '0' is represented by 240 in EBCDIC.

quote:
Original post by KalvinB
What platforms wouldn't the ASCII value be compatible with?

Ben



By its definition, ASCII stands for "American Standard Code for Information Interchange." It's the standard code used for information interchange among data processing systems, data communications systems, and associated equipment in the United States. So say you made this great program that people in China or some other country (where the standard is Unicode or something different) want to port; now they will be on a scavenger hunt for bugs.

