Archived

This topic is now archived and is closed to further replies.

OpenGL_Guru

how do you assign a char variable the DEGREES symbol?

Recommended Posts

I want to print out the degree symbol in C++. In the extended ASCII table the degree symbol is 0x178, but this doesn't work when I just assign it and try to print it out. Does anyone know how I can achieve this? Thanks!

Nevermind, I figured it out. I'll give you the command, but I don't see this as a match in the ASCII table??

cout << (char) 176;

If you do this, it will print out the degree symbol. I am guessing (char) converts this to Unicode instead of ASCII? Anyone know? Thanks. When you are in Word or Office or something it's easy to convert, but in C++ it may depend on the compiler? I'm wondering how C++ looks up the different character tables and where they are stored.

[edited by - opengl_guru on January 22, 2004 1:48:02 PM]

quote:
Original post by OpenGL_Guru
Nevermind, I found it. I'll give you the command, but I don't see this as a match in the ASCII table??

If you do a cout << (char) 176, it will print out the degree symbol. I am guessing (char) converts this to Unicode instead of ASCII? Anyone know? Thanks.


No, it's a question of code page. Depending on which one is active, you will get different characters in the "extended ASCII" range. And, as it happens, Windows uses different code pages in its console and GUI windows... (at least, as far as I can tell).


“Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.”
— Brian W. Kernighan (co-author of The C Programming Language)

Except I don't have that symbol on my keyboard, lol. I guess you could copy and paste from Word, but I haven't tried to see if that works. Besides, I will be printing this out using GLUT, or maybe something different in OpenGL, so I have to think about that issue too. Say you have an environment and you want to display the temperature outside; at one particular frame you might show 67.78°F or something.

No, I tried unsigned char and it still didn't work. The degree symbol, if you look at the ASCII chart, is well beyond the extended ASCII lookup, so I dunno. I decided to render a little 'o' and translate it up a bit; that seemed to work fine. But if anyone has tips or ideas on getting the actual degree symbol to work, please respond. I'm also looking at the other symbols; some are not useful at all, but there are quite a few that I can see being useful a great deal.

In hexadecimal, 0xFF is 255, and that's the end of extended ASCII. So 0x178 (which is 376 in decimal) is not on the ASCII character map at all.

According to asciitable.com, both char(167) and char(248) produce what look like degree symbols. I think 167 is the actual degree sign and 248 is a lower-down version of the same character; I don't know what it's called, though.

EDIT: 167 = ° and 248 = º so you can see the difference.

~CGameProgrammer( );

Screenshots of your games or desktop captures -- Post screenshots of your projects. There's already 134 screenshot posts.

[edited by - CGameProgrammer on February 16, 2004 6:49:58 AM]
