_moagstar_

boost::lexical_cast


Recommended Posts

I am trying to use boost::lexical_cast to provide string representations of various objects. This all seems to work nicely and really cuts down on the temporary std::stringstream objects I used to have. However, when I try to use Unicode strings it doesn't seem to work. As a quick example:

    boost::lexical_cast<std::string>(std::wstring(L"Unicode String"));
    boost::lexical_cast<std::wstring>(std::string("ANSI String"));

Here the conversion from ANSI to Unicode works fine, but for the conversion from Unicode to ANSI I get the following compile error:

    boost\lexical_cast.hpp(174) : error C2679: binary '=' : no operator found which takes a right-hand operand of type 'std::basic_string<_Elem,_Traits,_Ax>' (or there is no acceptable conversion)

Is there something that I'm missing?
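For context, the temporary-stringstream pattern being replaced looks roughly like this (a stdlib-only sketch, not boost's actual implementation; the helper name is made up):

```cpp
#include <sstream>
#include <string>

// Rough stdlib-only sketch of the temporary-stringstream pattern that
// boost::lexical_cast tidies away (helper name is hypothetical):
template <typename T>
std::string to_string_via_stream(const T& value)
{
    std::ostringstream oss;
    oss << value;        // stream the value into a temporary buffer
    return oss.str();    // and pull the text back out
}
```

boost::lexical_cast<std::string>(42) collapses this to a single call, and throws boost::bad_lexical_cast when the streamed conversion fails.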

Firstly, this isn't really 'Unicode to ASCII' or vice versa. ASCII is a subset of the Unicode character set, so there is no actual conversion taking place as such. What you have are narrow 8-bit strings and wide 16-bit strings - two different ways to store the same thing, and both valid ways of storing Unicode text (e.g. as UTF-8 or UTF-16).

Converting an 8-bit string to a 16-bit string is easy, since you just have to pad the information - it's like adding leading zeros to a number. But converting a 16-bit string to an 8-bit string is hard, because the system needs to know what to do with the extra information. Keeping with the numerical analogy, it's like trying to represent the number 1234 with only 3 digits. It can't just throw the extra bits away: although that works for plain ASCII, which fits into 8 bits (with 1 bit to spare, actually), it wouldn't work for the many other characters that need the full 16 bits.

So, as far as I am aware, boost::lexical_cast makes no attempt to second-guess what you really want when you attempt such a conversion.
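To see the loss concretely, here is a sketch of what naively dropping the high bits of each wide character would do (the function is hypothetical, not anything boost provides):

```cpp
#include <string>

// Hypothetical naive narrowing: keep only the low 8 bits of each wide
// character. Fine for plain ASCII, silently corrupts everything else.
std::string narrow_naive(const std::wstring& wide)
{
    std::string out;
    out.reserve(wide.size());
    for (wchar_t c : wide)
        out += static_cast<char>(c); // bits above the low byte are lost
    return out;
}
```

narrow_naive(L"ASCII") round-trips unchanged, but a character like the euro sign U+20AC comes back as the unrelated byte 0xAC - which is exactly why lexical_cast refuses to guess.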

OK, cheers for the info. Providing specialisations of lexical_cast that handle the std::wstring -> std::string and std::string -> std::wstring conversions seems to have done the trick.
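For anyone finding this later, the bodies of such specialisations could delegate to small helpers along these lines (an ASCII-only sketch with hypothetical names, not the poster's actual code):

```cpp
#include <string>

// ASCII-only sketch (hypothetical helpers): roughly what the bodies of
// std::string <-> std::wstring lexical_cast specialisations could call.
std::wstring widen(const std::string& s)
{
    // Each ASCII char maps directly to the same wide code unit.
    return std::wstring(s.begin(), s.end());
}

std::string narrow(const std::wstring& w)
{
    std::string out;
    out.reserve(w.size());
    for (wchar_t c : w)
        out += (c <= 0x7F) ? static_cast<char>(c) : '?'; // no silent loss
    return out;
}
```

A production version would convert through the current locale (e.g. std::wcstombs) or an explicit UTF-8 encoder rather than replacing non-ASCII characters with '?'.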
