
Archived

This topic is now archived and is closed to further replies.

randomDecay

assembly problem


Recommended Posts

Hello. I have a problem I am working on: I am trying to convert a 16-bit number to a null-terminated character string, storing the ASCII value of each character at successive memory locations. I understand what this means, but I can't figure out how to do it. Please help.

Guest Anonymous Poster
First, what compiler/assembler are you using?
Second, please include your sample implementation.
Then I'll help you.

(always anonymous)

Guest Anonymous Poster
You're not making it clear: are you using C/C++ or assembly language?

(always confused)

Guest Anonymous Poster
If you are using C/C++, just use the itoa function; otherwise, refer to an ASCII table for the mapping between integer values and character codes.

Guest Anonymous Poster
First, why are you doing this in asm when using VC++?
Second, the previous AP meant: what is the code that you have so far, and what specific problems does it have? This may sound a bit harsh, but I think reinventing the wheel here is silly (use itoa() instead). Sounds like a homework problem to me. The algorithm you are looking for is the following.


1. Allocate memory for the string (optional if you already have storage somewhere). You can use powers of two to help decide how many chars will be in the string.

num (the number you are converting)
ptr (pointer to the buffer)

loop while (num != 0) and (not past length of buffer)
    a = num % 10
    b = a + (ASCII offset for '0', which you can look up)
    mov [ptr], b
    num = num / 10
    inc ptr
end loop
mov [ptr], 0

Two things to watch: the division by 10 (not subtraction) is what eventually drives num to zero, and the digits come out least-significant first, so you must reverse the string afterwards (or fill the buffer from the end). Also, if num starts at 0 this loop writes no digits, so handle that case separately if "0" must produce output.


That should get you very well on your way (don't expect me to give you the code).

- groof

Guest Anonymous Poster
Let's say you have the 8-bit integer 5; in hex it is 0x05.

But in ASCII (refer to the table) the character '5' is 0x35.

So you must add some value (0x30, the code for '0') to your integer so it equals 0x35; then it is '5' as a character.

(always anonymous & confused)

