Would you let a digital copy of yourself be made?

19 comments, last by W0nD3rL1t3 11 years, 3 months ago
@cornstalks - they wouldn't be me, but the idea of a collective intelligence intrigues me. How many copies can we make? Would there be any negative side effects to such a system? Perhaps going mad from too much stimulation?
If the digital copy is everything you are, then it is you

OK, say you're in a room with this copy of you, and it's a perfect copy of you.

And they say only one of you can leave (for obvious reasons). Would you have no qualms about putting the gun to your head and pulling the trigger, happy in the knowledge that a copy of you would walk out of the room?

Well, it's unclear what happens to your consciousness when an exact copy of you is made... is it just another, separate "you" with the same memories, experiences, skills, etc., completely independent of you, or are you somehow "linked" to him, like entangled particles or something? Because if it is exactly you, perhaps when one of your copies dies (even the original) you just get "transferred" to another copy like nothing happened, so you wouldn't really have died - one of your physical copies just stopped functioning. I guess it depends on how you define consciousness.

Otherwise, he'd just be a completely different human who happens to be very similar to you - he'd basically start off being exactly like you, then deviate from your own path based on his actions, thoughts, etc. In this respect he should be treated as a different person (perhaps a very close twin), so the "only one of you can leave the room" rule would obviously be wrong, because you'd be putting one individual's life above another's (even if they are almost, but not quite, the same person). Then again, we've never experienced anything remotely close to this, so I don't know how people would react to it ethically. It's difficult to conceptualize.

By the way, you'd always have people cheating the system and duplicating themselves regardless of any cloning regulations, unless the laws of physics prevent it.

“If I understand the standard right it is legal and safe to do this but the resulting value could be anything.”

My initial reaction was that no, I wouldn't want to inflict such a horrible thing on another person. But then I realise this is exactly what I've always wanted: multiple clones of me to play games with, at my level and with my enthusiasm. It would be even better if it were like in Naruto, where the clones and the original can re-merge and everything they learned is added to the original's knowledge/skill set.

I would do it and have my (real) self killed to gain immortality :3

No, it's not you. You are dead. It's the other you that is immortal. There is no conduit between you and the other you for you to jump through when you die. It's like having children: you die, your children live on, but you don't suddenly become your children.

A copy would never be me. I want immortality (yep, I don't care about petty ideas like living longer than your friends or kids; if I were offered immortality, I would take it and live until the end of the universe, and if possible, longer), and this is not the way to obtain it, imo. In my opinion, the only way to obtain it, at least at first, is a gradual reconstruction of my body/brain by nano-machines, or a slow grafting of hardware to merge my mind with it, done at a gradual pace similar to how cells are replaced in the body. Eventually my brain and thoughts would be completely machine, at which point it might be easy to create copies and "upload" myself to inhabit other entities, but in a sense there would still be an instance of me that exists, that is uniquely me. Of course people might say that at a certain point I stop being that unique instance, but honestly, I see no other way to obtain immortality that doesn't involve passing some invisible barrier that may or may not destroy the original me.

Check out https://www.facebook.com/LiquidGames for some great games made by me on the Playstation Mobile market.

Well, that concept was also explored in a show called "Dollhouse", and based on what I saw: no, I am not going to be immortal. My copied twin or twins, maybe, but I won't be. So there's no real immortality unless you can guarantee that my consciousness - not a copy - can be transferred to an empty clone of myself for future use.

Beginner in Game Development?  Read here. And read here.

 

Sure, but both it and I would know it's just a copy. Nothing wrong with that, but it shouldn't spend the rest of its existence trying to ape a dead man; it should grow into its own unique being and use my experience as a seed. The mistake is in thinking that the copy, no matter how "perfect", is you.

OK, say you're in a room with this copy of you, and it's a perfect copy of you.

Heh, reminds me of this: http://www.cracked.com/blog/human-clones-do-you-fk-or-fight/
(NSFW, and probably not safe for the easily offended)

On-topic: The real danger of such a technology is the potential abuses it could lead to. If copies were cheap to create and maintain, I could see it devaluing human life with all the associated problems that brings.

It would also probably force us to modify our views of free will, consciousness, human rights, etc. How would we address, for example, the following questions:

If a computer simulation contains digital minds, what responsibilities does the owner of that digital space have to keep it up and running? Is turning off said simulation equivalent to mass murder? What if, instead of turning it off, it was just rebooted to reset everyone back to an earlier state - would that be a form of murder?
Would you have any moral or legal ownership or rights over your own copies?
Does the government have a right to torture/interrogate a criminal's copy in lieu of doing so to the original?
If a copy of a person commits murder or another serious crime, should the original person or other copies be suspected of having the same criminal tendencies?
EDIT: that was weird...
This is always a fuzzy question because the parameters are generally not that well defined and shift around mid-discussion.

I would be OK with a copy of me being made, provided that it wouldn't try to kill me, frame me, steal from my accounts, or inflict any of the other injuries that a perfect copy would be especially poised to inflict.

Aside from that, I think that the question is kind of a wash. I don't have any compelling reason to copy myself but I don't have a solid objection to one existing. A perfect copy of you wouldn't be materially different from any other non-you sentient entity. If there's a Khaiy in London, and another Khaiy in Hong Kong, so what? He wouldn't mess with me, nor I with him, unless there were an awfully good reason. And in that case, he or I would understand.

It would be the same sort of "immortality" as genetic continuation, and a copy of me would come into being knowing all of the things I would want to teach a conventional child. I think that people get hung up on the "it isn't really you" line of thought or the social implications and then have a gut reaction about those issues.

P.S. The best story I've read on this topic was "Fat Farm".

-------R.I.P.-------


~Too Late - Too Soon~
