
Would you let a digital copy of yourself be made?


20 replies to this topic

#1 Cornstalks   Crossbones+   -  Reputation: 6989

Posted 31 December 2012 - 03:34 AM

For those of you who haven't watched Caprica, some explanation/background: Suppose there was a way someone could be "digitally copied"---their memories, emotions, personality, behaviors, nuances, etc. Everything that wasn't physical (but even then, this digital copy would project its own avatar as an image of the original physical person) was "downloaded" to a computer and existed in a virtual world (or worlds), kind of like the Matrix. The only difference between the digital copy and the original was just that: the original is physical and the digital copy is "software" (and if you're spiritual, perhaps the physical has an associated spirit which the digital copy lacks, but spirituality isn't my focus here). Other than this difference, the digital copy is you.

 

Would you want this? You, the physical, original you? What if they could make you an artificial body that looked and felt human enough and loaded your digital copy into it? Would you want that? This opens up a bunch of possibilities and consequences, like holding onto a loved one after they pass, or blurring the line between what's real and what isn't.

 

Personally, I don't think I would like this. When I die, I want my family and friends to move on, not cling to something to (what I would call) an unhealthy degree. It's not a ticket to immortality, either, because it's the digital copy that lives on after you, not the original you.

 

It's weird to think about. What do you think?

 

 

 

Background: I just started watching Battlestar Galactica this week. I finished the first season in less than 24 hours, then I started season 2 and Caprica. I just finished Caprica and plan on continuing Battlestar Galactica. Most productive winter break ever


[ I was ninja'd 71 times before I stopped counting a long time ago ] [ f.k.a. MikeTacular ] [ My Blog ] [ SWFer: Gaplessly looped MP3s in your Flash games ]


#2 Waterlimon   Crossbones+   -  Reputation: 2562

Posted 31 December 2012 - 04:41 AM

I would do it and have my (real) self killed to gain immortality :3

o3o


#3 Bacterius   Crossbones+   -  Reputation: 8862

Posted 31 December 2012 - 05:07 AM

If the digital copy is everything you are, then it is you, at least if it can be projected into a real body. But our society is built around the concept of death; if we were suddenly immortal, we'd need to make some serious changes to the way we think about "life". I don't think we can really imagine what being immortal would feel like - it's probably inconceivable for our generation and many to come.

 

Edit: we'd also have some very serious logistical issues pretty soon without population control


Edited by Bacterius, 31 December 2012 - 06:25 AM.

The slowsort algorithm is a perfect illustration of the multiply and surrender paradigm, which is perhaps the single most important paradigm in the development of reluctant algorithms. The basic multiply and surrender strategy consists in replacing the problem at hand by two or more subproblems, each slightly simpler than the original, and continue multiplying subproblems and subsubproblems recursively in this fashion as long as possible. At some point the subproblems will all become so simple that their solution can no longer be postponed, and we will have to surrender. Experience shows that, in most cases, by the time this point is reached the total work will be substantially higher than what could have been wasted by a more direct approach.

 

- Pessimal Algorithms and Simplexity Analysis


#4 KidsLoveSatan   Members   -  Reputation: 527

Posted 31 December 2012 - 05:55 AM

For experimental reasons, perhaps. Though I disagree about it being me; that would be like saying a canister of hydrogen is the very same atoms as in the sun. The difference is localization.

 

It might be immortal, I won't be.



#5 laztrezort   Members   -  Reputation: 966

Posted 31 December 2012 - 07:54 AM

There was a short SF story, cannot remember the author offhand but the title was something like "Think Like A Dinosaur" that stuck with me.

 

In it, FTL transportation was possible, but it involved making a complete copy of yourself. To "balance" things out, however, the original you needed to be destroyed. So to you, it would be a matter of sitting in a chamber and effectively being terminated/committing suicide. Your copy would only experience the transportation.

 

I suppose what stuck with me from the story, and applies to this topic, is the idea of identity, and the fact that everyone else in the universe (including your copy) would have a different view on that identity than you would.

 

Another series of books by Tony Ballantyne (the Recursion trilogy) deals more directly with digital copies: how they immediately start to deviate after the point of being copied, and how these copies might view each other. These books also contain some scary implications of such technology - there is a certain female character whose many copies are illegally traded to be used as torture entertainment by others.



#6 Cornstalks   Crossbones+   -  Reputation: 6989

Posted 31 December 2012 - 11:54 AM

There was a short SF story, cannot remember the author offhand but the title was something like "Think Like A Dinosaur" that stuck with me.

 

In it, FTL transportation was possible, but it involved making a complete copy of yourself. To "balance" things out, however, the original you needed to be destroyed. So to you, it would be a matter of sitting in a chamber and effectively being terminated/committing suicide. Your copy would only experience the transportation.

 

I suppose what stuck with me from the story, and applies to this topic, is the idea of identity, and the fact that everyone else in the universe (including your copy) would have a different view on that identity than you would.

That's an interesting idea for a story. Reminds me of some of the discussions on reincarnation we had in my World Religions class, and what (we/others think) identity really is.

 

I suppose if I had some kind of guarantee that when I died, my digital avatar(s) would also be deleted, I might be okay with it. I would have to have a really good reason for making a digital copy, too. If this were the case, I guess I would see my digital avatar as simply being an artificial copy. However, if it could outlive me, then I/others might identify with it more for some reason, which is what I don't want.

 

I'm trying to come up with a way of saying why I wouldn't want a digital copy of myself, but I'm having a hard time putting my thoughts into words and finding the cause of my feelings.



#7 Wavarian   Members   -  Reputation: 719

Posted 31 December 2012 - 01:18 PM

The digital copy would have a few flaws:

 

1. At what point in time would the copy be made? Would it grow older, or could I be 70 years old looking at my 20 year old copy?

 

2. Given that our real-world experiences shape our personalities and behaviors, wouldn't we both travel different paths to become completely different people?

 

And then you'd have to wonder if you'd ever get jealous of your digital copy, should it end up being more "successful" than you.



#8 Cornstalks   Crossbones+   -  Reputation: 6989

Posted 31 December 2012 - 01:26 PM

1. At what point in time would the copy be made? Would it grow older, or could I be 70 years old looking at my 20 year old copy?

I suppose the copy could be made whenever, but I'm currently envisioning a "current copy" (that is, if you make a copy now, the copy is "up-to-date"; there isn't a way to make a copy that represents you 20 years ago). But that's just how I'm envisioning it; there's no reason you can't lead yourself on a thought experiment that's a bit different.

 

2. Given that our real-world experiences shape our personalities and behaviors, wouldn't we both travel different paths to become completely different people?

Yeah, pretty much. That kind of happens in Caprica. You could, of course, always "re-copy" yourself and update all your digital copies if you wanted to stay in sync. You could also have a "live-feedback" system where you and your digital copy are always connected, continuously sharing your... brains or whatever... so that the two of you are always up to date with each other (but of course you could still evolve differently).


Edited by Cornstalks, 31 December 2012 - 01:31 PM.


#9 zedz   Members   -  Reputation: 291

Posted 31 December 2012 - 06:01 PM

If the digital copy is everything you are, then it is you

OK, say you're in a room with this copy of you; it's a perfect copy of you.

And they say only one of you can leave (for obvious reasons). Would you have no qualms about putting the gun to your head and pulling the trigger, happy knowing that a copy of you would walk out of the room?



#10 ShadowValence   Members   -  Reputation: 380

Posted 31 December 2012 - 06:22 PM

It would still be a different entity. I am me because of the things I've done and been through. The very second after inception, my copy and I would begin leading separate lives. I mean, given the same circumstances, can I honestly say that I would make the exact same decisions? Perhaps today I'm favoring my left side and I take the path on the left... Tomorrow I may take the right given the exact same decision to make... It would no longer be me.

#11 ShadowValence   Members   -  Reputation: 380

Posted 31 December 2012 - 06:25 PM

@cornstalks - they wouldn't be me; but the idea of a collective intelligence intrigues me. How many copies can we make? Would there be any negative side effects to such a system? Perhaps going mad from too much stimuli?

#12 Bacterius   Crossbones+   -  Reputation: 8862

Posted 31 December 2012 - 06:25 PM

If the digital copy is everything you are, then it is you

OK, say you're in a room with this copy of you; it's a perfect copy of you.

And they say only one of you can leave (for obvious reasons). Would you have no qualms about putting the gun to your head and pulling the trigger, happy knowing that a copy of you would walk out of the room?

Well, it's unclear what happens to your consciousness when an exact copy of you is made. Is it just another, separate "you", with the same memories, experiences, skills, etc., completely independent of you? Or are you somehow "linked" to him, like entangled particles or something? Because if it is exactly you, perhaps when one of your copies dies (even the original) you just get "transferred" to another copy like nothing happened, so you wouldn't really have died - one of your physical copies just stopped functioning. I guess it depends how you define consciousness.

 

Otherwise, he'd just be a completely different human, albeit very similar to you - he'd basically start off being exactly like you, then deviate from your own path based on his actions, thoughts, etc. In this respect he should be treated as a different person (perhaps a very close twin), so obviously the "only one of you can leave the room" rule would be wrong, because you'd be putting one individual's life above another's (even if they are almost, but not quite, the same person). Then again, we've never experienced anything remotely close to this, so I don't know how people would react to it ethically. It's difficult to conceptualize.

 

By the way, you'd always have people cheating the system and duplicating themselves regardless of any cloning regulations, unless the laws of physics prevent it.


Edited by Bacterius, 31 December 2012 - 06:29 PM.



#13 GMuser   Members   -  Reputation: 211

Posted 31 December 2012 - 08:44 PM

My initial reaction was that no, I wouldn't want to inflict such a horrible thing on another person. But then I realise this is exactly what I've always wanted: multiple clones of me to play games with, at my level and with my enthusiasm. It would be better if it was like in Naruto, where the clones and the original can re-merge and everything they learnt is added to the original's knowledge/skill set.



#14 FableFox   Members   -  Reputation: 506

Posted 31 December 2012 - 10:17 PM

I would do it and have my (real) self killed to gain immortality :3

No, it's not you. You are dead. It's the other you that is immortal. There is no conduit between you and the other you for you to jump through when you die. It's like having children: you die, your children live. You don't suddenly become your children.


Fable Fox is Stronger <--- Fable Fox is Stronger Project

#15 slicer4ever   Crossbones+   -  Reputation: 3886

Posted 31 December 2012 - 11:45 PM

A copy would never be me. I want immortality (yep, I don't care about petty ideas like living longer than your friends or kids; if I was offered immortality, I would take it, and live until the end of the universe, and if possible, longer), and this is not the way to obtain that, imo. In my opinion, the only way to obtain it is a gradual reconstruction of my body/brain by nano-machines, or the slow grafting of hardware to merge my mind with it, at a gradual pace, similar to how cells are replaced in the body. Eventually my brain and thoughts would be completely machine, at which point it might be easy to create copies and "upload" myself to inhabit other entities, but in a sense, there would still be an instance of me that exists, that is uniquely me. Of course, people might say that at a certain point I stop being that unique instance, but honestly, I see no other way to obtain immortality that doesn't involve passing some invisible barrier that may or may not destroy the original me.


Check out https://www.facebook.com/LiquidGames for some great games made by me on the Playstation Mobile market.

#16 Alpha_ProgDes   Crossbones+   -  Reputation: 4688

Posted 01 January 2013 - 12:23 AM

Well, that concept was also explored in a show called "Dollhouse". And based on what I saw, no. I am not going to be immortal. My copied twin or twins, maybe, but I won't. So there's no real immortality unless you can guarantee my consciousness --not a copy-- can be transferred to an empty clone of myself for future use.


Beginner in Game Development? Read here.

Super Mario Bros clone tutorial written in XNA 4.0 [MonoGame, ANX, and MonoXNA] by Scott Haley

If you have found any of the posts helpful, please show your appreciation by clicking the up arrow on those posts

#17 ddn3   Members   -  Reputation: 1286

Posted 01 January 2013 - 01:02 PM

Sure, but both it and I would know it's just a copy. Nothing wrong with that, but it shouldn't spend the rest of its existence trying to ape a dead man; it should grow into its own unique being and use my experience as a seed. The mistake is in thinking that the copy, no matter how "perfect", is you.



#18 laztrezort   Members   -  Reputation: 966

Posted 01 January 2013 - 04:07 PM

OK, say you're in a room with this copy of you; it's a perfect copy of you.

Heh, reminds me of this: http://www.cracked.com/blog/human-clones-do-you-fk-or-fight/
(NSFW, and probably not safe for the easily offended)

On-topic: The real danger of such a technology is the potential abuses it could lead to. If copies were cheap to create and maintain, I could see it devaluing human life with all the associated problems that brings.

It would also probably force us to modify our view of free will, consciousness, human rights, etc. How would we reconcile, for example, the following questions:

If a computer simulation contains digital minds, what responsibilities does the owner of that digital space have to keep it up and running? Is turning off said simulation equivalent to mass murder? What if, instead of turning it off, it was just re-booted to reset everyone back to an earlier state - would that be a form of murder? Would you have any moral or legal ownership or rights over your own copies? Does the government have a right to torture/interrogate a criminal's copy in lieu of doing so to the original? If a copy of a person commits murder or another serious crime, should the original person or other copies be suspected of having the same criminal tendencies?

#19 laztrezort   Members   -  Reputation: 966

Posted 01 January 2013 - 04:08 PM

EDIT: that was weird...

Edited by laztrezort, 01 January 2013 - 04:10 PM.


#20 Khaiy   Crossbones+   -  Reputation: 1342

Posted 01 January 2013 - 09:22 PM

This is always a fuzzy question because the parameters are generally not that well defined and shift around mid-discussion.

I would be OK with a copy of me being made, provided that it wouldn't try to kill me, frame me, steal from my accounts, or inflict any of the other injuries that a perfect copy would be especially poised to inflict.

Aside from that, I think that the question is kind of a wash. I don't have any compelling reason to copy myself but I don't have a solid objection to one existing. A perfect copy of you wouldn't be materially different from any other non-you sentient entity. If there's a Khaiy in London, and another Khaiy in Hong Kong, so what? He wouldn't mess with me, nor I with him, unless there were an awfully good reason. And in that case, he or I would understand.

It would be the same sort of "immortality" as genetic continuation, and a copy of me would come into being knowing all of the things I would want to teach a conventional child. I think that people get hung up on the "it isn't really you" line of thought or the social implications and then have a gut reaction about those issues.

P.S. The best story I've read on this topic was Fat Farm.



