A Kind Of Computer Capable Of Having Consciousness

Started by
57 comments, last by SuperVGA 11 years, 5 months ago
I'm not sure computer consciousness in the same sense as human consciousness is even desirable. Right now, we need computers because they do things we suck at, like adding large lists of numbers very quickly and without error. And computers need us because we do things they suck at, like working on arbitrary problems and finding creative solutions. We coexist as entities, symbiotically. Computers, just like a domestic cow or stalk of corn, need us to live and we need them. Is a computer any less alive than a chicken just because it's made of metal?

[Formerly "capn_midnight". See some of my projects. Find me on twitter tumblr G+ Github.]

Which people do you mean are suggesting this? It's a not uncommon view that consciousness arises out of the complexity of a large number of smaller, simpler parts.
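(As a toy illustration of what "complex behaviour out of many smaller, simpler parts" can look like, here's a minimal Conway's Game of Life in Python. It's purely illustrative -- nobody is claiming a grid of cells is conscious -- but it does show structured global behaviour emerging from trivial local rules.)

[code]
# Minimal Conway's Game of Life: every cell follows the same two trivial
# local rules, yet structured global behaviour (gliders, oscillators) emerges.
# Toy illustration only.
from collections import Counter

def step(live):
    """Advance one generation. `live` is a set of (x, y) live cells."""
    # Count live neighbours for every cell adjacent to a live cell.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Rule 1: a live cell with 2 or 3 live neighbours survives.
    # Rule 2: a dead cell with exactly 3 live neighbours comes alive.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "glider": five cells that collectively travel across the grid,
# even though no individual cell has any notion of travelling.
cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(8):
    cells = step(cells)
print(sorted(cells))  # the same five-cell shape, shifted diagonally
[/code]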


While that belief isn't uncommon, it lacks grounding. That is, there is no reason to believe that consciousness arises inexorably from a complex system.

Aside from the lack of grounding, there are a number of other problems with that belief, the *least* of which is the lack of an objective ontology!

Consciousness is a really hard problem -- and it's incredibly difficult to even talk about. (Just an example: Francis Crick, co-discoverer of the structure of DNA, in a book about consciousness, explicitly refused to even define it!)

The study of consciousness spans a broad range of disciplines, as diverse as neuroscience and philosophy (which, believe it or not, interact to a significant degree). There's a lot of work being done by some incredibly bright people, yet we're no closer to anything like an answer today than we were 30 years ago.

One thing we can be reasonably certain about, however, is that we can't create anything resembling consciousness by purely algorithmic means.

[quote]One thing we can be reasonably certain about, however, is that we can't create anything resembling consciousness by purely algorithmic means.[/quote]


Why?
An interesting paper on this topic: http://www.fhi.ox.ac...dmap-report.pdf

[quote]One thing we can be reasonably certain about, however, is that we can't create anything resembling consciousness by purely algorithmic means.[/quote]

Interesting, would you still hold that position, even when someone someday builds a seemingly conscious machine?

/felix

openwar - the real-time tactical war-game platform


[quote name='recompile' timestamp='1349824173' post='4988527']
One thing we can be reasonably certain about, however, is that we can't create anything resembling consciousness by purely algorithmic means.
Why?[/quote]
I can't speak for recompile, but I took that statement to mean "we won't come up with an algorithm that produces consciousness directly; we will have to create a complex simulation in which consciousness is an emergent property".

[quote]I can't speak for recompile, but I took that statement to mean "we won't come up with an algorithm that produces consciousness directly; we will have to create a complex simulation in which consciousness is an emergent property".[/quote]


On the other hand, recompile was referring to Searle, who states, if I understand him correctly, that not even this is possible: that there is only one consciousness and it can't be emulated.

openwar - the real-time tactical war-game platform

My consciousness, Felix?

[quote]On the other hand, recompile was referring to Searle, who states, if I understand him correctly, that not even this is possible.[/quote]
Oh, you're right.
[quote]That is, there is no reason to believe that consciousness [can arise] [s]inexorably[/s] from a complex system.[/quote]
Well, if you believe in evolution, then the reason to believe this is that it's exactly what happened. The question is what kind of complex systems can give rise to behaviour that can be described as consciousness.
[quote]Well, if you believe in evolution, then the reason to believe this is that it's exactly what happened. The question is what kind of complex systems can give rise to behaviour that can be described as consciousness.[/quote]


On the contrary, the fact of evolution puts the final nail in the coffin of epiphenomenalism (the idea that consciousness is caused by brain processes but has no causal effect on them).

I'll try to keep this as simple as I can: We'll define consciousness simply as subjective experience.

First, note that you (i.e. your brain) can report on the content of phenomenal experience. (Give it a try.)

Assuming that epiphenomenalism is true, consciousness is causally inert. (This is not in dispute.)

As consciousness is causally inert, it should be impossible for the brain to report on the content of phenomenal experience. (This one is tricky at first. Take some time with it.)

That was too easy. Is there a way out? Well, we should expect specific brain states to give rise to specific experiences. In order for the brain to report on the content of phenomenal experience, it needs only additional structures that can examine the state of the relevant neurons involved.
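(To make the "additional structures" idea concrete, here's a hypothetical little sketch -- the names Neuron and Monitor are invented for illustration. The point to notice is that the monitor's report is a function of physical state and nothing else, which is exactly where the trouble for a causally inert experience begins.)

[code]
# Hypothetical sketch of "additional structures that can examine the state of
# the relevant neurons". Names are made up for illustration.

class Neuron:
    def __init__(self, activation=0.0):
        self.activation = activation  # the only physically available fact here

class Monitor:
    """Extra structure that inspects the relevant neurons and emits a report."""
    def __init__(self, neurons):
        self.neurons = neurons

    def report(self):
        # The report depends on the neurons' physical state and nothing else.
        # Anything that leaves these activations untouched -- a causally inert
        # "experience", say -- could never show up in this output.
        return [n.activation for n in self.neurons]

relevant = [Neuron(0.2), Neuron(0.9)]
print(Monitor(relevant).report())  # [0.2, 0.9]
[/code]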

Surely, such structures can evolve. After all, evolution needs just two things to work: 1) heritable variation and 2) a selection mechanism.

Well, we have (1), no question. However, we can't get (2). (Can you see why? It's easier than you think.)
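(A toy way to see the problem with (2): in the sketch below, one heritable trait affects survival while the other is, by construction, causally inert. Selection drives the first; the second only drifts, because nothing in the world can "see" it. Hypothetical Python illustration only, not a model of brains or of real evolution.)

[code]
# Toy simulation: selection can only act on traits that make a causal
# difference. `speed` affects survival; `inert` affects nothing at all.
import random

def mutate(value):
    """Heritable variation: copy a trait with a small random change."""
    return min(1.0, max(0.0, value + random.gauss(0, 0.02)))

population = [{"speed": random.random(), "inert": random.random()}
              for _ in range(200)]

for generation in range(100):
    # Selection: survival probability depends on `speed` only.
    survivors = [ind for ind in population if random.random() < ind["speed"]]
    survivors = survivors or population  # avoid extinction in this toy
    # Reproduction: offspring inherit both traits, with variation.
    population = [{"speed": mutate(p["speed"]), "inert": mutate(p["inert"])}
                  for p in random.choices(survivors, k=200)]

def mean(key):
    return sum(ind[key] for ind in population) / len(population)

print("mean speed:", round(mean("speed"), 2))  # driven up towards 1.0
print("mean inert:", round(mean("inert"), 2))  # just drifts near where it started
[/code]

Selection can only get a grip on something that makes a causal difference to survival or reproduction, and by the assumption above the experience itself makes none.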

Like I said before, it's a hard problem ... and we have yet to even crack open an undergrad textbook here!

There is a large interdisciplinary research effort (which has spawned several new disciplines), yet for all the volume of work produced, we're no closer to an answer. If you're a layperson, and you think you have the answer, you're virtually guaranteed to be wrong.
How about:
1) I evolved out of a big burning ball of hydrogen. That happened.
2) I can be measured as being conscious, but the ball of hydrogen cannot.
The same stuff was reconfigured into a different complex system, which allowed consciousness to be observed.

If consciousness didn't arise out of this 'stuff', then you're attributing it to something else that doesn't exist, like a spirit or soul -- i.e. magic, not science. This is likely the part where I've misunderstood you.

You don't need to drag theories of the mind into this; just ignore them and treat the mind as a black box.
A thing that can be described as conscious exists. That thing is a complex system that evolved from previous complex systems. Prior forms of the thing along its evolutionary path cannot be described as conscious. At some point along that path, forms of the thing developed conscious behaviours. Either that is due to the new configuration, or it's due to magic.

[quote]If you're a layperson, and you think you have the answer, you're virtually guaranteed to be wrong.[/quote]
N.B. we're not trying to say anything about what consciousness is or how it arises, but simply disputing your assertion that it can't arise from a complex system -- or maybe we've misinterpreted this statement, but that's how I'd paraphrase it at the moment.
I am a machine, and there's no reason to believe that I'm conscious through anything but my own machinery. If I build a machine just like me (through reproduction) I can expect that it will likely be able to exhibit consciousness too.

This topic is closed to new replies.
