
A Kind Of Computer Capable Of Having Consciousness


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

58 replies to this topic

#21 Sean T. McBeth   Crossbones+   -  Reputation: 1511

Posted 09 October 2012 - 03:00 PM

I'm not sure computer consciousness in the same sense as human consciousness is even desirable. Right now, we need computers because they do things we suck at, like adding large lists of numbers very quickly and without error. And computers need us because we do things they suck at, like working on arbitrary problems and finding creative solutions. We coexist as entities, symbiotically. Computers, just like a domestic cow or stalk of corn, need us to live and we need them. Is a computer any less alive than a chicken just because it's made of metal?

[Formerly "capn_midnight". See some of my projects. Find me on twitter tumblr G+ Github.]



#22 recompile   Members   -  Reputation: 151

Posted 09 October 2012 - 05:09 PM

Which people do you mean suggest this? It's a not uncommon view that consciousness arises out of the complexity of a large number of smaller simpler parts.


While that belief isn't uncommon, it lacks grounding. That is, there is no reason to believe that consciousness arises inexorably from a complex system.

Aside from the lack of grounding, there are a number of other problems with that belief, the *least* of which is the lack of an objective ontology!

Consciousness is a really hard problem -- and it's incredibly difficult to even talk about. (Just an example: Francis Crick, co-discoverer of the structure of DNA, in a book about consciousness, explicitly refused to even define it!)

The study of consciousness encompasses a broad range of fields spread over multiple disciplines as diverse as neuroscience and philosophy (which, believe it or not, interact to a significant degree). There's a lot of work being done by some incredibly bright people, yet we're no closer to anything like an answer today than we were 30 years ago.

One thing we can be reasonably certain about, however, is that we can't create anything resembling consciousness by purely algorithmic means.

#23 szecs   Members   -  Reputation: 2141

Posted 09 October 2012 - 09:50 PM

One thing we can be reasonably certain about, however, is that we can't create anything resembling consciousness by purely algorithmic means.


Why?

#24 Felix Ungman   Members   -  Reputation: 1033

Posted 10 October 2012 - 05:22 AM

An interesting paper on this topic: http://www.fhi.ox.ac...dmap-report.pdf

One thing we can be reasonably certain about, however, is that we can't create anything resembling consciousness by purely algorithmic means.


Interesting, would you still hold that position, even when someone someday builds a seemingly conscious machine?

/felix

openwar  - the real-time tactical war-game platform


#25 Hodgman   Moderators   -  Reputation: 30384

Posted 10 October 2012 - 07:13 AM


One thing we can be reasonably certain about, however, is that we can't create anything resembling consciousness by purely algorithmic means.

Why?

I can't speak for recompile, but I took that statement to mean "we won't come up with an algorithm that produces consciousness directly, we will have to create a complex simulation in which consciousness is an emergent property".

#26 Felix Ungman   Members   -  Reputation: 1033

Posted 10 October 2012 - 07:40 AM

I can't speak for recompile, but I took that statement to mean "we won't come up with an algorithm that produces consciousness directly, we will have to create a complex simulation in which consciousness is an emergent property".


On the other hand, recompile was referring to Searle, who argues, if I understand him correctly, that not even this is possible. That there is only one consciousness and it can't be emulated.

Edited by Felix Ungman, 10 October 2012 - 07:41 AM.

openwar  - the real-time tactical war-game platform


#27 szecs   Members   -  Reputation: 2141

Posted 10 October 2012 - 10:56 AM

My consciousness, Felix?

Edited by szecs, 10 October 2012 - 10:56 AM.


#28 Hodgman   Moderators   -  Reputation: 30384

Posted 10 October 2012 - 06:22 PM

On the other hand, recompile was referring to Searle, who argues, if I understand him correctly, that not even this is possible.

Oh, you're right.

That is, there is no reason to believe that consciousness [can arise] inexorably from a complex system.

Well if you believe in evolution, then the reason to believe this is because that's what happened. The question is what kind of complex systems can give rise to behaviour that can be described as consciousness.
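As an aside, "complex behaviour from simple parts" is at least easy to demonstrate in code. Conway's Game of Life is the stock example: each cell follows two trivial local rules, yet stable moving structures emerge. (This illustrates emergence in general, of course, not consciousness.)

```python
from collections import Counter

def life_step(live):
    """One step of Conway's Game of Life over a set of live (x, y) cells.
    Rules: a dead cell with exactly 3 live neighbours is born; a live
    cell with 2 or 3 live neighbours survives; everything else dies."""
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A glider: after 4 steps the same five-cell shape reappears, shifted
# one cell down and one cell right -- motion that no single rule mentions.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = life_step(state)
```

Nothing in the two rules says "move diagonally", yet the glider does; that is the sense in which the behaviour is emergent.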

#29 recompile   Members   -  Reputation: 151

Posted 10 October 2012 - 08:31 PM

Well if you believe in evolution, then the reason to believe this is because that's what happened. The question is what kind of complex systems can give rise to behaviour that can be described as consciousness.


On the contrary, the fact of evolution puts the final nail in the coffin of epiphenomenalism (the idea that consciousness is caused by brain processes but is itself causally inert).

I'll try to keep this as simple as I can: We'll define consciousness simply as subjective experience.

First, note that you (i.e. your brain) can report on the content of phenomenal experience. (Give it a try.)

Assuming that epiphenomenalism is true, consciousness is causally inert. (This is not in dispute)

As consciousness is causally inert, it should be impossible for the brain to report on the content of phenomenal experience. (This one is tricky at first. Take some time with it.)

That was too easy. Is there a way out? Well, we should expect specific brain states to give rise to specific experiences. In order for the brain to report on the content of phenomenal experience, it needs only additional structures that can examine the state of the relevant neurons involved.

Surely, such structures can evolve. After all, evolution needs just two things to work: 1) heritable variation and 2) a selection mechanism

Well, we have (1), no question. However, we can't get (2). (Can you see why? It's easier than you think.)
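Those two ingredients are easy enough to put in code. A toy genetic algorithm, purely for illustration (the bitstring target is arbitrary):

```python
import random

def evolve(target, pop_size=50, mutation_rate=0.05, generations=200):
    """Toy genetic algorithm showing the two ingredients named above:
    1) heritable variation: children copy a parent, with random bit-flips
    2) a selection mechanism: only the fitter half gets to reproduce
    """
    n = len(target)
    fitness = lambda s: sum(a == b for a, b in zip(s, target))
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == n:
            break
        parents = pop[: pop_size // 2]                     # (2) selection
        pop = [[1 - bit if random.random() < mutation_rate else bit
                for bit in random.choice(parents)]
               for _ in range(pop_size)]                   # (1) variation
    return max(pop, key=fitness)

random.seed(1)
best = evolve([1, 0, 1, 1, 0, 0, 1, 0] * 4)
```

With selection in place the population climbs toward the target; remove it and the population just drifts randomly -- which is exactly the role of ingredient (2) in the argument above.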

Like I said before. It's a hard problem ... and we have yet to even crack open an undergrad textbook here!

There is a large interdisciplinary research effort (which has spawned several new disciplines), yet for the volume of work produced, we're no closer to an answer. If you're a layperson, and you think you have the answer, you're virtually guaranteed to be wrong.

#30 Hodgman   Moderators   -  Reputation: 30384

Posted 10 October 2012 - 09:07 PM

How about:
1) I evolved out of a big burning ball of hydrogen. That happened.
2) I can be measured as being conscious, but the ball of hydrogen cannot.
The same stuff was reconfigured into a different complex system, which allowed consciousness to be observed.

If consciousness didn't arise out of this 'stuff', then you're attributing it to something else that doesn't exist, like a spirit or soul -- i.e. magic, not science. This is likely the part where I've misunderstood you.

You don't need to drag theories of the mind into this; just ignore them and treat the mind as a black box.
A thing that can be described as conscious exists. That thing is a complex system that evolved from previous complex systems. Prior forms of the thing along its evolutionary path cannot be described as conscious. At some point along that path, forms of the thing developed conscious behaviours. Either that is due to the new configuration, or it's due to magic.

If you're a layperson, and you think you have the answer, you're virtually guaranteed to be wrong.

N.B. we're not trying to say anything about what consciousness is or how it arises, but simply disputing your assertion that it can't arise from a complex system -- or maybe we've misinterpreted this statement, but that's how I'd paraphrase it at the moment.
I am a machine, and there's no reason to believe that I'm conscious through anything but my own machinery. If I build a machine just like me (through reproduction) I can expect that it will likely be able to exhibit consciousness too.

Edited by Hodgman, 10 October 2012 - 09:37 PM.


#31 szecs   Members   -  Reputation: 2141

Posted 10 October 2012 - 10:15 PM

I think I understand what Recompile says. I can't believe that my consciousness just arises from materials, because, um..... um.... It's impossible for me to describe. But my own consciousness is a mystery to me. I can't believe that consciousness is a continuum. It exists, or not. Yet, I was a baby once. When was my consciousness switched on? Or what is part of my conscious existence and what are just memories? And the usual disturbing questions.


I don't know why the "feel" of consciousness can't arise from a complex system, but I feel it somehow defeats causality. The fact that I'm experiencing it and thinking about it IS a brain process, so this "upper" conscious feel DID affect material in my brain.
Maybe Recompile is talking about something like this? It's only a feeling though; I don't really get the reasoning. Why do you, Recompile, (and I) think that the "feeling" of consciousness comes from the "upper" consciousness? And not simply from another brain process (it's just not the same linguistic reasoning process but an emotional one, which adds a bit to the "out of place" feel)? Or maybe that points to a contradiction in causality?

----

The whole problem with this approach in the context of this thread: I can never directly experience other consciousnesses. So defining it is useless and impossible in that context, as is wondering whether others have it too. The one and only consciousness I can ever "feel" is mine. All I can experience comes through some limited channels. That's why the Turing test (in principle, what I called the Duck test) applies: you can only ever define conscious behaviour (and even that is a bit arbitrary).
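A toy sketch of what that "duck test" stance amounts to (the questions and canned answers are invented for the example): the judge's only access to a candidate is the question-and-answer channel, so two candidates with identical channel behaviour are indistinguishable, whatever their internals.

```python
def transcript(respond, questions):
    """All a behavioural judge ever sees: questions in, answers out.
    The candidate's internals are a black box."""
    return [(q, respond(q)) for q in questions]

script = {"Are you conscious?": "Of course I am.",
          "What is 7 * 6?": "42"}

def candidate_lookup(question):      # internals: a bare table lookup
    return script.get(question, "I'd rather not say.")

def candidate_verbose(question):     # internals: recomputes its answers
    if question == "What is 7 * 6?":
        return str(7 * 6)
    if question == "Are you conscious?":
        return "Of course I am."
    return "I'd rather not say."

questions = list(script)
# Identical behaviour through the channel => indistinguishable to the judge.
same = transcript(candidate_lookup, questions) == transcript(candidate_verbose, questions)
```

The two candidates do completely different things inside, yet no sequence of these questions can tell them apart -- which is why, on this view, only behaviour can be defined and tested.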

Another problem: Maybe it's true that consciousness can't simply arise from a complex system. But why do you think that a complex system can't just get a "soul" from somewhere? Why can't a piece of machinery get a "soul"? Why are humans more special than machines?
(It's just philosophy, of course.)


EDIT: I misread the second part of the post. So you say evolution can't describe the existence of consciousness? (I recall hearing this as an argument against both epiphenomenalism and evolutionism.) Why? Why do you think that animals/planets/stars don't have one? Because of their behaviour. Why do you think humans (more precisely, other humans) have it? Because of their behaviour.
Even you evolve: you started as a single cell (um..... not precise, but you get it). At one point in your own evolution, you "got" consciousness. Do you think you didn't evolve? When was your consciousness "turned on"? Can you remember? Are you sure those are not just implanted memories?

Edited by szecs, 10 October 2012 - 10:42 PM.


#32 recompile   Members   -  Reputation: 151

Posted 11 October 2012 - 12:09 AM

This is likely the part where I've misunderstood you.

Indeed. I'm not positing anything; I was just trying to show why evolution does not support epiphenomenalism.

You don't need to drag theories of the mind into this; just ignore them and treat the mind as a black box.
A thing that can be described as conscious exists. That thing is a complex system that evolved from previous complex systems. Prior forms of the thing along its evolutionary path cannot be described as conscious. At some point along that path, forms of the thing developed conscious behaviours. Either that is due to the new configuration, or it's due to magic.


I'd hoped to show why the question is not that simple.

That we evolved is beyond dispute. It's also obvious that we are conscious. Though that's not terribly interesting. Just from "that we evolved" we need only assume the dominant metaphysics to turn that into "we evolved to be conscious". No further work required -- short of positing new metaphysics, you're forced to that conclusion. (Boring, isn't it?)

N.B. we're not trying to say anything about what consciousness is or how it arises, but simply disputing your assertion that it can't arise from a complex system


You're in luck, as I never intended to make such an assertion. (I understand completely, however, why my "attack" on epiphenomenalism would give you that impression. You need a lot of background to avoid that kind of misunderstanding.) Now, I did assert that it cannot be done by purely algorithmic means. That may be too strong a statement, but it's stood up to near constant attack for more than 30 years now.

I may have been unfair to your position by picking on epiphenomenalism -- er, and refusing to advance a position. If you want, I can pick on a different theory; but I won't advance a position :)

I am a machine, and there's no reason to believe that I'm conscious through anything but my own machinery. If I build a machine just like me (through reproduction) I can expect that it will likely be able to exhibit consciousness too.


That reminds me of an old joke: "AI researchers have discovered a method to create a thinking machine. It requires only two technicians and a nine-month initial construction period. Unfortunately, initial maintenance and training are both expensive and time-consuming, with few production units able to eventually perform meaningful cognitive work."

Anyhow, we're just playing here (this is beginner's stuff, after all). I only wanted to make the point that there are no simple and easy answers like the earlier posts in the thread imply. Oh, also that computationalism is dead -- I almost forgot that one!

#33 Hodgman   Moderators   -  Reputation: 30384

Posted 11 October 2012 - 01:13 AM

I only mentioned evolution as the chef that happened to create us (the conscious complex system) because it seemed that you were saying that a sufficiently complex system (arranged in a very specific way) cannot give rise to consciousness, when such a thing has obviously happened. You were stating that consciousness does not arise out of the complexity of a large number of smaller, simpler parts.
Epiphenomenalism and evolutionary selection weren't part of my point.

Now, I did assert that it cannot be done by purely algorithmic means.

What if, stepping away from biology/neurology, we 'solve' the laws of physics and are able to algorithmically simulate reality, then surely it would follow that any chemical/biological/neurological process could be simulated by such an algorithm?

N.B. when we stopped testing nuclear bombs in atolls, we started testing them in computer simulations of the laws of physics.
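In miniature, that's all such a simulation is: stepping a physical law forward with an algorithm. A minimal sketch (plain semi-implicit Euler integration of free fall -- nothing like a real weapons code, of course):

```python
def simulate_fall(h0, dt=1e-4, g=9.81):
    """Step the law dv/dt = -g, dy/dt = v forward in time until an
    object dropped from height h0 reaches the ground; return the time."""
    y, v, t = h0, 0.0, 0.0
    while y > 0.0:
        v -= g * dt      # apply the law to velocity
        y += v * dt      # then to position (semi-implicit Euler)
        t += dt
    return t

t_sim = simulate_fall(20.0)
# analytic answer for comparison: t = sqrt(2 * h0 / g) ≈ 2.02 s
```

The algorithm knows nothing but the local law, yet its output matches the closed-form solution to within the step size -- the same idea, scaled up enormously, behind simulated physics experiments.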

Edited by Hodgman, 11 October 2012 - 01:18 AM.


#34 mdwh   Members   -  Reputation: 875

Posted 11 October 2012 - 08:49 AM

What if, stepping away from biology/neurology, we 'solve' the laws of physics and are able to algorithmically simulate reality, then surely it would follow that any chemical/biological/neurological process could be simulated by such an algorithm?

N.B. when we stopped testing nuclear bombs in atolls, we started testing them in computer simulations of the laws of physics.

A simulation is not necessarily the same as the actual thing - a simulated nuclear explosion is not, after all.

That's not to say I agree with recompile that it _can't_ be algorithmic; rather it seems to be an open question.

Admittedly consciousness is a special case in that a realistic simulation would seem indistinguishable from the real thing. This is related to the question of whether Philosophical Zombies can exist (see Wikipedia).
http://erebusrpg.sourceforge.net/ - Erebus, Open Source RPG for Windows/Linux/Android
http://homepage.ntlworld.com/mark.harman/conquests.html - Conquests, Open Source Civ-like Game for Windows/Linux

#35 recompile   Members   -  Reputation: 151

Posted 11 October 2012 - 02:56 PM

... a sufficiently complex system (arranged in a very specific way) cannot give rise to consciousness, when such a thing has obviously happened.


It's not obvious at all. From "we are a complex system" and "we are conscious", it does not follow that consciousness must necessarily be an emergent property of certain kinds of complex systems. (That's just basic logic.)

What if, stepping away from biology/neurology, we 'solve' the laws of physics and are able to algorithmically simulate reality, then surely it would follow that any chemical/biological/neurological process could be simulated by such an algorithm?


Let's pick the low-hanging fruit. As mdwh rightly pointed out, you can't mistake the simulation of the thing for the thing itself. A simulated rainstorm won't get you wet, and Japan has nothing to fear from a simulated nuclear bomb. The standard counter: A simulated thought is still a thought.

I assume that you mean we should simulate a person, and that person would be conscious? Let's assume it's true, we have such a thing, and look at some of the immediate consequences. This assumes computationalism is true and, as a consequence, multiple realizability -- though it gets stretched to the limits! You need to assert that any system that is computationally isomorphic with relevant parts of the brain, at any level of description, would share the same experience. (You should see an ontology problem here.) Straight from John Searle: "For any program there is some sufficiently complex object such that there is some description of the object under which it is implementing the program. Thus for example the wall behind my back is right now implementing the Wordstar program, because there is some pattern of molecule movements which is isomorphic with the formal structure of Wordstar. But if the wall is implementing Wordstar then if it is a big enough wall it is implementing any program, including any program implemented in the brain." It is no longer silly to say something like "at some point in history, Mt. Everest independently invented calculus".

Further, being deterministic, we lose downward causation, forcing us to conclude that consciousness is epiphenomenal (the standard arguments against epiphenomenalism now apply, though we're in a tougher starting position. To counter, for example, Popper, you'd need to change your set of metaphysical assumptions or advance a new physics.)

There are other problems, of course, before we even get to those consequences. The most obvious is that we cannot simulate the universe on a computer, as a computer is a deterministic system and the universe is not. (Asserting that the brain is deterministic is a good approach, though that brings you right back to the problems above, and a few zillion others.)

What's the point of all this? The problem is incredibly difficult and there are no simple or easy answers. We just picked some of the easy stuff, the low-hanging fruit, and we've already got more than anyone on an internet forum wants to deal with.

While it's easy to say things like "it must be that X because the metaphysics demands that conclusion" it doesn't get you from "that" to "how" or even guarantee that the question and the conclusion are consistent with your foundational assumptions. (An unrelated example to clarify that point: Science can say nothing about the existence of god because that topic is outside the scope of science. Any conclusions drawn from the metaphysics [e.g. god does not exist] are not scientific statements. To make it a scientific statement, you'd need a new metaphysics to increase the scope of science so that the question can be answered by scientific means.)

Hope that helps.

#36 cowsarenotevil   Crossbones+   -  Reputation: 2043

Posted 11 October 2012 - 06:44 PM

Straight from John Searle: "For any program there is some sufficiently complex object such that there is some description of the object under which it is implementing the program. Thus for example the wall behind my back is right now implementing the Wordstar program, because there is some pattern of molecule movements which is isomorphic with the formal structure of Wordstar. But if the wall is implementing Wordstar then if it is a big enough wall it is implementing any program, including any program implemented in the brain." It is no longer silly to say something like "at some point in history, Mt. Everest independently invented calculus".


I don't see why this is a problem, really. Particularly if the alternative allows for "philosophical zombies," which it seems to. You build a robot that simulates the structure of the brain to some arbitrary level of precision and somehow it doesn't become conscious. Since you've rejected epiphenomenalism, you seemingly have to accept the somewhat questionable view that, no matter how closely we simulate the functional brain, be it through arbitrarily detailed physical simulations, it will not behave the same way. If it did, consciousness would be epiphenomenal. I've never seen a compelling reason to accept that qualia zombies are actually possible, but the idea that it's possible to simulate a person in a way that functionally matches the brain and still acts differently seems especially weird. In this case one imagines that, by studying the real brain and the "simulated" one, there would be a particular moment in time at which their behavior diverged, but what would account for this?

Also, "Mt. Everest is conscious if you can find an isomorphism that shows it performs the same computations that a human brain does" is really not that weird if you take the view that "consciousness" is nothing special to begin with. The fact that we can talk about our "subjective experience" is, of course, only interesting if we assume in advance that our "subjective experience" is something special. Just as plausible to me is that there's nothing special about our subjective experience at all aside from the fact that, as humans, we enjoy talking about it and pretending that it's special.

It seems to me that you've delivered the standard argument against epiphenomenal qualia, but you haven't really put forward anything in favor of the existence of qualia at all. "Well, gee, it sure seems like qualia is a thing" is of course circular at best. Naturally, I can't definitively argue against qualia, in the same way I can't argue against a God that is perfect at hiding from non-believers or against the fact that the number '2' has a special property "blargh," but that doesn't automatically make it more interesting than those two things either.

On the whole, though, my response to most arguments against functionalism is that, yeah, consciousness as we imagine it is not special at all. And so the fact that Mt. Everest is also not special, at least with respect to consciousness, doesn't change this.

There are other problems, of course, before we even get to those consequences. The most obvious is that we cannot simulate the universe on a computer, as a computer is a deterministic system and the universe is not. (Asserting that the brain is deterministic is a good approach, though that brings you right back to the problems above, and a few zillion others.)


The universe is "non-deterministic" in only a very loose sense. Anything that's typically treated as non-deterministic can just as easily be expressed as something with hidden variables; results denying hidden variables make assumptions about the form that those hidden variables take, but these restrictions don't apply to simulations. Aside from the fact that we don't know the values of those hidden variables, there is nothing that precludes us from simulating them.
-~-The Cow of Darkness-~-

#37 recompile   Members   -  Reputation: 151

Posted 11 October 2012 - 08:45 PM

It seems to me that you've delivered the standard argument against epiphenomenal qualia

Maybe. I offered, in a single sentence a few posts back, something similar to (a modification of) one of the more common arguments against epiphenomenalism. I'm not sure that I'd say that I "delivered the standard argument against ..." (I think that "expression of" makes a stronger case than "knowledge of")

you haven't really put forward anything in favor of the existence of qualia at all.


Again, I'm not advocating any position. Even against computationalism, the only position I admit to holding, I've not offered an argument. On epiphenomenalism, I've not given an opinion, just briefly noted some problems with it and alluded to others.

I'm just trying to point out that the problem is incredibly difficult and there are no simple or easy answers. It's silly to make any sort of claim as to the nature of consciousness, let alone what you need to artificially create it! Absurd things like "it must be" type answers are not just uninteresting (they're implicit in the metaphysics, after all, and thus offer us nothing new), they're ultimately useless (they can't get us past "that" to "how").

Again, the problem is extraordinarily complicated. It's absurd to make any claims like "it must be that" or "requires only that".

Just for fun:

Anything that's typically treated as non-deterministic can just as easily be expressed as something with hidden variables;

It turns out that this isn't true. See Bell.
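Bell's result is concrete enough to check numerically in its CHSH form: enumerate every deterministic local-hidden-variable strategy (each particle carries predetermined ±1 answers for both measurement settings) and the CHSH combination S never exceeds 2, while quantum mechanics, using the singlet-state correlation E = -cos(angle difference) at the standard angle choices, reaches 2√2.

```python
from itertools import product
from math import cos, pi

def chsh_classical_max():
    """Largest |S| any deterministic local strategy can produce.
    a, a2 are Alice's predetermined outcomes for her two settings;
    b, b2 are Bob's."""
    return max(
        abs(a*b - a*b2 + a2*b + a2*b2)
        for a, a2, b, b2 in product((-1, 1), repeat=4)
    )  # the Bell/CHSH bound: 2

def chsh_quantum(a=0.0, a2=pi/2, b=pi/4, b2=3*pi/4):
    """Same combination with the singlet-state correlation
    E(x, y) = -cos(x - y) at the standard CHSH angles."""
    E = lambda x, y: -cos(x - y)
    return abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))  # 2*sqrt(2)
```

The enumeration is what "local hidden variables" amounts to in this finite setting; models that are non-local or where the hidden variables depend on the measurement choice are not covered by the bound.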

#38 Hodgman   Moderators   -  Reputation: 30384

Posted 11 October 2012 - 08:50 PM

... a sufficiently complex system (arranged in a very specific way) cannot give rise to consciousness, when such a thing has obviously happened.

It's not obvious at all. From "we are a complex system" and "we are conscious", it does not follow that consciousness must necessarily be an emergent property of certain kinds of complex systems. (That's just basic logic.)

The obvious part is that I am a complex system that is conscious.

If the phenomenon of consciousness is not caused by something within my body, then what is it caused by? It seems that if it's not caused by stuff within my body, then the only other option is that it's caused by magic like souls and whatnot. Seeing as the latter isn't science, doesn't that only leave us with the option that something in the body is creating consciousness? And seeing as the body is a complex system, that gives us an example of a conscious complex system?

We know it's not caused by one specific part of the body -- cutting out specific bits of the brain hasn't narrowed down the "consciousness lobe" -- but it's known that if you remove enough parts of the body then eventually the phenomenon does disappear. So it follows that some combination of body parts are required for consciousness to be able to be observed by an experimenter.

So, if you're saying that consciousness isn't created by the body, where do you believe it comes from?

Let's pick the low-hanging fruit. As mdwh rightly pointed out, you can't mistake the simulation of the thing for the thing itself. A simulated rainstorm won't get you wet, and Japan has nothing to fear from a simulated nuclear bomb. The standard counter: A simulated thought is still a thought.

A simulated rainstorm will wet things inside the rainstorm. I don't know why you'd need to point out that it won't wet me. It's obvious that the rainstorm and I are in two different realities.

I personally subscribe to model-dependent realism: that there is no one true objective reality, there's only what you can measure and model. My mind creates a model of the world from the inputs it receives, and it's that model that I perceive as "the true reality". It's possible that my brainstem is connected to a computer and I'm in "the matrix" along with simulated rainstorms. In that hypothetical world, a simulated rainstorm would make me feel wet. But there's no way for me to measure such a hypothesis, so it's neither true nor false; it's irrelevant. Likewise, my whole chemical structure could be some simulation and there'd be no way for me to measure that, so it's irrelevant. It's nonsense to talk about whether I'm truly a part of the one objective reality or not.

I assume that you mean we should simulate a person, and that person would be conscious?

If we created a simulation of reality, and somehow measured the make-up of a person and cloned them in the simulation, and we could somehow interact with the simulation, e.g. via a proxy person that could be controlled from outside the simulation, then, we would be able to interact with the simulated person and ask them questions. From that questioning, we'd be able to judge if they are conscious just as well as you can judge a "real" person's level of consciousness through questioning. You'd also be able to perform any measurements that you could in the "real world", such as EEGs etc.
So regardless of whether the sim is "really" experiencing consciousness (which is nonsensical to ask), you would be able to measure their level of consciousness just as well as you can any other human/animal.
Right?

It is no longer silly to say something like "at some point in history, Mt. Everest independently invented calculus"

Indeed, that's not a silly thing to say. Especially once you take branching history into account, there's also a point somewhere where you played chess against Mt. Everest while riding a pink pony.

Further, being deterministic
... a computer is a deterministic system and the universe is not.

Just because it can be simulated, it doesn't mean that it's deterministic as in one set of inputs == one set of outputs. One interaction can lead to an infinite number of branches in the simulation's timeline, which all then need to be simulated. A computer with infinite memory and infinite computation time can still simulate the universe. N.B. we already do this on extremely, extremely small scales.
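A sketch of that branching idea (using a toy non-deterministic "coin" process invented for the example): instead of picking one outcome per interaction, the simulator follows every branch of the timeline.

```python
def run_branching(step, n_steps, state0):
    """Simulate a non-deterministic process by keeping every branch:
    `step` maps a state to the list of its possible successor states."""
    states = [state0]
    for _ in range(n_steps):
        states = [nxt for s in states for nxt in step(s)]
    return states

# Each step, the 'coin' branches into a heads timeline and a tails timeline.
coin = lambda history: [history + "H", history + "T"]
branches = run_branching(coin, 3, "")
# 2**3 = 8 branch histories: 'HHH', 'HHT', ..., 'TTT'
```

The program itself is deterministic -- one input always yields the same set of branches -- yet what it simulates is a non-deterministic process, which is the distinction being argued here.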

Edited by Hodgman, 11 October 2012 - 09:46 PM.


#39 szecs   Members   -  Reputation: 2141

Posted 11 October 2012 - 10:27 PM

I guess my posts are stupid or hard to understand.
We keep asking and telling the same things. Why do you, Recompile, ignore these?

#40 recompile   Members   -  Reputation: 151

Posted 11 October 2012 - 10:34 PM

If the phenomenon of consciousness is not caused by something within my body, then what is it caused by? It seems that if it's not caused by stuff within my body, then the only other option is that it's caused by magic like souls and whatnot.


Now, I know you know better. If it's not caused by something within the body, then it must be caused by something external to it. (That's too easy.)

So, if you're saying that consciousness isn't created by the body, where do you believe it comes from?


Again, I'm not saying anything of the sort. The only thing I've said is that the problem is not simple, there are no answers as of yet (easy or otherwise). Oh, and that computationalism is dead.

Just because it can be simulated, it doesn't mean that it's deterministic


That's not what I said at all. Also, reading the rest of the quote, it looks like you've misunderstood the term 'determinism'.

Anyhow, we're just wasting time talking past each other. Let me repeat this bit from earlier:

While it's easy to say things like "it must be that X because the metaphysics demands that conclusion" it doesn't get you from "that" to "how" or even guarantee that the question and the conclusion are consistent with your foundational assumptions. (An unrelated example to clarify that point: Science can say nothing about the existence of god because that topic is outside the scope of science. Any conclusions drawn from the metaphysics [e.g. god does not exist] are not scientific statements. To make it a scientific statement, you'd need a new metaphysics to increase the scope of science so that the question can be answered by scientific means.)

Take some time with it.



