What is your take on this?


By Dr. Rodney Brooks, Director of the MIT Computer Science and Artificial Intelligence Laboratory
Quote:
Newer Math?

A new high-school mathematics might someday model complex adaptive systems.

By Rodney Brooks

While prognostications about "the end of science" might be premature, I think most of us expect that high-school mathematics, and even undergraduate math, will remain pretty much the same for all time. It seems math is just basic stuff that's true; there won't be anything new discovered that's simple enough to teach to us mortals.

But just maybe, this conventional wisdom is wrong. Perhaps sometime soon, a new mathematics will be developed that is so revolutionary and elegantly simple that it will appear in high-school curricula. Let's hope so, because the future of technology -- and of understanding how the brain works -- demands it.

My guess is that this new mathematics will be about the organization of systems. To be sure, over the last 50 years we've seen lots of attempts at "systems science" and "mathematics of systems." They all turned out to be rather more descriptive than predictive. I'm talking about a useful mathematics of systems.

Currently, many different forms of mathematics are used to model and understand complicated systems. Algebras can tell you how many solutions there might be to an equation. The algebra of group theory is crucial in understanding the complex crystal structures of matter. The calculus of derivatives and integrals lets you understand the relationships between continuous quantities and their rates of change. Such a calculus is essential to predicting, for example, how long a tank of water would take to drain when the rate of flow fluctuates with the amount of water still in the tank.

The list goes on: Boolean algebra is the core tool for analyzing digital circuits; statistics provides insight into the overall behavior of large groups that have local unpredictability; geometry helps explain abstract problems that can be mapped into spatial terms; lambda calculus and pi-calculus enable an understanding of formal computational systems.

Still, all these tools have provided only limited help when it comes to understanding complex biological systems such as the brain or even a single living cell. They are also inadequate to explaining how networks of hundreds of millions of computers work, or how and when artificial evolutionary techniques -- applied to fields like software development -- will succeed.

These are just a few examples of what are sometimes referred to as complex adaptive systems. They have many interacting parts that change in response to local inputs and as a result change the global behavior of the complete system. The relatively smooth operation of biological systems -- and even our human-constructed Internet -- is in some ways mysterious. Individual parts clearly do not have an understanding of how other individual parts are going to change their behavior. Nevertheless, the ensemble ends up working.

We need a new mathematics to help us explain and predict the behavior of these sorts of systems. In my own field, we want to understand the brain so we can build more intelligent robots. We have primitive models of what individual neurons do, but we get stuck using the tools of information theory in trying to understand the "information content" that is passed between neurons in the timing of voltage spikes. We try to impose a computer metaphor on a system that was not intelligently designed in that way but evolved from simpler systems.

My guess is that a new mathematics for complex adaptive systems will emerge, one that is perhaps no more difficult to understand than topology or group theory or differential calculus and that will let us answer essential questions about living cells, brains, and computer networks.

We haven't had any new household names in mathematics for a while, but whoever figures out the structure of this new mathematics will become an intellectual darling -- and may actually succeed in designing a computer that comes close to mimicking the brain.

Rodney Brooks directs MIT's Computer Science and Artificial Intelligence Laboratory.
I've been thinking about this for some time. I suppose one would start with a simple graph and derive a calculus to model the evolution of the graph? I'm still tinkering with the idea, and I'll write up what I figure out tonight. Any thoughts on his proposal?
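To make the "calculus on a graph" idea concrete, here is a purely hypothetical sketch. The update rule (close any open triangle, i.e. triadic closure) is invented just for illustration; the point is that a simple local rule evolves the graph, and a finite difference of a global quantity (the edge count) plays the role of a crude derivative:

```python
# Hypothetical sketch: evolve a graph by a simple local rule and track
# the discrete rate of change of a global quantity (the edge count).
# The rule is made up for illustration: each step, every pair of nodes
# that shares a common neighbour gains an edge ("triadic closure").

def step(edges, n):
    """One evolution step: close all open triangles."""
    adj = {i: set() for i in range(n)}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    new_edges = set(edges)
    for i in range(n):
        for j in range(i + 1, n):
            if (i, j) not in new_edges and adj[i] & adj[j]:
                new_edges.add((i, j))
    return new_edges

n = 6
edges = {(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)}  # a path graph on 6 nodes
history = [len(edges)]
for _ in range(4):
    edges = step(edges, n)
    history.append(len(edges))

# The "derivative" of the graph's size, as a finite difference:
deltas = [b - a for a, b in zip(history, history[1:])]
print(history, deltas)  # the path densifies into the complete graph (15 edges)
```

Whether anything deserving the name "calculus" can be built on top of such update rules is exactly the open question; this just shows the kind of object one would be differentiating.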

Wolfram has been trying to develop his version of this new kind of mathematics for 20 years, culminating in the recent release of his voluminous tome "A New Kind of Science" (IIRC). Basically it's his take on automata theory. (One sentence cannot do it justice though... read the book if you have a year to spare and want to form your own opinion of his ideas/results.)
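For anyone who wants a taste of what the book is about without the thousand pages, an elementary cellular automaton fits in a few lines. This is a standard Rule 30 sketch (nothing taken from the book itself), a one-line local rule that produces surprisingly complex, seemingly random behaviour:

```python
# A minimal elementary cellular automaton in the spirit of Wolfram's
# "A New Kind of Science". Each integer 0-255 encodes a different rule:
# the three-cell neighbourhood (left, self, right) forms a 3-bit index
# into the rule's binary expansion.

RULE = 30  # try 110, 90, 184 ... for very different global behaviour

def step(cells):
    """Apply the rule to every cell, wrapping around at the edges."""
    n = len(cells)
    return [
        (RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

cells = [0] * 31
cells[15] = 1  # a single live cell in the middle
for _ in range(15):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```

Running it prints the familiar chaotic triangle pattern; the striking part is that the entire "physics" of the system is one byte.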

I agree with Brooks though; we need a fundamental mathematical language for describing complex adaptive systems and their dynamic behaviour. In addition to this, though, we also need to understand how to analyse this behaviour, which is really what science and mathematics have been working on for many hundreds of years (and especially since the 30s and 40s). The past 70 years have seen huge advancements in our ability to analyse these systems... but we're not much closer to describing why they work the way they do... the best answer I've heard so far is "because they do"! (Think about that for a moment before dismissing it as a witty one-liner.)

Cheers,

Timkin

I draw a parallel to AI here.

If you replace 'complex adaptive systems' with AI, the text makes sense to me as a master's student in the field of AI.

The problem is that nobody really knows what intelligence is, and, as the text says, most AI consists of simplifications of the real world, simplified to a point where we _can_ understand it (our maths tools limit the systems' complexity).

There are different opinions on what the right way forward is, and there are no standard notations/rules that apply to AI above the algorithm level (or only in very specific cases, like ANNs).

So are these the same? Is this what theoretical AI people have been working on since the '70s? If I could express my little system in a standardised notation, and then apply well-established mathematical tools to it to understand it better, then scalability would explode and incredibly complex systems could be created, just as incredibly complex simulations, matrix operations, and analyses are possible because of the definitions, the rules, and their proofs...

Now, what is your take on this? :-)

Quote:
Original post by Timkin
[...] we need a fundamental mathematical language for describing complex adaptive systems and their dynamic behaviour [...]

Cheers,

Timkin


I was going to make this my secondary research project for my Research Computer Science class. My approach is to study how data flows recursively through the network, and to try to determine a calculus for studying the instantaneous change of the network. I'm writing up a proper proposal right now on what exactly I want to do. I'll put it up sometime tomorrow.

I don't know much about advanced Math, all I have is high-school in my bag of tricks...

...but how can this be done? If each individual node in a system has an internal set of variables, and is itself a variable, then the correct formula to extract information about that system is another function with the same number of variables. In other words, if you want to know what Adam's brain will do next, you need a second, atomically cloned copy of Adam's brain.

Any other solution will introduce deviations that, after "n" iterations, will have drifted too far. Now that I think of it, it's a bit like weather simulation.

We can extract general data from the system and be somewhat sure that it holds true for the next "n" time units, the same way we can say a hurricane is going to hit an island in the next 3 days, but it will never cease to be a "statistical" approach; you'll always have to say "there is a 94% chance of that happening". I don't think we'll ever develop the math to write something like:
sim(AdamBrain) == Hunger?

...unless I'm not reading this correctly. Interesting subject nonetheless.
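The weather analogy above can be made precise with a one-line chaotic system. This toy sketch (the standard logistic map, nothing specific to brains or weather) shows two states differing in the tenth decimal place, i.e. "Adam's brain" and an imperfect clone, disagreeing completely after a few dozen iterations:

```python
# Sensitivity to initial conditions in the simplest chaotic system:
# the logistic map x -> r*x*(1-x). Two copies start a hair apart;
# the gap roughly doubles each step until prediction is hopeless.

r = 4.0                  # parameter value in the fully chaotic regime
x, y = 0.4, 0.4 + 1e-10  # two "clones" differing in the 10th decimal place

n = 0
while abs(x - y) < 0.1 and n < 200:
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    n += 1

print(n, abs(x - y))  # after a few dozen steps the clones disagree badly
```

This is why long-range prediction of such systems degrades into statistics, exactly as with the hurricane forecast.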

I'll also point out that "complex adaptive systems" theory actually boils down to the ability to "solve" and analyze nonlinear equations. Humans have been trying to do this for a long, long time and our current methods are only qualitative or numerical, yielding insight but no real ability to design new systems with desirable properties. These nonlinear systems come up in every field -- from a simple pendulum swinging (one of the very few solvable ones) to the analysis of neural networks. The fact is that our world is not linear, particularly not the brain, yet humans have only been able to survive thus far with mathematics by approximating nonlinear systems with linear models. So yes, the current math is very difficult to understand and to perform, but it may be a very long time before the "right" patterns are found to simplify it all.
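To illustrate the pendulum point numerically: a rough sketch (semi-implicit Euler integration, parameters chosen arbitrarily) comparing the full equation theta'' = -(g/L) sin(theta) with its small-angle linearisation sin(theta) ~= theta. At small amplitude the two agree on the period; at large amplitude the linear model gets the period badly wrong:

```python
# Period of a pendulum, estimated from successive zero crossings of
# theta, using simple semi-implicit Euler integration. The `linear`
# flag swaps the true restoring term sin(theta) for the small-angle
# linearisation theta.

import math

def period(theta0, linear, dt=0.0005, g=9.81, L=1.0):
    """Full period, from the time between two successive zero crossings."""
    theta, omega, t = theta0, 0.0, 0.0
    crossings = []
    prev = theta
    while len(crossings) < 2:
        accel = -(g / L) * (theta if linear else math.sin(theta))
        omega += accel * dt
        theta += omega * dt
        t += dt
        if (prev > 0 >= theta) or (prev < 0 <= theta):
            crossings.append(t)
        prev = theta
    return 2 * (crossings[1] - crossings[0])

print(period(0.1, linear=True))   # ~2.006 s, the textbook 2*pi*sqrt(L/g)
print(period(0.1, linear=False))  # nearly identical at small amplitude
print(period(2.0, linear=False))  # markedly longer: the linearisation fails
```

The linear model's period is independent of amplitude, which is exactly what makes it tractable and exactly what is false of the real system at large swings.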

Guest Anonymous Poster
The problem is that the maths tools we have right now can solve these problems numerically, but not analytically. So we have a lot of variables, and solving for them in simulation takes an extremely long time.

Another problem is that we still know very little, so we have to work at the level of very general formulas.

How do you guys feel about the whole "so simple it can be taught in a high-school curriculum" claim? Maybe it's just me, but I don't think group theory, topology, or differential calculus is taught to the average high-schooler (OK... the latter is starting to appear, for sure).

I have a nice PowerPoint I'll put up later tonight explaining my approach to this whole problem.

Let me first say that I don't fully understand what analysis Brooks wants to perform. Despite this fact, I'll give my reaction to the topic.

I think the idea is ill-founded. While some may view this as closed-minded, I don't foresee the conception of a new type of mathematics that will suddenly make analysis of such complex systems a simple task. The fact of the matter is that complex systems are just that. Trying to analyze or interpret the state of the entire Internet, or of all the cells in a creature's body, seems to me like a misguided exercise. It is not the entire state that matters; each entity's dynamic role in the system's operation is what is key. I disagree with his statement that the operation of "our human-constructed Internet -- is in some ways mysterious". Such a system, despite its amazing and beautiful complexity, is the result of a huge number of small, simple entities that have come to a consensus about a standard means of operating and communicating. The cells in a creature's body are fundamentally no different: each performs a simple task, and due to extreme levels of standardization, extremely complex system behavior can result. Like I said, I don't exactly understand what type of analysis Brooks would like to perform, but I seriously doubt that a new branch of mathematics will suddenly simplify a human's ability to understand the interactions that take place in a system with vast numbers of entities. The key is understanding the standards that make the Internet, and more impressively a creature's body, work the way they do.

I am not a student of advanced mathematics so maybe none of this makes any sense and I sound like an idiot...

Quote:
Original post by janoside
...It is not the entire state that matters; each entity's dynamic role in the system's operation is what is key. I disagree with his statement that the operation of "our human-constructed Internet -- is in some ways mysterious". Such a system, despite its amazing and beautiful complexity, is the result of a huge number of small, simple entities that have come to a consensus about a standard means of operating and communicating. The cells in a creature's body are fundamentally no different: each performs a simple task, and due to extreme levels of standardization, extremely complex system behavior can result...


Yeah, Brooks is hoping for some way to go from each individual's simple rule set to the resulting aggregate behavior. Even more ambitious: to go from some desired aggregate behavior to a simple rule set that would produce that behavior, given enough individuals.

Personally, I believe Brooks won't get his wish-- there's little (no?) evidence of natural systems that can *easily* design simple rules for some desired complex aggregate behavior. Most of the natural systems typically used as examples (epigenetic development, neural networks, ecosystems, etc.) took millions of years of incremental search (evolution) to get there. *Maybe* we can design some representation that could quickly do the same kind of work, but I doubt it.
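A minimal illustration of the "simple individual rules, non-obvious aggregate behaviour" direction: each cell on a ring copies the majority of its three-cell neighbourhood (a toy rule chosen purely for illustration). No cell sees the global state, yet the population coarsens into stable uniform blocks:

```python
# Majority rule on a ring: each cell adopts the majority value of
# {left neighbour, itself, right neighbour}. Isolated dissenters die
# out and the ring settles into large uniform domains, a global
# pattern nowhere stated in the local rule.

import random

random.seed(1)
n = 60
cells = [random.randint(0, 1) for _ in range(n)]

def step(cells):
    n = len(cells)
    return [
        1 if cells[(i - 1) % n] + cells[i] + cells[(i + 1) % n] >= 2 else 0
        for i in range(n)
    ]

def blocks(cells):
    """Number of contiguous same-value blocks around the ring."""
    return sum(1 for i in range(len(cells)) if cells[i] != cells[i - 1]) or 1

before = blocks(cells)
for _ in range(30):
    cells = step(cells)
after = blocks(cells)
print(before, "->", after)  # the number of domains never increases
```

Going the other way, from "I want large uniform domains" back to this three-cell rule, is the hard inverse problem Brooks is wishing for, and nothing in this sketch helps with that.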

Quote:
Original post by mnansgar
yet humans have only been able to survive thus far with mathematics by approximating nonlinear systems with linear models.


I disagree with this statement emphatically. I for one work in a field where I am constantly analysing complex, adaptive (nonlinear) systems and designing them, most particularly for the control of other complex adaptive systems. I certainly don't rely on linearisation when creating such systems. Indeed, in my current work I don't even rely on system identification techniques (modelling the system) when designing controllers, so there is nothing to linearise!

My personal belief is that we will see paradigm shifts in the way we describe and build complex adaptive systems. I have lots of reasons to believe this, but most of them are beyond the breadth of this discussion (and are rather mathematical). Partly though, it's because I work in this area and I'd like to think that one day all of the advances that I am seeing will actually amount to something beautiful and simple, just as many other areas of advanced mathematics do! 8)

I guess that coming from a mathematics background, I'm a bit biased though! ;)

[Edited by - Timkin on April 9, 2006 9:57:45 PM]

Guest Anonymous Poster
Quote:
Original post by Sagar_Indurkhya
[...] My guess is that a new mathematics for complex adaptive systems will emerge, one that is perhaps no more difficult to understand than topology or group theory or differential calculus [...] Any thoughts on his proposal?

"will emerge"

If it hasn't already been kicked around by theoretical mathematicians for several decades, it's not likely to emerge (or, more likely, it's simply not a subject that can be reduced to a 'simple' system that can be injected into high-school students). Kids these days barely get the fundamentals in much simpler subjects; is it likely this will be taught?

I think the problem with adaptive systems is resources. Having unlimited resources, or always more than enough of them, would free us from worrying about being so mathematically precise to avoid wasting processing/time/energy. I don't think nature is worried about how things work from a mathematical point of view. Things just tend to group in certain ways under different circumstances, and the only thing that matters is whether the "thing" works, not "how", or often "how efficiently". What does it need? A bigger brain? Have it. 40 tonnes of weight? Done.

Quote:
Original post by Timkin
[...] I certainly don't rely on linearisation when creating such systems. [...] My personal belief is that we will see paradigm shifts in the way we describe and build complex adaptive systems. [...]



I come from an engineering background, but I'm pursuing a doctorate in computational neuroscience. So, I see your point that there exists a subset of complex adaptive systems which CAN be designed (neural networks come to mind), but I still stand by my previous statement, since linear systems are and have been so important to engineering.

Practical engineering techniques rely heavily on linearization, especially if you'll agree that numerical analyses are typically just iterated small linearizations (for instance, Euler's method). In designing modern circuits, we still rely on crude linearizations of transistor I-V curves and other solid-state components. I'm sure that you're familiar with the popular sin(theta) ~= theta (when theta is small) approximation used to simplify a multitude of formulas. You're much more an expert in automatic control than I am, but I have the impression that your nonlinear work is more the exception than the norm, given the limited resources often allocated to controllers.
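The "iterated small linearizations" reading of Euler's method can be shown directly. Each step replaces the solution of x' = x by its tangent line at the current point; halving the step size roughly halves the error, the signature of a first-order method (a toy example, not anything from the post above):

```python
# Euler's method as repeated tangent-line (linear) extrapolation,
# applied to x' = x with x(0) = 1, whose exact solution is e^t.

import math

def euler(f, x0, t_end, steps):
    x, dt = x0, t_end / steps
    for _ in range(steps):
        x += f(x) * dt  # one local linearisation per step
    return x

exact = math.e  # x(1) for x' = x
for steps in (10, 100, 1000):
    approx = euler(lambda x: x, 1.0, 1.0, steps)
    print(steps, approx, abs(approx - exact))
# error shrinks roughly in proportion to the step size: first order
```

For this equation the scheme collapses to the familiar compound-interest formula (1 + 1/n)^n, which is literally a chain of n linear approximations to the exponential.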

I can also see your rationale for emphatic disagreement -- indeed, there are many classes of nonlinear systems which we can analyze and design. However, I argue that these are typically (1) limited to classes of equations which have been rigorously studied, (2) require iterative methods which often require intuition (e.g. number of layers/hidden nodes), (3) require vast computing resources for accuracy, drastically limiting their usefulness in practical systems, and (4) even so are typically limited to qualitative/numerical analyses. I will leave on a positive note though, saying that as you indicate, their use is definitely increasing nowadays perhaps largely due to the availability of fast computing resources.

I for one would be very interested in hearing your reasons to believe paradigm shifts may occur. Thanks for sharing!

Hehe...after posting this I realised how long it had become... my apologies...

Quote:
Original post by mnansgar
I'm pursuing a doctorate in computational neuroscience.


Off Topic: What's your thesis on? I've worked in CN previously, most particularly on seizure prediction algorithms and image segmentation and registration algorithms. It's a fascinating field. 8)

Back on topic...

Quote:
but I still stand by my previous statement since linear systems are/have been so important to engineering.


There's a huge difference between mathematics and engineering, between what we know and what we can make/sell. For example, we teach undergraduate engineering students that the gradient of a vector field is a vector. It isn't. It's a one-form. It just so happens that in Euclidean space, one-forms and vectors have equivalent properties. Step outside of Euclidean space and this equality doesn't hold, meaning analysis based on this assumption would fail. Why do we teach these students a 'lie'? Because in general, they'll never need to know they weren't taught the truth. ;) If they do, then we teach them the truth of the matter (which generally only happens when they learn the error of their ways and become mathematicians! ;) )

Quote:
Practical engineering techniques heavily rely on linearization


Yes, they do. In part because we don't bother to teach engineers advanced mathematical analysis and design techniques, and in part because linearisation works on many real-world problems... but that's because most of the real-world problems we deal with in everyday life are quasi-linear. That's more a statement about the domain over which engineering presides (and can survive while presiding over) than about our inability to deal with tougher problems.

There are obviously exceptions to this in which the problems we are trying to engineer solutions for are highly nonlinear. In these cases, linearisation is often applied, but only because the engineer involved doesn't know a better technique, doesn't have the time/money to develop a better implementation or doesn't want to implement anything 'new'. I face this attitude regularly when dealing with in-house engineers of our industry partners. The tools are there though and if you look at engineering R&D, you'll certainly see nonlinear analysis and design techniques being implemented, particularly in areas like Control Theory.

Quote:
especially if you'll agree that numerical analyses are typically just iteratively small linearizations


Most certainly not. If you restrict your view to only finite difference analyses of differential models/systems, then perhaps so... but you're ignoring a wealth of techniques that don't rely on any linearisation of a system model. For example: phase space analysis, spectral analysis, statistical analysis and functional analysis, to name but a few.

Quote:
but I have the impression that your nonlinear work is more of the exception than the norm given the limited resources often allocated to controllers.

Certainly industry still relies on simple solutions (because they're easy to understand and to sell as ideas to management), but nonlinear methods have existed since the earliest days of control. Today, there are certainly many more linear devices (such as PIDs) than nonlinear ones, but that's more to do with the inertia involved in shifting industry than with any lack of knowledge regarding nonlinear methods.

Quote:
I argue that these are typically (1) limited to classes of equations which have been rigorously studied


So because it has been rigorously studied, that makes it exempt from the consideration of nonlinear versus linearisation? I think perhaps you should have argued along the lines of the "size of the class of problems that have been rigorously studied". Indeed, anything beyond second order is normally the realm of mathematicians (engineers often deal with so-called 'ideal second order systems')... and third order systems become tough to analyse, requiring advanced tools such as Lie algebra and asymptotic methods... but this is only if you're trying to predict state evolution exactly. If you want to analyse the system for its performance (which is quite often all we require of engineered systems) then you don't need to know the exact state; only that it is stable, under what conditions it might traverse to instability... and that it meets a performance criterion.

Quote:
(2) require iterative methods which often require intuition (e.g. number of layers/hidden nodes)

There's nothing wrong with a good iterative learning method, so long as you have the time and the data! ;) Structural learning is certainly possible, but what you find is that human intuition is often quite a good first guess.

Quote:
(3) require vast computing resources for accuracy, drastically limiting their usefulness in practical systems


Yes, you have a point here, when you compare the resources required to analyse a nonlinear system compared to a simple set-point linearisation (which can be achieved by something as simple as linear regression of the local data). Certainly, nonlinear analysis techniques are data intensive and resource consuming. However, used appropriately, on many problems the increased performance far outweighs the cost.


Quote:
(4) even so are typically limited to qualitative/numerical analyses.


I disagree with that. I can (almost) just as easily fit a second order polynomial model to anything you can linearise. That's neither qualitative nor numerical. If, however, I estimate the parameters of my model online, then it certainly is numerical... but then so is the linearisation.
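The "second order polynomial model" remark is easy to demonstrate: building a local quadratic from three samples is barely harder than building a local linear model from two, and it tracks a nonlinear function much further from the expansion point. This toy sketch (f(x) = e^x and the sample spacing are arbitrary choices, not from the discussion above) uses simple finite-difference coefficients:

```python
# Local linear vs. local quadratic models of f(x) = e^x, both built
# from a handful of samples near x = 0, then evaluated well outside
# the sample window.

import math

f = math.exp
h = 0.1  # sample spacing (arbitrary)

# Linear model through (0, f(0)) and (h, f(h)):
a1 = (f(h) - f(0)) / h
linear = lambda x: f(0) + a1 * x

# Quadratic model through (-h, f(-h)), (0, f(0)), (h, f(h)),
# via central finite-difference estimates of f' and f''/2:
c = f(0)
b = (f(h) - f(-h)) / (2 * h)
a = (f(h) - 2 * f(0) + f(-h)) / (h * h) / 2
quad = lambda x: a * x * x + b * x + c

x = 0.5  # five sample-spacings away from the data
print(abs(linear(x) - f(x)), abs(quad(x) - f(x)))  # quadratic error is ~5x smaller
```

The extra cost over the linear fit is one more sample and two subtractions, which is the point: "nonlinear model" need not mean "expensive model".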

Quote:
I will leave on a positive note though, saying that as you indicate, their use is definitely increasing nowadays perhaps largely due to the availability of fast computing resources.


...and broader acceptance of techniques... which is sort of the point of Rodney's statements... that if we can reduce the science of complex adaptive systems to a description that a secondary school student can understand, then we will see revolutionary change in how we perceive and control our environment around us. Of course, this would also be true if we could teach quantum mechanics to pre-schoolers.

Quote:
I for one would be very interested in hearing your reasons to believe paridigm shifts may occur.


Actually this has a lot to do with my pessimism about science and scientific method, particularly with regard to how we approach problems. Complex systems research is a good example. We bring many of our preconceptions about dynamic systems (developed from years of linear analysis techniques ;) ) to the table when we try and analyse these systems. Like much of science, we try and atomise the problem to understand it... and in parallel systems, that simply doesn't work. Yet most of these systems are composed of simple elements, interacting with simple rules. It's the internal balance and harmony of the interplay of the components that enables these systems to survive and be observed.

I firmly believe we will find a way to mathematically describe these systems, because we already have some of the important tools that open our eyes to what is going on within them. I just believe we're trying to describe these systems in the wrong way. We haven't worked out exactly what our tools are telling us and what we're not seeing yet.

My optimism also stems from the simplicity of the substructure of these systems. Simple elements can be described in simple ways. The complexity arises when you try and describe the global properties in terms of local properties. I think we'll find a way of doing that and that it will, at its heart, be simple and beautiful, just like the systems it describes.

Cheers,

Timkin

While a nice wish list, that's all it is. Sadly, it's not even original, and I'm a bit surprised nobody has brought this up yet.

The Foundation Series, written in the 1940s by Isaac Asimov, envisioned the type of system Mr. Brooks is looking for.

From Wiki:
Quote:

The premise of the series is that mathematician Hari Seldon has spent his life developing a branch of mathematics known as psychohistory, a concept devised by Asimov and his editor John W. Campbell. It uses the law of mass action to predict the future on a large scale, such as of planets or empires.


It's a mathematical theory that allows our fictitious Mr. Seldon to determine the outcome of events involving many complex interactions many years ahead of time.
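The statistical intuition behind psychohistory — individuals are unpredictable, but large aggregates are not — is essentially the law of large numbers, and it can be sketched in a few lines. The 0.6 "preference" probability and the trial counts below are invented purely for illustration:

```python
# Toy sketch of the idea behind psychohistory: each individual's choice
# is random, but the population-level average gets more predictable as
# the population grows. The p=0.6 "preference" is an invented parameter.
import random

def mean_prediction_error(n, p=0.6, trials=200, seed=42):
    """Average |sample mean - p| over many simulated populations of size n."""
    rng = random.Random(seed)
    errors = []
    for _ in range(trials):
        sample = sum(rng.random() < p for _ in range(n))
        errors.append(abs(sample / n - p))
    return sum(errors) / trials

if __name__ == "__main__":
    for n in (10, 100, 10_000):
        print(f"population {n:>6}: mean error {mean_prediction_error(n):.4f}")
```

The error shrinks roughly as 1/sqrt(n) — predicting an empire is, statistically, easier than predicting a person. Whether that scales to actual history is, of course, the fiction part.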

That said, it's unlikely any human will develop such a system. As humans, we're limited to human concepts and human thinking. For example, while many believe that 1+1 is a universal truth, there isn't really anything universal about it.

Concepts such as 1, or 2, might be strictly human, much like 'red' or 'hot'.

Will

Quote:
Original post by RPGeezus
Concepts such as 1, or 2, might be strictly human, much like 'red' or 'hot'.


Certainly for humans, the concepts related to the numbers 1 to 5 are built into our brains (not learned). That is, we can inherently recognise the difference between collections of between 1 and 5 objects, and we can do this without using any of the areas of the brain normally associated with numerical reasoning or number representation.

If I recall correctly, other species have similar inbuilt concepts, although the size of the sets varies. So it might be reasonable to state that while attaching labels to the sizes of sets is a human ability, there is also a non-human ability to recognise quantity, if only to a limited degree.

Cheers,

Timkin

Quote:
Original post by Timkin
If I recall correctly, other species have similar inbuilt concepts, although the size of the sets varies. So it might be reasonable to state that while attaching labels to the sizes of sets is a human ability, there is also a non-human ability to recognise quantity, if only to a limited degree.

Cheers,

Timkin


Agreed. I know certain animals are very good at 'more' and 'less', but are still incapable of '1', '2', '3'...

Will

Quote:
Original post by RPGeezus
I know certain animals are very good at 'more'...


Human toddlers are good at this one too ;)

You could say brains are like a program: they are given the basic syntax but then have to reprogram themselves. The question is how they reprogram themselves. I would guess that, since most humans are generally similar at birth, there must also be some "pre-programmed" start-up which uses the syntax they are given to evaluate inputs and outputs and set the learning process in motion. This start-up program could be inherited from previous generations, which would result in some kind of civil evolution.
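One way to make the "fixed, inherited start-up rule plus self-reprogramming state" idea concrete is a perceptron-style learner: the update rule is innate and never changes, while the weights (the "program") rewrite themselves from input/output experience. A toy sketch only, not a claim about real brains; learning the AND function is an arbitrary example task:

```python
# Fixed innate rule + self-reprogramming state: the perceptron update
# rule below is hard-wired ("inherited"), while the weights rewrite
# themselves from input/output experience. AND is an arbitrary task.

def train_perceptron(samples, epochs=20, lr=0.1):
    """Train a 2-input perceptron; returns learned weights and bias."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out          # the fixed, innate learning rule
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

if __name__ == "__main__":
    and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    w, b = train_perceptron(and_data)
    for (x1, x2), t in and_data:
        print(x1, x2, "->", predict(w, b, x1, x2), "(target", t, ")")
```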

TwinX

Quote:
Original post by TwinX
You could say brains are like a program: they are given the basic syntax but then have to reprogram themselves. The question is how they reprogram themselves. I would guess that, since most humans are generally similar at birth, there must also be some "pre-programmed" start-up which uses the syntax they are given to evaluate inputs and outputs and set the learning process in motion. This start-up program could be inherited from previous generations, which would result in some kind of civil evolution.

TwinX


If I remember correctly, that's known as the Subsumption Architecture.

I think I've nailed down how I'm going to proceed for the time being. I've been reading Wolfram's NKS, and the first thing I did was code up a little environment where I can actually observe a graph structure with data flowing through it. Then I'll try a few very small graphs and try to predict what will happen, etc. More of a pilot study, I suppose.

Subsumption architecture... hmm, all these long names I don't know. I'm just at high school in the UK, and they don't actually teach you anything at these places.
