

human intelligence


97 replies to this topic

#41 Nypyren   Crossbones+   -  Reputation: 4498


Posted 29 January 2014 - 08:59 PM

Talking about brain and computer size, consider this: the important components of a computer are very small, and most of a computer's space is taken up by the wires that connect the major components, cooling fans, empty space, and power-management hardware (batteries, PSUs, capacitors, etc.). Cell phones are obviously extremely small, yet have fairly powerful processors.

Recent laptop-class CPU dies are roughly 2-3 square centimeters and 1-2 mm thick, where most of the volume is structure not directly involved in processing. The packaging outside of the silicon is largely just making sure that heat dissipation and socket contacts are robust. RAM is a bit larger. Nonvolatile storage is often shockingly small (MicroSD as an example).

Now, without any considerations of actually powering it up or connecting the components in a meaningful way, pack a human skull full of nothing but the CPU, RAM, and MicroSD. You'd be able to pack tens of thousands of various combinations of them within an adult human skull. It's not that shocking that human brains would be more powerful than a few dozen square centimeters of silicon.

Biological brains grow as a single unit. They don't need to waste space with things like making the contacts in PCI slots far enough apart to allow for sloppy alignment. Everything can be as small as functionally possible. Everything is arranged in 3D, so you don't have to make your circuits planar, allowing you much more freedom in where you place connected pieces.
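As a hedged back-of-envelope (the skull volume, die area and thicknesses below are rough assumptions, not measurements), the packing argument can be sketched like this:

```python
skull_cm3 = 1400.0      # adult cranial capacity, rough assumption
die_area_cm2 = 2.5      # laptop-class CPU die area, rough assumption

def combos(thickness_cm):
    # One CPU + one RAM package + one microSD, all treated as the
    # same rough size -- a deliberately crude model.
    combo_cm3 = 3 * die_area_cm2 * thickness_cm
    return int(skull_cm3 // combo_cm3)

print(combos(0.15))   # packaged dies (1.5 mm thick): ~1244 combos
print(combos(0.03))   # bare silicon (0.3 mm thick): ~6222 combos
```

The result is very sensitive to the assumed thickness: packaged parts give on the order of a thousand CPU/RAM/storage combos per skull, bare dies several thousand.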

How big would cell phone hardware be without the antenna, battery, or screen?

Edited by Nypyren, 29 January 2014 - 09:13 PM.



#42 samoth   Crossbones+   -  Reputation: 4926


Posted 30 January 2014 - 04:57 AM

But if you think that way, you must add the size of the liver and the heart (and pretty much every other organ) to the size of the brain, too. A brain doesn't work without a complete human to support it. Remove one organ, and unless it's a paired organ, the whole human, brain included, encounters a permanent failure condition.

 

On the other hand, the skull is half-empty (filled with fluid), and a considerable part of neural tissue is collagen, which does "nothing" for its compute power -- it only ensures that the tissue resists reasonable force. In that respect, the comparison with filling the whole skull with silicon isn't 100% fair either.

Also, much like a brain, a processor simply won't work without the supportive hardware such as a power supply (though it is much more tolerant to being switched off and on).



#43 Hodgman   Moderators   -  Reputation: 31047


Posted 30 January 2014 - 05:37 AM

The debate is extremely flawed if you fail to accept that biology is just nanotechnology, and that humans are therefore machinery...

 

Humans are machines.

Humans are intelligent and self aware.

Therefore machines can be intelligent and self aware.

The only way for this to not be true, is to artificially restrict your definition of machinery, or to choose to believe that humans are made of magic or whatever...

 

@OP though -- an i7 and GBs of RAM -- no. That is a ridiculously simple (and inefficient, and fragile) machine when compared to a human.

 

On the topic, though: Google just spent half a billion dollars acquiring a company that has developed a self-learning AI that can learn to play (and win at) different Atari games from just the RGB pixel inputs -- that's fully automated feature recognition, classification, planning, etc. That's a pretty good achievement thus far!



#44 Tutorial Doctor   Members   -  Reputation: 1650


Posted 30 January 2014 - 09:04 PM

So tired of Google trying to conquer the world. smh. haha


They call me the Tutorial Doctor.


#45 IADaveMark   Moderators   -  Reputation: 2511


Posted 31 January 2014 - 08:30 AM

Just thought I'd swing through this thread again... I see the derpage continues. Don't make me moderate your ass.


Dave Mark - President and Lead Designer of Intrinsic Algorithm LLC

Professional consultant on game AI, mathematical modeling, simulation modeling
Co-advisor of the GDC AI Summit
Co-founder of the AI Game Programmers Guild
Author of the book, Behavioral Mathematics for Game AI

Blogs I write:
IA News - What's happening at IA | IA on AI - AI news and notes | Post-Play'em - Observations on AI of games I play

"Reducing the world to mathematical equations!"

#46 Tutorial Doctor   Members   -  Reputation: 1650


Posted 31 January 2014 - 12:47 PM

When Louie said that we can hold more data than computers, I agreed. The amount of sensory data that even an uneducated person takes in, just by design, is far more than computers will be able to take in. And we have to process this data constantly.

To compare computer intelligence to human intelligence and suggest a computer could be more intelligent is very dismissive of just how complex the human brain is.

We can't even make an algorithm that makes a robot as autonomous as a human; we're far from doing it, too.

Programs can't fix themselves, as we CAN.

They can't upgrade themselves, as we can. We can choose to learn or not to learn, and choose how to learn.

Grab the best doctors, the most brilliant scientists and the most skilled engineers, put them together to build a robot, and you still won't get a "machine" that could ever be more brilliant than its creators.

The company Google bought has done something neat, but it is still far from human intelligence.



#47 Álvaro   Crossbones+   -  Reputation: 13658


Posted 31 January 2014 - 01:05 PM


When Louie said that we can hold more data than computers, I'd agree. The amount of sensory data that even an uneducated person takes in just by design is far more than computers will be able to take in. And we still have to process this data constantly.

 

I repeat the challenge: Design a test of memory prowess where you think you can do better than a laptop.



#48 samoth   Crossbones+   -  Reputation: 4926


Posted 31 January 2014 - 02:00 PM


I repeat the challenge: Design a test of memory prowess where you think you can do better than a laptop.
Oh darn you, now you are compelling me to prove something that's so obviously wrong on all accounts. :D

 

But the way this challenge is worded, it actually works out:

 

I can remember events from my childhood. That's 40 years. My grandparents can remember things from theirs, that's 85 years.

 

A laptop battery doesn't live much longer than 3-4 years, and neither does a laptop's hard disk. Make that 10 years if you are very lucky. Few computers have an uptime of more than a few months (a few servers manage 5-10 years), few computers older than 20-30 years are still in service, and no computer older than 85 years is in service.

 

---> beaten the machine on long-term memory prowess

 

Now of course one could argue "but M-Disc lasts 1000 years". Alas, that's the manufacturer's advertising gag; we will know in 1000 years (or rather, we will not). What we do know is that DVDs definitely don't last 1000 years.



#49 Nypyren   Crossbones+   -  Reputation: 4498


Posted 31 January 2014 - 03:15 PM

I repeat the challenge: Design a test of memory prowess where you think you can do better than a laptop.

Oh darn you, now you are compelling me to prove something that's so obviously wrong on all accounts. :D
 
I can remember events from my childhood. That's 40 years. My grandparents can remember things from theirs, that's 85 years.


Both of you are oversimplifying the situation. If you only pick a *single* requirement, either the human or the computer can be made to beat the other at anything! The reality, though, is that both humans and computers have to fulfill a LOT of requirements at the same time. Human requirements are dictated by nature; computer requirements are dictated by humans.

Just as a computer loses its data to software bugs, media decay or hardware failures, humans forget things naturally all the time: do you remember everything you learned from every lesson in school? How long can you remember things that you never use?

#50 conq   Members   -  Reputation: 354


Posted 31 January 2014 - 04:08 PM

 


I repeat the challenge: Design a test of memory prowess where you think you can do better than a laptop.

Oh darn you, now you are compelling me to prove something that's so obviously wrong on all accounts. :D [...] I can remember events from my childhood. That's 40 years. My grandparents can remember things from theirs, that's 85 years. [...] ---> beaten the machine on long-term memory prowess

 

Can you? That's impressive! Tell me, what was the exact shade of color one-eighth of the way down from the top of your field of view? What were the exact dimensions of the blades of grass around you?

 

What we "remember" as vivid isn't as vivid as we actually think.

 

Can you pick out 4 thousand events that have happened in your life in detail?

 

You remember listening to music? Neat, what were you doing with your right foot's big toe at the start of the guitar solo?

 

Just think about it this way: try to recall about an hour of vivid memories from every year since you were born. Do you think a computer could store double what you can recall?



#51 Nathan2222_old   Members   -  Reputation: -400


Posted 01 February 2014 - 10:00 AM

I think the thread name answers the question. Machines will never be human; they may have almost all the attributes of a human, but they will never be human.

UNREAL ENGINE 4:
Total LOC: ~3M Lines
Total Languages: ~32
--
GREAT QUOTES:
I can do ALL things through Christ - Jesus Christ
--
Logic will get you from A-Z, imagination gets you everywhere - Albert Einstein
--
The problems of the world cannot be solved by skeptics or cynics whose horizons are limited by the obvious realities. - John F. Kennedy


#52 LennyLen   Crossbones+   -  Reputation: 3918


Posted 01 February 2014 - 11:29 AM

I think the thread name answers the question. Machines will never be human; they may have almost all the attributes of a human, but they will never be human.

 

That in no way answers the question about whether it is possible to design a machine that surpasses human intelligence.



#53 Nathan2222_old   Members   -  Reputation: -400


Posted 01 February 2014 - 04:32 PM

I think the thread name answers the question. Machines will never be human; they may have almost all the attributes of a human, but they will never be human.


That in no way answers the question about whether it is possible to design a machine that surpasses human intelligence.
Try recreating, in less than a microsecond, the function the eye and brain perform. Could you squeeze up to 1,000,000 microchips into a plate thinner than a millimeter?
You could try making a machine that's smarter than a child.
It took an 83,000-processor supercomputer 40 minutes to simulate one second of 1% of the brain's activity (over 220,000 PCs to approximate what a single, small organ does in less than a second, every day for the rest of its life).
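Taking those widely reported figures at face value, and assuming purely linear scaling (a strong assumption in itself), the naive arithmetic behind that claim looks like this:

```python
# Widely reported figures for the 2013 K-computer run: ~83,000
# processors, 40 minutes of wall-clock time to simulate 1 second
# of activity in a network roughly the size of 1% of the brain.
wall_seconds = 40 * 60        # 2400 s of wall time...
sim_seconds = 1               # ...per simulated second
whole_brain_multiple = 100    # scale 1% of the brain up to 100%

slowdown = wall_seconds // sim_seconds
realtime_shortfall = slowdown * whole_brain_multiple

print(slowdown)            # 2400x slower than real time
print(realtime_shortfall)  # 240000x: the naive compute shortfall for
                           # simulating a whole brain in real time
```

So even under generous linear scaling, the gap to a real-time whole-brain simulation is five orders of magnitude of compute.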

Yes, a human can't solve mathematical problems as fast as a computer, but that's because the human brain is a massive general-purpose organ: it efficiently, effectively and intelligently manages all the body's numerous systems. Try simulating, with any single computer (super or otherwise), in one second what the brain does in a second, and see what happens.

If the brain were to do just mathematical problems, or any other single task, then comparing it to any computer would be like comparing a Bugatti Veyron to a snail.

You've never seen a human suffer from Low Memory Syndrome (LMS). Soak a computer in water for 30 minutes and see.

Whenever you do make a machine that can do all of this, try making it as human-like: it shouldn't suffer memory overload, overheating, runtime errors, hanging/crashing, system shutdowns and so on because it's using too many parts. It may then be possible to think about making something better.
From 2050-2100-whenever.



#54 LennyLen   Crossbones+   -  Reputation: 3918


Posted 01 February 2014 - 08:11 PM

Machines with human intelligence are not going to be based on current technology. A human being *is* a machine, created by random evolution. If nature can come up with human intelligence by accident, we can eventually improve on it by design, if enough resources are spent on the problem (i.e., do we actually want machines that can think?).

 

That does require working out exactly how the brain functions though, which is where the difficulty lies. 

 


Soak a computer in water for 30 minutes and see.

 

What does that have to do with anything? Try running 240V through a human for 10 days straight and see how they function. Blathering on about transistors and human eye function adds nothing to the conversation; it's just repeating what's already been said.

 


You could try simulating in one second what the brain does in a second with any single computer (super or otherwise) and see what happens.

 

The only reason we can't simulate the brain effectively is because we don't understand it.



#55 ActiveUnique   Members   -  Reputation: 837


Posted 01 February 2014 - 10:10 PM

Does anyone here think human intelligence is overrated, and that it's just a matter of months until someone finds the right algorithm, and that with just an Intel i7 and some GBytes of memory we can surpass human intelligence after running the algorithm for some months?

 

prove me wrong

  1. It would take more than a few months to program the algorithm, especially if it required gigabytes of code to start running. Although if you're just curious whether it could take someone putting together the correct idea: yes, one person is enough.
  2. Programs that benefit from months of human computation are the closest thing we have to a program with human intelligence, however incomplete. There is no measure of the amount of human intelligence required to fulfill the logical calculations: solving the problems for a program that doesn't know the answers, correcting mistakes that were inevitably introduced along the way. That seems underrated.
  3. Something I read while randomly wandering around as a total noob at programming: if a program rewrote its own kernel, that program would stop working. Eventually it'd make a permanent change that caused bugs, and even if it could debug itself, the debugger would have bugs.

 

Pre-empted by a month in my journal: http://www.gamedev.net/blog/1780-theoretical-games-that-evolve-from-player-input/


I've read about the idea guy. It's a serious misnomer. You really want to avoid the lazy team.


#56 Tutorial Doctor   Members   -  Reputation: 1650


Posted 01 February 2014 - 10:15 PM

Agree. If I wanted to calculate 1000000! I wouldn't ask a mathematician to start sharpening his pencil, I'd write a library and I'd still beat the human.
 
EDIT: Even if the library was inefficient.


You are the human. And yes, "you" (a human) would have to write it.
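The factorial point in the quote is easy to demonstrate: Python's arbitrary-precision integers make even enormous exact factorials routine. The `n` below is a fast-running stand-in for the 1000000! in the quote:

```python
import math
import sys

# CPython 3.11+ caps int->str conversion size by default; raise the
# cap so the digits of the exact result can be counted.
if hasattr(sys, "set_int_max_str_digits"):
    sys.set_int_max_str_digits(10_000_000)

n = 100_000
digits = len(str(math.factorial(n)))   # exact value, no rounding
print(digits)                          # 456574 digits
```

No mathematician with a sharpened pencil gets anywhere near a 456,574-digit exact result in seconds, which was exactly the quoted poster's point.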



#57 BHXSpecter   Members   -  Reputation: 1626


Posted 01 February 2014 - 10:21 PM

LennyLen, he does this regularly. It seems like he just reads the first sentence and then bases his whole reply on that. He comes across as a borderline troll sometimes.

 

As for AI in machines surpassing humans, all I have to say is that what was science fiction in the past has become science fact. I wouldn't knock anything. The problem of a Skynet scenario would become very real, though, because if you make the AI smart enough to learn, it could learn how to bypass its safety protocols (become self-aware). As for whether a machine can be made that surpasses a human's intelligence: it isn't a matter of if or can, but rather when.


"Through vengeance I was born. Through war I was trained. Through love I was found. Through death I was released. Through release I was given a purpose."


#58 samoth   Crossbones+   -  Reputation: 4926


Posted 02 February 2014 - 09:53 AM

Can you? That's impressive! Tell me, what was the exact shade of color one-eighth of the way down from the top of your field of view? What were the exact dimensions of the blades of grass around you?

 

Sigh. This had to come.

 

Sadly, it only shows that you didn't read my post properly, nor do you understand (or you deliberately pretend not to understand) how human perception or the human mind works in any way.

 

It is obvious that even a below-average human's intelligence is superior to very advanced artificial intelligence, but it is also obvious that the human ability to memorize quantifiable data is negligible compared to a computer's. I would most certainly fail trying to memorize the first 100,000 primes even if you gave me 3 months. To a cellphone-sized computer, this is no challenge. However, computers likewise fail at pathetically trivial tasks. Such a comparison is largely meaningless.
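The primes example is easy to make concrete. A minimal Sieve of Eratosthenes (the bound of 1,400,000 is chosen because the 100,000th prime is 1,299,709) "memorizes" all 100,000 primes, exactly, in well under a second:

```python
def first_n_primes(n, limit=1_400_000):
    """Sieve of Eratosthenes; `limit` must exceed the n-th prime."""
    sieve = bytearray([1]) * (limit + 1)   # 1 = "possibly prime"
    sieve[0] = sieve[1] = 0
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            # Cross off every multiple of p starting at p*p.
            sieve[p * p :: p] = bytearray(len(range(p * p, limit + 1, p)))
    primes = [i for i, is_prime in enumerate(sieve) if is_prime]
    return primes[:n]

primes = first_n_primes(100_000)
print(primes[0], primes[-1])   # 2 1299709
```

The machine's recall of those 100,000 values is then perfect and instantaneous, which is the asymmetry being described: flawless rote storage, useless at the tasks humans find trivial.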

 

My post said that a test of memory prowess where the human beats a laptop is a very unsuitable way to demonstrate the superiority of one over the other. The challenge is so simple that it invites "proving" the obviously wrong conclusion: that humans are better at remembering things (as "proven" by my example).

 

The "proof" by child memories still stands.

 

No, I don't remember the exact shade of some pixels on my retina 30 years ago, or the number of grass blades anywhere. The reason is that my visual organs have no concept of pixels or exact shades, and neither does my brain. Besides, a computer is not able to reliably answer the question "how many grass blades are in this image" either, even without having to remember the number, and even when it explicitly tries to (unlike me, who explicitly tries not to remember that information).

 

My brain, like the vast majority of human brains, receives a pre-integrated, contrast-enhanced (horizontal/bipolar cells) and modulated (ganglion cells), fault-corrected signal coming from a very non-uniform sample grid with a very non-uniform color reception and a very non-objective automatic luminance regulation. Plus, superposition of two images from different viewpoints combined in one.

The brain somehow transforms this... stuff... into something, which it selectively filters for information that is important for the present situation. That is what I "see". It is not an array of pixels of some particular shade, not even remotely.

 

This is a key to survival and to managing everyday situations. The brain then selects what part of this information (and other information) is important for the situation, and how much of it, if any, is worth remembering. This involves several circular propagations through a more or less hardwired system, attenuated or amplified by some metric which somehow involves emotions and olfactory senses and some "recipe" which so far nobody understands. There are several "layers" of storage (not just short-term and long-term memory) as well. That is what I "remember".

 

It works the same for all "properly working" humans.

 

Trying to compare this process to image data as picked up by a camera and stored in a computer is meaningless. It's like comparing a cow's ability to fly an airplane with a pair of scissors' ability to produce eggs.

 

No, I probably can't remember 4,000 events either, though maybe I could; who knows. My memories are not stored in an array, and I am not counting them, so it is hard to tell how many there are. However, it is also meaningless to try to find out. Human memory, like perception, is highly selective in what it stores (at least in "properly working" humans; a few individuals exist for whom this isn't the case, and they are seriously troubled every moment of their everyday lives). This property is essential for survival. The brain is supposed not to store all information; this is by design.

On the other hand, it is also highly fault-tolerant. You are still able to properly identify most things almost all of the time if you acquire a retina defect later in life (provided it's not a 100% defect). Humans can still perform this task rather trivially, with a very low error rate, having lost one eye completely and upwards of 50% of the remaining eye. Try to make a computer match data with a noise ratio upwards of 75%. Or try Google's "similar images" search and see what you get, for that matter.

 

It is however meaningless how much of my eyesight I could lose, whether or not I can remember 400 or 4,000 or 40,271 events in my life, or whether I can remember some particular shade of some color. A computer is entirely unable to reproduce most of this kind of memory either way, so there is no base for comparison in the first place.

 

A computer could, however, conceivably reproduce a memory (or a ruleset, or other information) such as "fire is hot", "hot not good for your hands", or "things you drop fall to the ground", or "eggs only have limited support for microwaving", or "you can put a sphere into a circular hole".

 

These basic rules/patterns/facts are all things which most people learn in childhood. Also, they are things that not only the most advanced human, but even humans which are of quite sub-average intelligence reliably remember to the end of their lives.

Like most children, I had to learn multiplication tables in school. Unfortunately, all present-day computers have arithmetic hardwired, so this isn't very suitable for a "memory" comparison (but maybe you can still find a functional Z80?). If it were, my grandfather would still win, since there is no 85-year-old computer in service (and certainly there are, worldwide, fewer than a handful of computers older than 20-25 years in uninterrupted service without replaced hard disks, etc.).

 

Being able to remember a single event/fact/ruleset over 40/80/100 years shows "superiority" over the computer according to the given challenge, since 1 > 0, and so far hardly any computer can remember anything from 40 years ago (if at all), and none can remember anything from 60, 80 or 100 years ago. But even leaving aside the fact that computers haven't existed that long, the most advanced computer isn't nearly as capable as a very sub-average human, and it definitely has not remained (and will not remain) functional nearly as long as the average human (not without replacing the "brain" and restoring data from backup anyway, which is cheating).


Edited by samoth, 02 February 2014 - 09:54 AM.


#59 samoth   Crossbones+   -  Reputation: 4926


Posted 02 February 2014 - 11:31 AM

To go deeper into detail why such comparisons are meaningless, consider the following:

 

In A.J. Hanson's Visualizing Quaternions book, there is an example he refers to as the "urban legend" of an upside-down F-16. According to the legend, the onboard computer would turn the airplane upside down when crossing the equator because the sign of the latitude flipped. The author says he could not find a reference as to whether this actually happened, or whether it only happened in simulations (hence "legend").

It makes no difference whether it happened for real or only in a simulation (same thing to the computer!). The point is that an intelligent being would be immediately aware that turning the airplane upside down for no apparent reason (and in defiance of the visible horizon and the gyroscope) is a nonsensical decision, and that something must be wrong.
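The failure mode in that legend is easy to sketch. The toy below is entirely hypothetical (not modeled on any real avionics); it only shows how folding the hemisphere sign of the latitude into a control computation makes the commanded output flip discontinuously at the equator, even though nothing physical changed:

```python
# Hypothetical toy, NOT real flight software: a controller that
# derives part of its attitude frame from the sign of the latitude.
def naive_bank_command(latitude_deg, desired_bank_deg=30.0):
    hemisphere = 1.0 if latitude_deg >= 0 else -1.0   # the buggy step
    return desired_bank_deg * hemisphere

print(naive_bank_command(0.001))    # 30.0
print(naive_bank_command(-0.001))   # -30.0: command flips at the equator
```

A human pilot would instantly see the inverted command as nonsense; the program happily executes its sign convention.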

This legend is very similar to an actual event in which a civil airplane cut a trench a couple of hundred meters long through a forest at the airplane type's first public demonstration.
The initial story was that the pilot performed a show-off maneuver which went slightly beyond the allowed tolerances, and when he pulled the stick, nothing happened. The onboard computer had deemed the maneuver bad for the airplane (of course, crashing into a forest isn't precisely good for it either, but the computer failed to see that). This was later settled by an official statement, backed by the (presumably well-paid) pilot, which attributed it to a mere "piloting error".

Both events are examples of how being able to perform calculations and being intelligent are not the same thing.

 

The same can be said about pattern matching. Computers are much better at finding a fingerprint in a database than a human would be. They are also much better at picking out a person's face in a crowd.

However, the police still have every "hit" verified by a human, and biometric passports must be sourced from photographs with a very specific layout and very exact placement. Why is this the case?

 

The reason is simple: computers are not better at the job. They are faster at doing calculations. They are thus better at finding some statistical match out of a large number of samples, given a precise human-made metric and well-chosen comparison patterns. Their results may or may not correlate with an actual match.

Every so often (and often enough to be significant), the computer will report a match where the human reviewer immediately sees that the match is total bollocks. Similarly, the computer only achieves reasonably good output when given high-quality, standardized patterns to match against.

 

Average humans are not able to match ten thousand faces per second, but they are able to identify or recognize another human from very deficient input patterns with a surprisingly low error rate. Women especially are exceedingly good at face-matching (don't ask me why; someone might come up with a hunter-vs-breeder evolutionary theory, but since sex depends on merely one chromosome, I'd wager that unless face recognition is coded on the X chromosome, there's hardly a way this could be the reason).

Either way, try to have a computer recognize a face from a 30° angle above front view when it has only seen that person from the side. Or try to get a positive recognition of someone looking away in almost the opposite direction. Women still get it 99% right even in absurdly bad conditions (and they do it without having trained on the task in particular, without someone else writing a specialized "program" for them to handle that border case).


Edited by samoth, 02 February 2014 - 11:34 AM.


#60 Tutorial Doctor   Members   -  Reputation: 1650


Posted 02 February 2014 - 11:44 PM

Good post, Samoth. I acknowledge computers are faster and more efficient at some things than humans are. But as far as being "more intelligent" goes, they just don't come close. Perhaps we should have first sorted out a definition of "intelligence" we can all agree on.


Edited by Tutorial Doctor, 02 February 2014 - 11:45 PM.





