Can you? That's impressive! Tell me, what was the exact shade of color exactly 1/8th from the top of your eyesight radius? What were the exact dimensions of the blades of grass around you?
Sigh. This had to come.
Sadly, it only shows that you didn't read my post properly, nor do you understand (or you deliberately pretend not to understand) how human perception or the human mind works in any way.
It is obvious that even a below-average human's intelligence is superior to very advanced artificial intelligence, but it is equally obvious that the human ability to memorize quantifiable data is negligible compared to a computer's. I would most certainly fail at memorizing the first 100,000 primes even if you gave me three months. To a cellphone-sized computer, this is no challenge. However, computers likewise fail at pathetically trivial tasks. Such a comparison is largely meaningless.
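Just to put that in perspective, here is a minimal sketch (my own illustration, not part of the original challenge) of how little effort this is for a computer: a plain Sieve of Eratosthenes in Python finds and stores the first 100,000 primes in a fraction of a second. The 100,000th prime happens to be 1,299,709, so sieving up to 1,300,000 is enough.

```python
# Minimal sketch: "memorizing" the first 100,000 primes with a plain
# Sieve of Eratosthenes. The 100,000th prime is 1,299,709, so a limit
# of 1,300,000 is sufficient.

def first_primes(count, limit=1300000):
    sieve = bytearray([1]) * (limit + 1)
    sieve[0] = sieve[1] = 0
    for n in range(2, int(limit ** 0.5) + 1):
        if sieve[n]:
            # Cross out all multiples of n, starting at n*n.
            sieve[n * n::n] = bytearray(len(sieve[n * n::n]))
    primes = [n for n, is_prime in enumerate(sieve) if is_prime]
    return primes[:count]

primes = first_primes(100000)
print(len(primes), primes[-1])  # 100000 1299709
```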
My post said that a memory-prowess challenge in which the human beats a laptop is very unsuitable for demonstrating the superiority of one over the other. The challenge is so simple that it tempts you into "proving" something that is obviously wrong: that humans are better at remembering things (as "proven" by my example).
The "proof" by child memories still stands.
No, I don't remember the exact shade of some pixels on my retina 30 years ago or the number of grass blades anywhere. The reason is that my visual organs have no concept of pixels or of exact shades, and neither does my brain. Besides, a computer is not able to reliably answer the question "how many grass blades are in this image" either, even without having to remember the number, and even when it explicitly tries to (unlike me, who explicitly tries not to remember that information).
My brain, like the vast majority of human brains, receives a pre-integrated, contrast-enhanced (horizontal/bipolar cells), modulated (ganglion cells), fault-corrected signal coming from a very non-uniform sample grid with very non-uniform color reception and very non-objective automatic luminance regulation. Plus, it is a superposition of two images from different viewpoints combined into one.
The brain somehow transforms this... stuff... into something that it selectively filters for information that is important in the present situation. That is what I "see". It is not an array of pixels of some particular shade, not even remotely.
This is key to survival and to managing everyday situations. The brain then selects which part of this information (and other information) is important for the situation and how much of it, if any, is worth remembering. This involves several circular propagations through a more or less hardwired system, attenuated or amplified by some metric which somehow involves emotions and olfactory senses and some "recipe" which so far nobody understands. There are several "layers" of storage (not just short-term and long-term memory) as well. That is what I "remember".
It works the same for all "properly working" humans.
Trying to compare this process to image data as picked up by a camera and stored in a computer is meaningless. It's like comparing a cow's ability to fly an airplane with a pair of scissors' ability to lay eggs.
No, I probably can't remember 4,000 events either, though maybe I could, who knows. My memories are not stored in an array, and I am not counting them, so it is hard to tell how many there are. However, it is also meaningless to try to find out. Human memory, just like perception, is highly selective about what is stored (at least in "properly working" humans; the few individuals for whom this isn't the case are seriously troubled every moment of their everyday lives). This property is essential for survival. The brain is not supposed to store all information; this is by design.
On the other hand, it is also highly fault-tolerant. You are still able to properly identify most things almost all the time if you acquire a retina defect later in life (provided it's not a 100% defect). Humans can still perform this task rather trivially and with a very low error rate after losing one eye completely and upwards of 50% of the remaining one. Try to make a computer match data with a noise ratio upwards of 75%. Or try Google's "similar images" search and see what you get, for that matter.
It is, however, meaningless how much of my eyesight I could lose, whether I can remember 400 or 4,000 or 40,271 events in my life, or whether I can remember some particular shade of some color. A computer is entirely unable to reproduce most of this kind of memory either way, so there is no basis for comparison in the first place.
A computer could, however, conceivably reproduce a memory (or a ruleset, or other information) such as "fire is hot", "hot not good for your hands", or "things you drop fall to the ground", or "eggs only have limited support for microwaving", or "you can put a sphere into a circular hole".
These basic rules/patterns/facts are all things which most people learn in childhood. They are also things that not only the most advanced human, but even humans of quite sub-average intelligence, reliably remember to the end of their lives.
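To make that concrete, here is a hedged sketch of what storing such a ruleset could look like for a computer: nothing but hypothetical key/value facts I made up for illustration.

```python
# Hypothetical illustration: childhood-level "memories" stored as plain data.
# Storing such facts is trivial for a computer; knowing when and how to
# apply them in the real world is the hard part.
facts = {
    "fire": {"hot": True, "safe_to_touch": False},
    "dropped_object": {"falls_to_ground": True},
    "egg": {"microwave_safe": False},
    "sphere": {"fits_in_circular_hole": True},
}

def recall(thing, attribute):
    # Look up a stored "memory" about a thing, if we have one.
    return facts.get(thing, {}).get(attribute)

print(recall("fire", "hot"))            # True
print(recall("egg", "microwave_safe"))  # False
```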
Like most children, I had to learn multiplication tables in school. Unluckily, all present-day computers have arithmetic hardwired, so this isn't very suitable for a "memory" comparison (but maybe you can still find a functional Z80?). Even if it were, my grandfather would still win, since there is no 85-year-old computer in service (and certainly there are, worldwide, less than a handful of computers older than 20-25 years in uninterrupted service, without replaced hard disks etc.).
Being able to remember a single event/fact/ruleset over 40/80/100 years will show "superiority" over the computer according to the given challenge, since 1 > 0: so far hardly any computer can remember anything from 40 years ago (if at all), and none can remember anything from 60, 80, or 100 years ago. But even leaving aside the fact that computers haven't existed for that long, the most advanced computer isn't nearly as capable as a very much sub-average human, and it definitely has not been, and will not remain, functional nearly as long as the average human (not without replacing the "brain" and restoring data from backup anyway, which is cheating).