Hi. I just had an idea that I would like to share. Please tell me what you think.
As we all know, if we have a neural net, there is a limit to how much it can learn; past that, the net screws up. It's calculus, can't argue with that.
Think of our mind. You can separate it into consciousness and subconsciousness. When we sleep, some garbage data is lost, some is transferred from consciousness to subconsciousness, etc.
So, why not use it in games? I am not sure of the architecture, but you could make every character sleep for some period of time. During this time, the AI would retrain the network, transfer data based on the network architecture, etc.
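To make the idea concrete, here's a toy Python sketch of the sleep cycle. Nothing here comes from any real engine; the class name, the one-weight "net", and the pruning rule are all invented for illustration:

```python
import random

# Hypothetical sketch: a game character that "sleeps" every N ticks.
# During the day it only records experiences; at night it prunes its
# replay buffer and retrains a toy one-weight model on what's kept.

class SleepingCharacter:
    def __init__(self, day_length=10, buffer_cap=50):
        self.day_length = day_length
        self.buffer_cap = buffer_cap
        self.buffer = []        # (input, target) experiences gathered while awake
        self.weight = 0.0       # a one-parameter "network", for illustration only
        self.tick = 0

    def observe(self, x, y):
        """Record an experience during the 'day'; sleep when the day ends."""
        self.buffer.append((x, y))
        self.tick += 1
        if self.tick % self.day_length == 0:
            self.sleep()

    def sleep(self):
        """Night phase: forget the oldest junk, retrain offline on the rest."""
        self.buffer = self.buffer[-self.buffer_cap:]
        for _ in range(100):                    # brief offline retraining
            x, y = random.choice(self.buffer)
            error = self.weight * x - y
            self.weight -= 0.1 * error * x      # gradient step on squared error

random.seed(0)
c = SleepingCharacter()
for _ in range(50):
    x = random.uniform(-1, 1)
    c.observe(x, 2.0 * x)       # the 'world' follows y = 2x
print(round(c.weight, 1))       # → 2.0, the nightly retraining finds the slope
```

The point is only the scheduling: learning happens in bursts while the character is "asleep", not continuously while it acts.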
Not a bad interpretation. I'm not sure about the conscience part though. After all, what is a subconscience?
I wonder if a neural net needs to train all the time anyway. Selective training might be more fruitful.
Perhaps we should try something like the Borg on Star Trek. We could have a lot of different neural nets training in different ways and use some sort of genetic algorithm to weed out the bad traits and then transfer the good ones among the other neural nets. Maybe we should treat small neural nets as single lifelike organisms and let them work in conjunction with one another in some organized way?
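A rough sketch of that "Borg" idea in Python: a population of tiny one-weight "nets" evolved with a toy genetic algorithm. The fitness function, the task (fit y = 3x), and the crossover rule are all assumptions made up for this example:

```python
import random

def fitness(w, samples):
    """Negative squared error of a one-weight net on the samples."""
    return -sum((w * x - y) ** 2 for x, y in samples)

def evolve(pop, samples, mutation=0.1):
    """One generation: keep the best half, refill with mutated crossovers."""
    pop = sorted(pop, key=lambda w: fitness(w, samples), reverse=True)
    survivors = pop[: len(pop) // 2]
    children = []
    while len(survivors) + len(children) < len(pop):
        a, b = random.sample(survivors, 2)
        child = (a + b) / 2 + random.gauss(0, mutation)  # crossover + mutation
        children.append(child)
    return survivors + children

random.seed(1)
samples = [(x / 10, 3 * x / 10) for x in range(-10, 11)]  # the task: y = 3x
population = [random.uniform(-5, 5) for _ in range(20)]
for _ in range(40):
    population = evolve(population, samples)
best = max(population, key=lambda w: fitness(w, samples))
print(round(best))  # → 3
```

Selection "weeds out the bad traits" and crossover "transfers the good ones" between nets, exactly as described; with real multi-weight nets you'd cross over weight vectors instead of single numbers.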
Ok, my neural net says eat more Pringles after editing.
Nick, how about setting up the net's architecture so that there would really be two nets: one for short-term memory, and the other for long-term. During the "day" the net learns all the data "into" the short-term memory. During the "night", based on some algorithm, the important data is filtered out and moved "into" the long-term memory. Then the short-term memory is wiped blank. This way, when the net wakes up, it will be able to learn new things again, and at the same time the important data will be collected and the old junk will be erased.
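Here's one way that two-net day/night scheme could look in Python. The filtering rule (keep only samples the short-term net reproduces well), the class name, and the one-weight "nets" are all invented for the sketch, not taken from any known architecture:

```python
import random

class DayNightLearner:
    def __init__(self, lr=0.1):
        self.short_w = 0.0    # short-term one-weight net, wiped every night
        self.long_w = 0.0     # long-term one-weight net, never reset
        self.day_buffer = []
        self.lr = lr

    def day_learn(self, x, y):
        """Daytime: train only the short-term net and log the sample."""
        self.day_buffer.append((x, y))
        self.short_w -= self.lr * (self.short_w * x - y) * x

    def night_consolidate(self, keep_threshold=0.1, passes=50):
        """Night: replay well-learned samples into long-term, wipe short-term."""
        important = [(x, y) for x, y in self.day_buffer
                     if abs(self.short_w * x - y) < keep_threshold]
        for _ in range(passes):
            for x, y in important:
                self.long_w -= self.lr * (self.long_w * x - y) * x
        self.day_buffer = []   # erase the old junk
        self.short_w = 0.0     # blank slate, ready to learn new things

random.seed(0)
learner = DayNightLearner()
for day in range(3):
    learner.day_learn(0.8, -4.0)          # one spurious "junk" experience
    for _ in range(100):
        x = random.uniform(0.5, 1.0)
        learner.day_learn(x, 2.0 * x)     # the stable fact: y = 2x
    learner.night_consolidate()
print(round(learner.long_w, 1))  # → 2.0: the fact survived, the junk did not
```

The junk sample never fits the short-term net well, so the nightly filter drops it; the consistent pattern gets replayed into long-term memory, and each morning the short-term net starts blank, just as described above.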
This idea is known in neural network theory. It's called the elasticity and plasticity of the network. Unfortunately I can't remember off-hand the name of a network that uses it. The concept is that the primary task of a network, or long-term memory, is referred to as its plasticity, and its ability to adapt to short-term changes as its elasticity. Short-term changes can be adapted into the long-term memory. Sorry I can't provide more information, but it's been a while since I read this. Anyone else out there know of this network?