How to: Time-Based Learning

Started by
6 comments, last by Prozak 21 years, 7 months ago
Hi all, using neural networks, how does one construct a NNet that has the ability to learn time-based patterns? For example: there will always be a signal on input 1 every 6 cycles. Does one have to change the weight/threshold approach and use another type of neuron? Thanks for any tip on this,

[Hugo Ferreira][Positronic Dreams][Stick Soldiers]
"Redundant book title: "Windows For Dummies".
"Camouflage condoms: So they won''t see you coming".

If I interpret your desires correctly, you want to construct a recurrent neural network for learning an autoregressive process (a time series of data). There's plenty of literature on this subject available on the net.

Good luck,

Timkin
wow! that was really unhelpful... geeez!

[Hugo Ferreira][Positronic Dreams][Stick Soldiers]
"Redundant book title: "Windows For Dummies".
"Camouflage condoms: So they won''t see you coming".

I just want a glimpse, a short description of how other AI scientists achieve this: how to make a single neuron, or a small group of neurons, learn patterns that are recurrent in time...

Thanks for any short description, or a really clear link...

[Hugo Ferreira][Positronic Dreams][Stick Soldiers]
"Redundant book title: "Windows For Dummies".
"Camouflage condoms: So they won''t see you coming".

What Timkin said is really helpful. Did you want one of us to write you a tutorial on the subject or did you expect the answer to be just a few sentences long?

Do a search on Google for "recurrent neural network time series". You'll get 12,000+ hits.

(a recurrent network btw has links that go backwards as well as the usual feedforward connections)
I'll attempt the "few sentences long" answer.

Basically, with a feed-forward network (I'm assuming you have some knowledge of this NN architecture) you show your set of inputs to the network and out pops the resulting output (or outputs).

With a recurrent network, one (or some) of the node outputs that usually 'feed forward' to the next layer (closer to the output layer) are instead sent back to a previous layer.

In this we have a representation of time, because in the next epoch these 'feed-backward' nodes hold some measure of the state of the network from the previous epoch.

This probably doesn't help at all but hey, it is only a few sentences!
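For what it's worth, here is a tiny forward-pass sketch in Python of that 'feed-backward' idea, assuming an Elman-style layout where context units hold a copy of the previous hidden activations. The code and names are my own illustration, not anything from this thread, and it leaves training out entirely:

import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 1, 4, 1

# Weights: (input + context) -> hidden, and hidden -> output
W_xh = rng.normal(scale=0.5, size=(n_in + n_hidden, n_hidden))
W_hy = rng.normal(scale=0.5, size=(n_hidden, n_out))

context = np.zeros(n_hidden)            # state carried over from the previous step

def step(x):
    """One forward pass; the hidden activations become the next step's context."""
    global context
    z = np.concatenate([x, context])    # current input plus fed-back state
    hidden = np.tanh(z @ W_xh)
    context = hidden                    # this is the 'feed-backward' link
    return hidden @ W_hy

for t, x_t in enumerate([0.0, 0.0, 1.0, 0.0, 0.0]):
    print(t, step(np.array([x_t])))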

purplenose.com
There are basically two ways to create a time series predictor. Neither is based on messing around with single neurons; the difference is in the network architecture.

Solution 1: tapped delays.
You can use more past values of your signal as input for a network. Suppose you have a time series x(0), x(1), ..., x(k). Assume that a history of five is enough; then you can take x(k), x(k-1), x(k-2), x(k-3) and x(k-4) as the inputs of a network with 5 inputs. The network should give x(k+1) as its output. So you train the network by taking each run of 5 successive values of x as the input and the next value as the target. The network itself will form a nonlinear autoregressive model of the process that generated the series. The drawback of this approach is that the number of inputs, and therefore also the number of weights, increases. This raises the probability of overfitting on short sequences.
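To make that concrete, here is a rough Python/NumPy sketch of the tapped-delay approach, using a window of 5 taps and the original poster's "signal every 6 cycles" example. The helper name make_windows and the tiny two-layer network are my own choices for illustration, not anything prescribed above:

import numpy as np

rng = np.random.default_rng(1)

# The original poster's example: a spike on the input every 6 cycles.
series = np.array([1.0 if t % 6 == 0 else 0.0 for t in range(300)])

def make_windows(x, taps=5):
    """Inputs are x(k-4)..x(k); the target is x(k+1)."""
    X = np.array([x[k - taps + 1 : k + 1] for k in range(taps - 1, len(x) - 1)])
    return X, x[taps:][:, None]

X, y = make_windows(series)

# Tiny feed-forward net: 5 inputs -> 8 hidden (tanh) -> 1 output
W1 = rng.normal(scale=0.5, size=(5, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

lr = 0.3
for epoch in range(5000):
    h = np.tanh(X @ W1 + b1)            # forward pass
    out = h @ W2 + b2
    d_out = 2.0 * (out - y) / len(X)    # gradient of mean squared error
    dW2 = h.T @ d_out; db2 = d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * (1.0 - h ** 2)
    dW1 = X.T @ d_h;   db1 = d_h.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# In this signal a window of five zeros always precedes a spike,
# so this prediction should head towards 1.0 as training converges.
window = series[-5:][None, :]
print((np.tanh(window @ W1 + b1) @ W2 + b2).item())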

Solution 2: Recurrent network.
The output of the network can be used as an input, but you cannot use the current output (because that is what you want to compute). What you do instead is take the previous output of the network as an input. For example, where y is the output of the network, you would take x(k) and y(k-1) as inputs and the output should be y(k). The network will form a nonlinear state-transition model. The drawback is that training becomes more involved: if you use backprop and arrive at the input layer, you will find an output being used as an input. This means that you have to continue backpropping over the layers for the past values as well (backprop through time).
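Here is a rough Python/NumPy sketch of that bookkeeping (my own illustration, not code from this post). To keep it short, the gradient is cut off at the fed-back input, i.e. y(k-1) is treated as a constant at each step; full backprop through time, as described above, would also push the error back through that feedback path into earlier steps:

import numpy as np

rng = np.random.default_rng(2)
series = np.array([1.0 if t % 6 == 0 else 0.0 for t in range(300)])   # any 1-D series works here

n_hidden = 8
W1 = rng.normal(scale=0.5, size=(2, n_hidden)); b1 = np.zeros(n_hidden)   # inputs: [x(k), y(k-1)]
W2 = rng.normal(scale=0.5, size=(n_hidden, 1)); b2 = np.zeros(1)
lr = 0.05

for epoch in range(100):
    y_prev = 0.0                                  # network output from the previous step
    for k in range(len(series) - 1):
        z = np.array([series[k], y_prev])         # x(k) and y(k-1) as inputs
        h = np.tanh(z @ W1 + b1)
        y_hat = (h @ W2 + b2).item()              # y(k), the prediction of x(k+1)
        d_out = 2.0 * (y_hat - series[k + 1])     # squared-error gradient at the output
        dW2 = np.outer(h, d_out); db2 = np.array([d_out])
        d_h = d_out * W2[:, 0] * (1.0 - h ** 2)
        dW1 = np.outer(z, d_h);   db1 = d_h
        W2 -= lr * dW2; b2 -= lr * db2
        W1 -= lr * dW1; b1 -= lr * db1
        y_prev = y_hat                            # feed the output back for the next step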

For Google keywords:
- Elman network and Jordan network are the most commonly found recurrent networks.
- "Time series" is also called "time sequence".
- Sunspots (this is a standard benchmark used to show off solution methods).
Thanks a bunch for the replies, they helped me a lot!

[Hugo Ferreira][Positronic Dreams][Stick Soldiers]
"Please Review my WebSite!".

This topic is closed to new replies.
