See also: time series model building, recurrent networks, Vector Operators
Time Series - Neural Network Models

When dealing with neural networks, the model-finding process is very similar to that for ARIMA models: the three phases of model building (identification, estimation, and verification) have to be passed through here as well, until a suitable model has been found.
Window Networks:

Since standard neural networks have not been developed for handling sequences of inputs, either the input has to be pre-processed or the model has to be adapted to temporal tasks. Pre-processing is the easier of the two strategies: it turns a sequence of time-series elements into a single input. This can, for instance, be achieved by sliding a so-called "time window" over the sequence.

The input of the network consists of the preceding sequence elements, and the network is trained to forecast the next sequence element. (Annotation: this resembles an AR model from the class of ARIMA models, where the window size is equal to the order of the AR model.) Using a time window has the advantage that standard neural networks can be used; therefore, they can be simulated with standard neural network modeling tools. The windowing step is sketched below.
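As a minimal sketch of this pre-processing step (the function name make_windows and the example series are purely illustrative, not part of the original text or of any library), the following Python snippet builds the input/target pairs that a standard feed-forward network would be trained on; the window size plays the role of the AR order mentioned above:

```python
import numpy as np

def make_windows(series, window_size):
    """Turn a 1-D time series into (input, target) pairs by sliding a
    time window over it: each input holds window_size consecutive
    values, the target is the value that follows the window."""
    series = np.asarray(series, dtype=float)
    inputs, targets = [], []
    for t in range(len(series) - window_size):
        inputs.append(series[t:t + window_size])   # x_t, ..., x_{t+w-1}
        targets.append(series[t + window_size])    # x_{t+w}, the value to forecast
    return np.array(inputs), np.array(targets)

# Example: a short series and a window of size 3
series = [1.0, 2.0, 4.0, 7.0, 11.0, 16.0]
X, y = make_windows(series, window_size=3)
print(X)  # [[ 1.  2.  4.] [ 2.  4.  7.] [ 4.  7. 11.]]
print(y)  # [ 7. 11. 16.]
```

Each row of X, together with its entry in y, is then treated as one ordinary training pattern for a standard feed-forward network.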
Time Delay Neural Networks (TDNN):

A large number of sophisticated neural network models tailored towards processing sequences of inputs have been developed. They are all equipped with some kind of "memory" keeping information over time; neural networks without such memories "forget" each input after mapping it to the output. In time delay networks, unit activations are delayed while being fed forward through the network: each unit receives not only the current activations of the preceding layer but also one or more delayed copies of them. These delayed activations act as a short-term memory without introducing feedback loops.
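A single time delay layer can be sketched as follows; this is an illustrative reading of the description above, not code from the text, and all names and shapes are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def time_delay_layer(sequence, weights, delays=2):
    """Apply one time delay layer to a sequence of activation vectors.

    sequence : array of shape (T, n_in), activations of the preceding layer
    weights  : array of shape (n_out, (delays + 1) * n_in)
    delays   : number of delayed copies each unit sees in addition to
               the current activation
    Returns an array of shape (T - delays, n_out).
    """
    T, n_in = sequence.shape
    outputs = []
    for t in range(delays, T):
        # concatenate the current and delayed activations x_t, x_{t-1}, ..., x_{t-delays}
        taps = np.concatenate([sequence[t - d] for d in range(delays + 1)])
        outputs.append(np.tanh(weights @ taps))
    return np.array(outputs)

# Example: 10 time steps of 3 preceding-layer units, 4 units in this layer
seq = rng.normal(size=(10, 3))
W = rng.normal(size=(4, (2 + 1) * 3))
print(time_delay_layer(seq, W, delays=2).shape)  # (8, 4)
```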
Recurrent Networks:

While window networks and time delay networks are non-recurrent nets, networks with feedback loops belong to the group of recurrent networks. There, unit activations are not only delayed while being fed forward through the network, but they are also delayed and fed back to preceding layers. By this procedure, information can cycle in the network. At least theoretically, this allows an unlimited number of past activations to be taken into account.
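A minimal sketch of such a feedback loop, assuming a simple Elman-style recurrent layer with untrained, randomly chosen weights (all names and parameters below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def elman_forecast(series, W_in, W_rec, W_out):
    """Run a simple Elman-style recurrent layer over a scalar series.

    At every step the hidden activation is computed from the current
    input and the delayed hidden activation fed back from the previous
    step, so information can cycle through the network indefinitely.
    Returns the one-step-ahead forecast produced at each time step.
    """
    n_hidden = W_rec.shape[0]
    h = np.zeros(n_hidden)                  # fed-back (recurrent) activation
    forecasts = []
    for x in series:
        h = np.tanh(W_in * x + W_rec @ h)   # feedback loop: h depends on the previous h
        forecasts.append(float(W_out @ h))  # linear read-out: forecast of the next value
    return forecasts

# Example with 5 hidden units and a short series
n_hidden = 5
W_in = rng.normal(size=n_hidden)
W_rec = rng.normal(size=(n_hidden, n_hidden)) * 0.5
W_out = rng.normal(size=n_hidden)
print(elman_forecast([0.1, 0.4, 0.2, 0.5], W_in, W_rec, W_out))
```

With trained weights, the fed-back activation h would serve as the "memory" that lets past sequence elements influence the current forecast.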