Aim: Write a program for Neural Network Backpropagation.
Tools: MATLAB
Theory: Backpropagation, an abbreviation for "backward propagation of errors", is a common method of training artificial neural networks. From a desired output, the network learns from many inputs, similar to the way a child learns to identify a dog from examples of dogs.
It is a supervised learning method and a generalization of the delta rule. It requires a dataset of the desired output for many inputs, which makes up the training set. It is most useful for feed-forward networks (networks that have no feedback, or simply, no connections that loop). Backpropagation requires that the activation function used by the artificial neurons (or "nodes") be differentiable.
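The differentiability requirement is easy to see with the logistic (sigmoid) activation, whose derivative can be written in terms of its own output. A minimal sketch (in plain Python for illustration, since the algorithm itself is language-independent; the function names are ours, not part of the MATLAB program below):

```python
import math

def sigmoid(x):
    # Logistic activation: smooth and differentiable everywhere.
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    # The derivative is s * (1 - s), expressed through the activation
    # value itself, so backpropagation can reuse forward-pass outputs.
    s = sigmoid(x)
    return s * (1.0 - s)
```

This reuse of the forward-pass activation is one reason sigmoid-like functions were the traditional choice for backpropagation networks.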
The backpropagation learning algorithm can be divided into two phases: propagation and weight update.
Phase 1: Propagation
Each propagation involves the following steps:
1. Forward propagation of a training pattern's input through the neural network in order to generate the output activations.
2. Backward propagation of the output activations through the neural network, using the training pattern's target, in order to generate the deltas of all output and hidden neurons.
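The two propagation steps can be sketched for a toy network with one hidden neuron and one output neuron (plain Python for illustration; the weights, names, and 2-1-1 architecture are illustrative assumptions, not the network built by the MATLAB program below):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x, w_h, w_o):
    # Step 1: forward-propagate one input pattern x through the network.
    # w_h holds the two weights into the single hidden neuron,
    # w_o is the weight from the hidden neuron to the output neuron.
    h = sigmoid(sum(wi * xi for wi, xi in zip(w_h, x)))
    o = sigmoid(w_o * h)
    return h, o

def deltas(x, target, w_h, w_o):
    # Step 2: backward-propagate to get the error terms (deltas),
    # using the derivative s * (1 - s) of the sigmoid.
    h, o = forward(x, w_h, w_o)
    delta_o = (o - target) * o * (1.0 - o)    # output neuron delta
    delta_h = delta_o * w_o * h * (1.0 - h)   # hidden neuron delta
    return delta_o, delta_h
```

Note that the hidden delta is computed from the output delta, which is what makes the propagation "backward".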
Phase 2: Weight update
For each weight-synapse, follow these steps:
1. Multiply its output delta and input activation to get the gradient of the weight.
2. Subtract a ratio (percentage) of the gradient from the weight.
This ratio influences the speed and quality of learning; it is called the learning rate. The greater the ratio, the faster the neuron trains; the lower the ratio, the more accurate the training. The sign of the gradient of a weight indicates where the error is increasing; this is why the weight must be updated in the opposite direction.
Repeat phases 1 and 2 until the performance of the network is satisfactory.
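The weight-update rule above amounts to one line of arithmetic per weight. A minimal sketch (plain Python for illustration; the function name and the learning rate value in the example are illustrative):

```python
def update_weight(w, delta, activation, lr):
    # The gradient of the error w.r.t. this weight is the neuron's
    # output delta times its input activation (step 1); subtract a
    # fraction lr of it (step 2), i.e. move against the direction
    # in which the error increases.
    grad = delta * activation
    return w - lr * grad
```

For example, with weight 1.0, delta 0.5, input activation 2.0, and learning rate 0.1, the gradient is 1.0 and the updated weight is 0.9; a larger learning rate would take a bigger step per iteration at the cost of precision.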
Program:
% XOR training data: input patterns p (columns) and targets t
p = [0 0 1 1; 0 1 0 1];
t = [0 1 1 0];
% Create a feed-forward network with two hidden layers of 5 neurons each
net = newff(p, t, [5, 5]);
net.trainParam.epochs = 50;   % train for at most 50 epochs
net1 = train(net, p, t);
y = sim(net1, p)              % simulate on the training inputs
% Test generalization on inputs close to the training patterns
p1 = [0 0.3330 0.9898 0.9790; 1 0.9990 0.0003 0.9999]
y1 = sim(net1, p1)
plot(p, t);                   % plot targets against the input values