Title: Handbook of Intelligent Computing and Optimization for Sustainable Development
Author: Group of authors
Publisher: John Wiley & Sons Limited
Genre: Technical literature
ISBN: 9781119792628
2.3.1 McCulloch-Pitts Neural Model
The model proposed by McCulloch and Pitts is documented as a linear threshold gate [1]. The artificial neuron takes a set of inputs I1, I2, I3, …, IN ∈ {0, 1} and produces one output, y ∈ {0, 1}. The inputs are of two types: dependent inputs, termed excitatory inputs, and independent inputs, termed inhibitory inputs. Mathematically, the function can be expressed by the following equations:
S = W1I1 + W2I2 + W3I3 + … + WNIN

y = f(S) = 1, if S ≥ θ; 0, if S < θ (2.2)
where
W1, W2, W3, …, WN ≡ weight values associated with the corresponding inputs, normalized in the range of either [0, 1] or [−1, 1];
S ≡ weighted sum;
θ ≡ threshold constant.
The function f is called the linear step function and is shown in Figure 2.3.
The schematic diagram of the linear threshold gate is given in Figure 2.4.
This initial two-state model of the ANN is simple but has immense computational power. Its disadvantage is a lack of flexibility owing to the fixed weights and threshold values. The McCulloch-Pitts neuron model was later improved by incorporating more flexible features to extend its application domain.
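The threshold behaviour described above can be sketched in a few lines of code. This is a minimal illustration, not an implementation from the text: the unit weights and the threshold θ = 2 are hand-chosen so that two binary inputs realize the logical AND function.

```python
# A minimal sketch of a McCulloch-Pitts neuron (linear threshold gate).
# Inputs are binary, weights and threshold are fixed in advance.
def mp_neuron(inputs, weights, theta):
    """Fire (return 1) when the weighted sum S reaches the threshold theta."""
    s = sum(w * i for w, i in zip(weights, inputs))
    return 1 if s >= theta else 0

# With unit weights and theta = 2, two binary inputs realize logical AND.
for i1 in (0, 1):
    for i2 in (0, 1):
        print((i1, i2), "->", mp_neuron([i1, i2], [1, 1], theta=2))
```

Because the weights and threshold are fixed, the gate computes exactly one Boolean function; this is the lack of flexibility noted above.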
2.3.2 The Perceptron
The McCulloch-Pitts neuron model was enhanced by Frank Rosenblatt in 1957, when he proposed the concept of the perceptron [2] to solve linear classification problems. The perceptron is a supervised learning algorithm for binary classifiers. This binary single-neuron model merges the concept of the McCulloch-Pitts model [1] with the Hebbian learning rule of adjusting weights [3]. In the perceptron, an extra constant, termed the bias, is added. The bias can shift the decision boundary away from the origin, and it is independent of any input value. To define the perceptron, Equation (2.1) is modified as follows:
y = f(S + b) = 1, if S + b ≥ 0; 0, if S + b < 0 (2.3)
where
b ≡ bias value.
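The decision rule of Equation (2.3), combined with the weight-adjustment idea, can be sketched as a small training loop. The learning rate, epoch count, and the choice of the logical OR function as training data are illustrative assumptions, not taken from the text.

```python
# A sketch of a perceptron with bias b, trained with the classic
# perceptron update rule on the (linearly separable) logical OR function.
def predict(x, w, b):
    """Decision rule of Equation (2.3): fire when the biased sum is >= 0."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else 0

def train(samples, lr=0.1, epochs=20):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in samples:
            err = target - predict(x, w, b)  # error drives the weight update
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]  # logical OR
w, b = train(data)
```

Because OR is linearly separable, the perceptron convergence theorem guarantees that this loop reaches a set of weights that classifies all four samples correctly.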
2.3.3 ANN With Continuous Characteristics
This model is also an extension of the McCulloch-Pitts neuron model. Two stages are used to illustrate an ANN with continuous characteristics. The schematic diagram of the model is presented in Figure 2.5. The linear combination of the input values is calculated in the first stage. The weight value associated with each element of the input array lies between 0 and 1. The summation function, denoted σ, can be expressed as in Equation (2.4):

σ = W1x1 + W2x2 + … + WNxN + T (2.4)
where
T ≡ extra input value associated with weight value 1, which represents the threshold or bias of the neuron.
Figure 2.3 Linear threshold function.
Figure 2.4 Schematic diagram of linear threshold gate.
The second stage of the model is the activation function, which takes the sum-of-products value as its input and produces the output. The activity of this stage determines the characteristic of the ANN model. This function compresses the amplitude of the output so that it lies in the range [0, 1] or [−1, 1]. The compression of the output signal is performed to mimic the signal produced by a biological neuron in the form of continuous action-potential spikes.
The function used in the model discussed above is semi-linear and is termed the logistic sigmoid function. The graphical depiction of the function is presented in Figure 2.6, and its mathematical form is presented in Equation (2.5):

y = f(x) = 1 / (1 + e−x) (2.5)
Figure 2.5 ANN model with continuous characteristics.
Figure 2.6 Graphical representation of logistic sigmoid function.
From the graphical representation, it is clear that for large positive values of the input x, the output value y tends to 1. On the other hand, for large negative values of x, y tends to 0. Moreover, as x approaches −∞ or ∞, the slope of the graph becomes 0, and the slope increases as x approaches 0. These characteristics of the sigmoid graph are crucial in ANNs.
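The saturation behaviour just described can be checked numerically. The sketch below implements Equation (2.5) and its derivative, using the standard identity f′(x) = f(x)(1 − f(x)) for the logistic sigmoid.

```python
import math

# The logistic sigmoid of Equation (2.5): output in (0, 1),
# tending to 1 for large positive x and to 0 for large negative x.
def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_slope(x):
    # Derivative of the sigmoid: f'(x) = f(x) * (1 - f(x)).
    s = sigmoid(x)
    return s * (1.0 - s)
```

The slope peaks at x = 0 (where it equals 0.25) and vanishes as |x| grows, which is exactly the behaviour seen in Figure 2.6.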
2.3.4 Single-Layer Neural Network
It is the simplest form of neural network model, containing only one layer of input nodes, which receives the weighted input and sends it to the subsequent layer of receiving nodes. In some cases, there may be only one neuron at the receiving end; even a single ANN neuron has astonishing computational capability. As the activity of the input layer is limited to receiving and passing the input signal, and it performs no computation, the only true layer of neurons in a single-layer network is the output layer. The basic model of a single-layer neural network is shown in Figure 2.7. The yellow nodes denote the input layer, which receives x1, x2, x3, …, xN as the input and sends it to the output layer.