Digital Communications 1. Safwan El Assad

      We denote:

       – [X] = [x1, x2, ..., xn]: the set of all the symbols at the input of the channel;

       – [Y] = [y1, y2, ..., ym]: the set of all the symbols at the output of the channel;

       – [P(X)] = [p(x1), p(x2), ..., p(xn)]: the vector of probabilities of the symbols at the input of the channel;

       – [P(Y)] = [p(y1), p(y2), ..., p(ym)]: the vector of probabilities of the symbols at the output of the channel.

      Because of the perturbations, the space [Y] can be different from the space [X], and the probabilities P(Y) can be different from the probabilities P(X).

      We define a product space [XY] and we introduce the matrix of the probabilities of the joint symbols, input-output [P(X, Y)]:

      [2.20] [P(X, Y)] = \begin{bmatrix} p(x_1, y_1) & p(x_1, y_2) & \cdots & p(x_1, y_m) \\ p(x_2, y_1) & p(x_2, y_2) & \cdots & p(x_2, y_m) \\ \vdots & \vdots & & \vdots \\ p(x_n, y_1) & p(x_n, y_2) & \cdots & p(x_n, y_m) \end{bmatrix}

      We deduce, from this matrix of probabilities:

      [2.21] p(x_i) = \sum_{j=1}^{m} p(x_i, y_j), \quad i = 1, \ldots, n

      [2.22] p(y_j) = \sum_{i=1}^{n} p(x_i, y_j), \quad j = 1, \ldots, m
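      To make relations [2.21] and [2.22] concrete, here is a minimal Python sketch (using numpy; the 2 × 3 joint matrix is an arbitrary illustrative example, not taken from the book) that recovers the input and output probability vectors by summing the joint matrix along its rows and columns:

import numpy as np

# Hypothetical joint probability matrix [P(X, Y)] for n = 2 input symbols
# and m = 3 output symbols; each entry is p(x_i, y_j) and all entries sum to 1.
P_XY = np.array([[0.30, 0.10, 0.05],
                 [0.05, 0.15, 0.35]])

# [2.21]: p(x_i) = sum over j of p(x_i, y_j)  (sum along each row)
P_X = P_XY.sum(axis=1)    # [0.45, 0.55]

# [2.22]: p(y_j) = sum over i of p(x_i, y_j)  (sum along each column)
P_Y = P_XY.sum(axis=0)    # [0.35, 0.25, 0.40]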

      We then define the following entropies:

       – the entropy of the source: [2.23] H(X) = -\sum_{i=1}^{n} p(x_i) \log_2 p(x_i)

       – the entropy of variable Y at the output of the transmission channel: [2.24] H(Y) = -\sum_{j=1}^{m} p(y_j) \log_2 p(y_j)

       – the entropy of the two joint variables (X, Y), input-output: [2.25] H(X, Y) = -\sum_{i=1}^{n} \sum_{j=1}^{m} p(x_i, y_j) \log_2 p(x_i, y_j)
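      As a numerical illustration of definitions [2.23] to [2.25], the following sketch computes the three entropies from the same hypothetical joint matrix used above (the values are arbitrary and serve only to show the computation):

import numpy as np

def entropy(p):
    # Entropy in bits of a discrete distribution given as an array of probabilities.
    p = p[p > 0]                       # 0 * log2(0) is taken as 0
    return -np.sum(p * np.log2(p))

P_XY = np.array([[0.30, 0.10, 0.05],
                 [0.05, 0.15, 0.35]])  # hypothetical joint matrix [P(X, Y)]

H_X  = entropy(P_XY.sum(axis=1))       # [2.23] entropy of the source
H_Y  = entropy(P_XY.sum(axis=0))       # [2.24] entropy at the channel output
H_XY = entropy(P_XY.flatten())         # [2.25] joint entropy H(X, Y)
print(H_X, H_Y, H_XY)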

      Because of the disturbances in the transmission channel, if the symbol yj appears at the output, there is an uncertainty on the symbol xi, i = 1, ..., n, which has been sent.


      Figure 2.3. Ambiguity on the symbol at the input when yj is received

      The average value of this uncertainty, or the entropy associated with the receipt of the symbol yj, is:

      [2.26] H(X/y_j) = -\sum_{i=1}^{n} p(x_i/y_j) \log_2 p(x_i/y_j)

      The mean value of this entropy for all the possible symbols yj received is:

      [2.27] H(X/Y) = \sum_{j=1}^{m} p(y_j) H(X/y_j)

      Which can be written as:

      [2.28] H(X/Y) = -\sum_{j=1}^{m} p(y_j) \sum_{i=1}^{n} p(x_i/y_j) \log_2 p(x_i/y_j)

      or:

      [2.29] H(X/Y) = -\sum_{i=1}^{n} \sum_{j=1}^{m} p(x_i, y_j) \log_2 p(x_i/y_j)

      The entropy H(X/Y) is called the equivocation (ambiguity) and corresponds to the loss of information due to the disturbances (since I(X, Y) = H(X) − H(X/Y)); this will be made precise a little further on.
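      As a check on [2.27] to [2.29] and on the relation I(X, Y) = H(X) - H(X/Y) quoted above, the sketch below computes the equivocation both as an average of the entropies H(X/yj) and directly from the joint probabilities (still with the same hypothetical matrix):

import numpy as np

P_XY = np.array([[0.30, 0.10, 0.05],
                 [0.05, 0.15, 0.35]])  # hypothetical joint matrix [P(X, Y)]
P_X = P_XY.sum(axis=1)
P_Y = P_XY.sum(axis=0)

# p(x_i / y_j) = p(x_i, y_j) / p(y_j): column j is the distribution of X given y_j
P_X_given_Y = P_XY / P_Y

# [2.26]-[2.27]: entropy H(X / y_j) for each received symbol, then its mean over y_j
H_X_given_yj = -np.sum(P_X_given_Y * np.log2(P_X_given_Y), axis=0)
H_X_given_Y  = np.sum(P_Y * H_X_given_yj)

# [2.29]: the same quantity computed directly with the joint weights p(x_i, y_j)
H_X_given_Y_direct = -np.sum(P_XY * np.log2(P_X_given_Y))

H_X  = -np.sum(P_X * np.log2(P_X))
I_XY = H_X - H_X_given_Y               # information actually transferred
print(H_X_given_Y, H_X_given_Y_direct, I_XY)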


      Figure 2.4. Uncertainty on the output when we know the input

      The entropy of the random variable Y at the output, knowing the variable X at the input, is:

      [2.30] H(Y/X) = -\sum_{i=1}^{n} \sum_{j=1}^{m} p(x_i, y_j) \log_2 p(y_j/x_i)

      This entropy is a measure of the uncertainty on the output variable when that of the input is known.
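      The same kind of computation applies to [2.30]: the conditional probabilities p(yj/xi) are obtained from the joint matrix, and -log2 p(yj/xi) is averaged with the joint weights p(xi, yj). A short sketch with the same hypothetical matrix as before:

import numpy as np

P_XY = np.array([[0.30, 0.10, 0.05],
                 [0.05, 0.15, 0.35]])  # hypothetical joint matrix [P(X, Y)]
P_X = P_XY.sum(axis=1)

# p(y_j / x_i) = p(x_i, y_j) / p(x_i): row i is the distribution of Y given x_i
P_Y_given_X = P_XY / P_X[:, None]

# [2.30]: H(Y/X) = - sum over i and j of p(x_i, y_j) * log2 p(y_j / x_i)
H_Y_given_X = -np.sum(P_XY * np.log2(P_Y_given_X))
print(H_Y_given_X)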

      The matrix P(Y/X) is called the channel noise matrix:

      [2.31] [P(Y/X)] = \begin{bmatrix} p(y_1/x_1) & p(y_2/x_1) & \cdots & p(y_m/x_1) \\ p(y_1/x_2) & p(y_2/x_2) & \cdots & p(y_m/x_2) \\ \vdots & \vdots & & \vdots \\ p(y_1/x_n) & p(y_2/x_n) & \cdots & p(y_m/x_n) \end{bmatrix}

      A fundamental property of this matrix is:

      [2.32] \sum_{j=1}^{m} p(y_j/x_i) = 1, \quad \forall i = 1, \ldots, n

      where p(yj/xi) is the probability of receiving the symbol yj when the symbol xi has been emitted.

      In addition, one has:

      [2.33] p(x_i, y_j) = p(x_i) \, p(y_j/x_i)

      [2.34] p(y_j) = \sum_{i=1}^{n} p(x_i) \, p(y_j/x_i)

      p(yj) is the probability of receiving the symbol yj, whatever the symbol xi emitted, and:

      [2.35] p(x_i/y_j) = \frac{p(x_i, y_j)}{p(y_j)} = \frac{p(x_i) \, p(y_j/x_i)}{\sum_{i=1}^{n} p(x_i) \, p(y_j/x_i)}

      p(xi/yj) is the probability that the symbol xi was issued when the symbol yj is received.
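      To make relations [2.32] to [2.35] concrete, the sketch below starts from a hypothetical input distribution P(X) and a hypothetical noise matrix P(Y/X) (a binary symmetric channel with error probability 0.1; these values are illustrative, not from the text), checks that each row of the noise matrix sums to 1, and then applies [2.33], [2.34] and [2.35]:

import numpy as np

P_X = np.array([0.6, 0.4])                # hypothetical input probabilities p(x_i)

# Hypothetical channel noise matrix [P(Y/X)]: row i holds p(y_j / x_i)
P_Y_given_X = np.array([[0.9, 0.1],
                        [0.1, 0.9]])

# [2.32]: every row of the noise matrix sums to 1
assert np.allclose(P_Y_given_X.sum(axis=1), 1.0)

# [2.33]: p(x_i, y_j) = p(x_i) * p(y_j / x_i)
P_XY = P_X[:, None] * P_Y_given_X

# [2.34]: p(y_j) = sum over i of p(x_i) * p(y_j / x_i)
P_Y = P_XY.sum(axis=0)                    # [0.58, 0.42]

# [2.35]: p(x_i / y_j) = p(x_i, y_j) / p(y_j)   (Bayes' rule)
P_X_given_Y = P_XY / P_Y
print(P_Y)
print(P_X_given_Y)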

      2.5.2. Relations between the various entropies

      We can write:

      [2.36] H(X, Y) = H(X) + H(Y/X)

      In the same way, as one has H(Y, X) = H(X, Y), therefore:

      [2.37] H(X, Y) = H(Y) + H(X/Y)
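      A quick numerical check of the two decompositions above, using the same hypothetical binary symmetric channel as in the previous sketch:

import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

P_X = np.array([0.6, 0.4])                # hypothetical input probabilities
P_Y_given_X = np.array([[0.9, 0.1],
                        [0.1, 0.9]])      # hypothetical noise matrix

P_XY = P_X[:, None] * P_Y_given_X         # joint probabilities, as in [2.33]
P_Y  = P_XY.sum(axis=0)

H_X, H_Y, H_XY = entropy(P_X), entropy(P_Y), entropy(P_XY.flatten())
H_Y_given_X = -np.sum(P_XY * np.log2(P_Y_given_X))   # [2.30]
H_X_given_Y = -np.sum(P_XY * np.log2(P_XY / P_Y))    # [2.29]

# Both decompositions give the same joint entropy
assert np.isclose(H_XY, H_X + H_Y_given_X)
assert np.isclose(H_XY, H_Y + H_X_given_Y)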

      In addition, one has the following inequalities:

      [2.38] H(X/Y) \leq H(X)

      and similarly:

      [2.39] H(Y/X) \leq H(Y)
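      The inequalities [2.38] and [2.39] express the fact that conditioning can only reduce (or leave unchanged) the entropy, with equality when X and Y are independent. A small sketch contrasting the independent case with the noisy-channel case of the previous examples (all values are illustrative):

import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def H_X_given_Y(P_XY):
    # Equivocation H(X/Y) computed from a joint matrix, as in [2.29]
    return -np.sum(P_XY * np.log2(P_XY / P_XY.sum(axis=0)))

# Independent case: p(x_i, y_j) = p(x_i) * p(y_j), so H(X/Y) = H(X)
P_X = np.array([0.6, 0.4])
P_XY_indep = np.outer(P_X, np.array([0.5, 0.5]))
print(H_X_given_Y(P_XY_indep), entropy(P_X))          # both about 0.971 bits

# Dependent case (the noisy channel of the previous sketch): H(X/Y) < H(X)
P_XY_dep = np.array([[0.54, 0.06],
                     [0.04, 0.36]])
print(H_X_given_Y(P_XY_dep), entropy(P_XY_dep.sum(axis=1)))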

      – Noiseless channel: in this case, on receipt of yj,