
      We take, as the unit of measure of information, the information obtained by the random selection of a single event out of two equally probable events, pi = pj = 1/2. In this case, we can write:

      $h(x_i) = h(x_j) = \lambda \log\left(\frac{1}{1/2}\right) = \lambda \log(2)$

      If we choose the logarithm in base 2, λ becomes equal to unity and therefore h(xi) = h(xj) = log2(2) = 1 Shannon (Sh), or 1 bit of information, not to be confused with the digital bit (binary digit), which represents one of the binary digits 0 or 1.

      Finally, we can then write:

      [2.14] $h(x_i) = \log_2\left(\frac{1}{p_i}\right) = -\log_2(p_i)$ Sh

      It is sometimes convenient to work with logarithms in base e or with logarithms in base 10. In these cases, the units will be:

      loge(e) = 1 natural unit = 1 nat (we choose 1 among e)

      log10(10) = 1 decimal unit = 1 dit (we choose 1 among 10)

      Knowing that:

      $\log_2(x) = \frac{\log_e(x)}{\log_e(2)} = \frac{\log_{10}(x)}{\log_{10}(2)}$

      the relationships between the three units are:

       – natural unit: 1 nat = log2(e) = 1/loge(2) ≈ 1.44 bits of information;

       – decimal unit: 1 dit = log2(10) = 1/log10(2) ≈ 3.32 bits of information.

      They are pseudo-units without dimension.
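      As a quick numerical check of these conversion factors, here is a minimal Python sketch (not from the book; the constant names are ours):

```python
import math

# Conversion factors between the three pseudo-units of information.
NAT_TO_SH = math.log2(math.e)   # 1 nat = log2(e) ≈ 1.44 Sh (bits of information)
DIT_TO_SH = math.log2(10)       # 1 dit = log2(10) ≈ 3.32 Sh

print(f"1 nat = {NAT_TO_SH:.2f} Sh")  # 1 nat = 1.44 Sh
print(f"1 dit = {DIT_TO_SH:.2f} Sh")  # 1 dit = 3.32 Sh
```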

      Let a stationary memoryless source S produce random independent events (symbols) si belonging to a predetermined set [S] = [s1, s2, ..., sN]. Each event (symbol) si occurs with a given probability pi, with:

      $0 \le p_i \le 1, \quad \sum_{i=1}^{N} p_i = 1$

      The source S is then characterized by the set of probabilities [P] = [p1, p2, ..., pN]. We are now interested in the average amount of information delivered by this source, that is to say, resulting from the whole set of events (symbols) it can produce, each taken into account with its probability of occurrence. This average amount of information from the source S is called the “entropy H(S) of the source”.

      It is therefore defined by:

      [2.15] $H(S) = \sum_{i=1}^{N} p_i \, h(s_i) = -\sum_{i=1}^{N} p_i \log_2(p_i)$ Sh/symbol
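      As an illustration, formula [2.15] can be transcribed directly into Python. This is a minimal sketch (the function name entropy is ours), using the convention 0 × log2(0) = 0:

```python
import math

def entropy(probabilities):
    """Entropy H(S) of a memoryless source, in Sh per symbol, as in [2.15].

    Terms with p_i = 0 are skipped, which implements the convention
    0 * log2(0) = 0.
    """
    assert abs(sum(probabilities) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))  # 1.0 Sh/symbol (two equally probable events)
print(entropy([0.9, 0.1]))  # ≈ 0.469 Sh/symbol
```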

      2.3.2. Fundamental lemma

      Let [P] and [Q] be two probability partitions on S:

      $[P] = [p_1, p_2, \ldots, p_N], \quad [Q] = [q_1, q_2, \ldots, q_N], \quad \sum_{i=1}^{N} p_i = \sum_{i=1}^{N} q_i = 1$

      we have the inequality:

      [2.16] $\sum_{i=1}^{N} p_i \log_2\left(\frac{q_i}{p_i}\right) \le 0$

      Indeed, since loge(x) ≤ x − 1 for every positive real x, we have:

      $\sum_{i=1}^{N} p_i \log_e\left(\frac{q_i}{p_i}\right) \le \sum_{i=1}^{N} p_i \left(\frac{q_i}{p_i} - 1\right) = \sum_{i=1}^{N} q_i - \sum_{i=1}^{N} p_i = 0$
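      The inequality [2.16] is easy to verify numerically; the sketch below uses two arbitrary example distributions (our choice, for illustration only):

```python
import math

p = [0.5, 0.3, 0.2]  # an arbitrary probability partition [P]
q = [0.1, 0.6, 0.3]  # an arbitrary probability partition [Q]

# Left-hand side of [2.16]: sum of p_i * log2(q_i / p_i); it must be <= 0,
# with equality if and only if q_i = p_i for every i.
lhs = sum(pi * math.log2(qi / pi) for pi, qi in zip(p, q))
print(lhs)  # ≈ -0.744, indeed <= 0

# With Q = P, the sum is exactly zero.
print(sum(pi * math.log2(pi / pi) for pi in p))  # 0.0
```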

      2.3.3. Properties of entropy

      The entropy H(S) is:

       – Positive: since 0 ≤ pi ≤ 1 (with the convention 0 × log2(0) = 0).

       – Continuous: because it is a sum of continuous functions of each pi.

       – Symmetric: relative to all the variables pi.

       – Upper bounded: the entropy has a maximum value Hmax = log2(N), obtained for a uniform law pi = 1/N (see the sketch after this list).

       – Additive: for two independent sources S1 and S2, the entropy of the joint source is the sum of the individual entropies:

      [2.17] $H(S_1, S_2) = H(S_1) + H(S_2)$
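      The upper bound is easy to check with the entropy helper sketched after [2.15]; the non-uniform distribution below is our example:

```python
import math

def entropy(probabilities):
    # Same sketch as after [2.15]: H(S) = -sum(p_i * log2(p_i)).
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

N = 4
uniform = [1 / N] * N
skewed = [0.7, 0.1, 0.1, 0.1]

print(entropy(uniform), math.log2(N))  # 2.0 2.0: the maximum log2(N) is reached
print(entropy(skewed))                 # ≈ 1.357 < 2.0 for a non-uniform law
```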

      2.3.4. Examples of entropy

      2.3.4.1. Two-event entropy (Bernoulli’s law)

      For a two-event source with probabilities p and 1 − p:

      $H(S) = -p \log_2(p) - (1 - p) \log_2(1 - p)$

      Figure 2.1. Entropy of a two-event source

      The maximum of the entropy is obtained for p = 1/2 and is equal to 1 bit of information.
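      A short sketch (ours) evaluating this two-event entropy confirms the shape of Figure 2.1 and the maximum at p = 1/2:

```python
import math

def h2(p):
    """Entropy of a two-event (Bernoulli) source, in Sh."""
    if p in (0.0, 1.0):
        return 0.0  # convention 0 * log2(0) = 0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.1, 0.25, 0.5, 0.75, 0.9):
    print(p, round(h2(p), 3))  # symmetric in p, maximum h2(0.5) = 1.0 Sh
```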

      2.3.4.2. Entropy of an alphabetic source with (26 + 1) characters

       – For a uniform law: ⟹ H = log2(27) = 4.75 bits of information per character.

       – In the French language (according to a statistical study): ⟹ H = 3.98 bits of information per character.

      Thus, a text of 100 characters provides, on average, 398 bits of information.

      The non-uniformity of the probabilities thus causes a loss of 475 − 398 = 77 bits of information.
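      The arithmetic of this example can be reproduced as follows (a sketch; the value 3.98 Sh is the statistical figure quoted above):

```python
import math

H_uniform = math.log2(27)  # ≈ 4.75 Sh per character, uniform law
H_french = 3.98            # Sh per character, measured for French text

n = 100  # characters
print(round(n * H_uniform))               # ≈ 475 Sh
print(round(n * H_french))                # 398 Sh
print(round(n * (H_uniform - H_french)))  # ≈ 77 Sh lost to non-uniform probabilities
```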

      The information rate of a source is defined by:

      [2.18] $D_s = \frac{H(S)}{\bar{\tau}}$ Sh/s

      where $\bar{\tau}$ represents the average duration of a symbol emitted by the source.
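      As a numerical illustration of [2.18] (the symbol duration below is an assumed example value, not from the book):

```python
H = 3.98    # entropy of the source, Sh per symbol (French text example above)
tau = 1e-3  # assumed average symbol duration, in seconds

D = H / tau  # information rate as in [2.18], in Sh/s
print(D)     # 3980.0 Sh/s
```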

      The redundancy of a source is defined as follows:

      [2.19] $R_s = H_{\max}(S) - H(S) = \log_2(N) - H(S)$
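      A sketch of the redundancy computation for the 27-character source above, assuming the absolute definition reconstructed in [2.19] (the relative form is also shown for comparison):

```python
import math

N = 27
H = 3.98              # Sh per character (French text example)
H_max = math.log2(N)  # ≈ 4.75 Sh per character

R = H_max - H        # absolute redundancy, as in [2.19], Sh per character
rho = 1 - H / H_max  # relative redundancy (dimensionless)
print(round(R, 2), round(rho, 2))  # ≈ 0.77 Sh/character and ≈ 0.16
```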

      Between the source of information and the destination, there is the medium through which information is transmitted. This medium, including the equipment necessary for transmission, is called the transmission channel (or simply the channel).

      Let us consider a discrete, stationary and memoryless channel (discrete: the input and output symbol alphabets are discrete).


      Figure 2.2. Basic transmission system based on a discrete channel. For a color version of this figure, see www.iste.co.uk/assad/digital1.zip
