Nonlinear Filters. Simon Haykin
The entropy of a discrete random vector $\mathbf{X}$ with alphabet $\mathcal{X}$ is defined as:

      (2.96)    $H(\mathbf{X}) = \mathbb{E}\left[\log \frac{1}{P(\mathbf{x})}\right] = \sum_{\mathbf{x} \in \mathcal{X}} P(\mathbf{x}) \log \frac{1}{P(\mathbf{x})},$

       and correspondingly, for a continuous random vector $\mathbf{X}$, we have:

      (2.97)    $H(\mathbf{X}) = \mathbb{E}\left[\log \frac{1}{p(\mathbf{x})}\right],$
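
      As a quick numerical illustration (not part of the book's text), the following sketch evaluates the discrete entropy $\mathbb{E}[\log 1/P(\mathbf{x})]$ for an assumed example pmf; the logarithm is natural, so the values are in nats.

```python
# Minimal numerical sketch (assumed example, not from the book): discrete entropy
# H(X) = E[log 1/P(x)], computed in nats.
import math

def entropy(pmf):
    """Shannon entropy (in nats) of a pmf given as an iterable of probabilities."""
    return sum(p * math.log(1.0 / p) for p in pmf if p > 0.0)

print(entropy([0.9, 0.1]))  # biased coin: about 0.325 nats
print(entropy([0.5, 0.5]))  # fair coin: log 2, about 0.693 nats (the maximum for two outcomes)
```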

      Definition 2.4 Joint entropy is defined for a pair of random vectors $(\mathbf{X}, \mathbf{Y})$ based on their joint distribution as:

      (2.98)    $H(\mathbf{X}, \mathbf{Y}) = \mathbb{E}\left[\log \frac{1}{p(\mathbf{x}, \mathbf{y})}\right].$

      Definition 2.5 Conditional entropy is defined as the entropy of a random variable (state vector) conditional on the knowledge of another random variable (measurement vector):

      (2.99)    $H(\mathbf{X} \mid \mathbf{Y}) = H(\mathbf{X}, \mathbf{Y}) - H(\mathbf{Y}).$

       It can also be expressed as:

      (2.100)    $H(\mathbf{X} \mid \mathbf{Y}) = \mathbb{E}\left[\log \frac{1}{p(\mathbf{x} \mid \mathbf{y})}\right].$
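
      A minimal numerical sketch (with an assumed joint pmf, not taken from the book) that also covers the joint entropy of Definition 2.4: it computes $H(\mathbf{X}, \mathbf{Y})$ from (2.98) and checks that the chain-rule form (2.99) and the expectation form (2.100) of the conditional entropy agree.

```python
# Minimal sketch with an assumed joint pmf (not from the book): joint entropy (2.98)
# and the two expressions (2.99) and (2.100) for the conditional entropy H(X|Y).
import math

def H(probs):
    """Entropy (in nats) of a collection of probabilities."""
    return sum(p * math.log(1.0 / p) for p in probs if p > 0.0)

# Assumed joint pmf p(x, y) for binary x and y.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}
p_y = {y: sum(p for (_, y2), p in p_xy.items() if y2 == y) for y in (0, 1)}

H_joint = H(p_xy.values())                 # H(X, Y) from (2.98)
H_cond_chain = H_joint - H(p_y.values())   # H(X|Y) from (2.99)

# H(X|Y) directly from (2.100): E[log 1/p(x|y)] with p(x|y) = p(x, y) / p(y).
H_cond_direct = sum(p * math.log(p_y[y] / p) for (x, y), p in p_xy.items() if p > 0.0)

print(H_cond_chain, H_cond_direct)         # both are about 0.607 nats
```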

      Definition 2.6 Mutual information between two random variables is a measure of the amount of information that one contains about the other. It can also be interpreted as the reduction in the uncertainty about one random variable due to knowledge of the other. Mathematically, it is defined as:

      (2.101)    $I(\mathbf{X}; \mathbf{Y}) = H(\mathbf{X}) - H(\mathbf{X} \mid \mathbf{Y}).$

       Substituting for $H(\mathbf{X} \mid \mathbf{Y})$ from (2.99) into the aforementioned equation, we will have:

      (2.102)    $I(\mathbf{X}; \mathbf{Y}) = H(\mathbf{X}) + H(\mathbf{Y}) - H(\mathbf{X}, \mathbf{Y}).$

      Therefore, mutual information is symmetric with respect to $\mathbf{X}$ and $\mathbf{Y}$. It can also be viewed as a measure of dependence between the two random vectors. Mutual information is nonnegative, and it is equal to zero if and only if $\mathbf{X}$ and $\mathbf{Y}$ are independent. The notion of observability for stochastic systems can be defined based on the concept of mutual information.
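
      The same kind of sketch (assumed joint pmfs, not from the book) can be used to illustrate these properties: computing $I(\mathbf{X}; \mathbf{Y})$ via (2.102) gives a positive value for a dependent pair and zero for an independent pair, while the symmetry in $\mathbf{X}$ and $\mathbf{Y}$ is evident from the formula itself.

```python
# Minimal sketch with assumed joint pmfs (not from the book): mutual information
# computed via (2.102), I(X;Y) = H(X) + H(Y) - H(X,Y).
import math

def H(probs):
    return sum(p * math.log(1.0 / p) for p in probs if p > 0.0)

def mutual_information(p_xy):
    """I(X;Y) in nats for a joint pmf given as {(x, y): probability}."""
    xs = {x for x, _ in p_xy}
    ys = {y for _, y in p_xy}
    p_x = [sum(p for (x2, _), p in p_xy.items() if x2 == x) for x in xs]
    p_y = [sum(p for (_, y2), p in p_xy.items() if y2 == y) for y in ys]
    return H(p_x) + H(p_y) - H(p_xy.values())

dependent = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}
independent = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}

print(mutual_information(dependent))    # about 0.086 nats: X and Y share information
print(mutual_information(independent))  # zero (up to rounding): independent variables share no information
```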

      Definition 2.7 (Stochastic observability) The random vector $\mathbf{X}$ (state) is unobservable from the random vector $\mathbf{Y}$ (measurement) if they are independent or, equivalently, $I(\mathbf{X}; \mathbf{Y}) = 0$. Otherwise, $\mathbf{X}$ is observable from $\mathbf{Y}$.
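
      As a hedged worked example (a standard scalar Gaussian case, not taken from the book): suppose the measurement is $y = x + v$ with state $x \sim \mathcal{N}(0, \sigma_x^2)$ and independent noise $v \sim \mathcal{N}(0, \sigma_v^2)$. Then

      $I(X; Y) = \frac{1}{2} \log\left(1 + \frac{\sigma_x^2}{\sigma_v^2}\right) > 0,$

      so the state is observable from the measurement in the sense of Definition 2.7. If instead $y$ were generated independently of $x$ (pure noise), then $I(X; Y) = 0$ and the state would be unobservable.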
