Title: Nonlinear Filters
Author: Simon Haykin
Publisher: John Wiley & Sons Limited
ISBN: 9781119078159
and correspondingly, for a continuous random vector $\mathbf{x}$ with probability density function $p(\mathbf{x})$, we have:

$$\mathsf{H}(\mathbf{x}) = -\int p(\mathbf{x}) \log p(\mathbf{x}) \, d\mathbf{x}.$$

Entropy can also be interpreted as the expected value of the term $\log \frac{1}{p(\mathbf{x})}$:

$$\mathsf{H}(\mathbf{x}) = \mathbb{E}_p\!\left[ \log \frac{1}{p(\mathbf{x})} \right], \tag{2.97}$$

where $\mathbb{E}_p[\cdot]$ denotes expectation with respect to the distribution $p$.
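As a minimal numerical sketch of this expectation view (not from the book; the PMF and the base-2 logarithm are illustrative assumptions), the following Python snippet checks that the direct sum and the expected value of $\log \frac{1}{P(x)}$ agree for a discrete distribution:

```python
import numpy as np

# Hypothetical PMF of a discrete random variable x (illustrative only).
P = np.array([0.5, 0.25, 0.125, 0.125])

# Direct definition: H(x) = -sum_x P(x) log P(x).
H_direct = -np.sum(P * np.log2(P))

# Expectation view of (2.97): H(x) = E[log(1/P(x))],
# i.e. the average of log2(1/P(x)) weighted by P(x).
H_expected = np.sum(P * np.log2(1.0 / P))

print(H_direct, H_expected)  # both 1.75 bits
```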
Definition 2.4 Joint entropy is defined for a pair of random vectors $(\mathbf{x}, \mathbf{z})$ based on their joint distribution $p(\mathbf{x}, \mathbf{z})$ as:

$$\mathsf{H}(\mathbf{x}, \mathbf{z}) = -\mathbb{E}\!\left[ \log p(\mathbf{x}, \mathbf{z}) \right]. \tag{2.98}$$
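For a concrete discrete illustration (a hypothetical joint PMF, not from the book), joint entropy per (2.98) is simply the entropy of the flattened joint table:

```python
import numpy as np

# Hypothetical joint PMF P(x, z) over two binary variables
# (rows index x, columns index z; values are illustrative only).
P_xz = np.array([[0.4, 0.1],
                 [0.2, 0.3]])

# Joint entropy (2.98): H(x, z) = -E[log P(x, z)],
# computed by summing over every cell of the joint table.
H_joint = -np.sum(P_xz * np.log2(P_xz))
print(H_joint)  # about 1.85 bits
```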
Definition 2.5 Conditional entropy is defined as the entropy of a random variable (state vector $\mathbf{x}$) conditioned on the knowledge of another random variable (measurement vector $\mathbf{z}$):

$$\mathsf{H}(\mathbf{x}|\mathbf{z}) = -\mathbb{E}\!\left[ \log p(\mathbf{x}|\mathbf{z}) \right]. \tag{2.99}$$

It can also be expressed as:

$$\mathsf{H}(\mathbf{x}|\mathbf{z}) = \mathsf{H}(\mathbf{x}, \mathbf{z}) - \mathsf{H}(\mathbf{z}). \tag{2.100}$$
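The two forms (2.99) and (2.100) can be checked numerically. In this sketch (same hypothetical PMF as above, with base-2 logarithms assumed), rows index $\mathbf{x}$ and columns index $\mathbf{z}$:

```python
import numpy as np

# Same hypothetical joint PMF as above (rows index x, columns index z).
P_xz = np.array([[0.4, 0.1],
                 [0.2, 0.3]])
P_z = P_xz.sum(axis=0)  # marginal P(z), obtained by summing x out

# Direct form (2.99): H(x|z) = -E[log P(x|z)], with P(x|z) = P(x,z)/P(z).
P_x_given_z = P_xz / P_z
H_cond_direct = -np.sum(P_xz * np.log2(P_x_given_z))

# Chain-rule form (2.100): H(x|z) = H(x, z) - H(z).
H_joint = -np.sum(P_xz * np.log2(P_xz))
H_z = -np.sum(P_z * np.log2(P_z))
H_cond_chain = H_joint - H_z

print(np.isclose(H_cond_direct, H_cond_chain))  # True
```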
Definition 2.6 Mutual information between two random variables is a measure of the amount of information that one contains about the other. It can also be interpreted as the reduction in the uncertainty about one random variable due to knowledge of the other. Mathematically, it is defined as:

$$\mathsf{I}(\mathbf{x}; \mathbf{z}) = \mathsf{H}(\mathbf{x}) - \mathsf{H}(\mathbf{x}|\mathbf{z}). \tag{2.101}$$

Substituting for $\mathsf{H}(\mathbf{x}|\mathbf{z})$ from (2.100) into the aforementioned equation, we will have:

$$\mathsf{I}(\mathbf{x}; \mathbf{z}) = \mathsf{H}(\mathbf{x}) + \mathsf{H}(\mathbf{z}) - \mathsf{H}(\mathbf{x}, \mathbf{z}). \tag{2.102}$$

Therefore, mutual information is symmetric with respect to $\mathbf{x}$ and $\mathbf{z}$; that is, $\mathsf{I}(\mathbf{x}; \mathbf{z}) = \mathsf{I}(\mathbf{z}; \mathbf{x})$.
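A short sketch (same hypothetical joint PMF as in the earlier snippets) confirms that (2.101) and (2.102) give the same value, which makes the symmetry explicit:

```python
import numpy as np

# Same hypothetical joint PMF (rows index x, columns index z).
P_xz = np.array([[0.4, 0.1],
                 [0.2, 0.3]])

def entropy(p):
    """Shannon entropy in bits of a PMF given as an array."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_x = entropy(P_xz.sum(axis=1))   # H(x)
H_z = entropy(P_xz.sum(axis=0))   # H(z)
H_joint = entropy(P_xz.ravel())   # H(x, z)

# (2.101) with H(x|z) replaced via (2.100):
I_from_2101 = H_x - (H_joint - H_z)
# (2.102), manifestly symmetric in x and z:
I_from_2102 = H_x + H_z - H_joint

print(np.isclose(I_from_2101, I_from_2102))  # True
```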
Definition 2.7 (Stochastic observability) The random vector $\mathbf{x}$ (state) is unobservable from the random vector $\mathbf{z}$ (measurement) if they are independent or, equivalently, $\mathsf{I}(\mathbf{x}; \mathbf{z}) = 0$. Otherwise, $\mathbf{x}$ is observable from $\mathbf{z}$.
Since mutual information is nonnegative, (2.101) leads to the following conclusion: if either $\mathsf{H}(\mathbf{x}) = 0$ or $\mathsf{H}(\mathbf{x}|\mathbf{z}) = \mathsf{H}(\mathbf{x})$, then $\mathsf{I}(\mathbf{x}; \mathbf{z}) = 0$ and $\mathbf{x}$ is unobservable from $\mathbf{z}$; in other words, the state is unobservable whenever the measurement does not reduce its uncertainty.
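To make Definition 2.7 concrete, the sketch below (hypothetical joint PMFs standing in for the state/measurement pair; not from the book) computes $\mathsf{I}(\mathbf{x}; \mathbf{z})$ for an independent pair, which is unobservable, and for a dependent pair, which is observable:

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(P_xz):
    """I(x; z) = H(x) + H(z) - H(x, z) per (2.102), for a joint PMF matrix."""
    H_x = entropy(P_xz.sum(axis=1))
    H_z = entropy(P_xz.sum(axis=0))
    return H_x + H_z - entropy(P_xz.ravel())

# Independent pair: P(x, z) = P(x)P(z), so I(x; z) = 0 and x is
# unobservable from z (Definition 2.7).
P_indep = np.outer([0.5, 0.5], [0.7, 0.3])

# Dependent pair: z behaves like a noisy reading of x, so I(x; z) > 0
# and x is observable from z.
P_dep = np.array([[0.45, 0.05],
                  [0.05, 0.45]])

print(mutual_information(P_indep))  # ~0.0
print(mutual_information(P_dep))    # ~0.53 bits > 0
```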
2.8 Degree of Observability