Title: Nonlinear Filters
Authors: Peyman Setoodeh, Saeid Habibi, and Simon Haykin
Publisher: John Wiley & Sons Limited
ISBN: 9781119078159
Preface
Taking an algorithmic approach, this book provides a step towards bridging the gap between control theory, statistical signal processing, and machine learning regarding the state/parameter estimation problem. State estimation is an important concept that has profoundly influenced different branches of science and engineering. The state of a system is a minimal record of its past history that is required for predicting its future behavior. In this regard, a dynamic system can be described from the state perspective by selecting a set of independent variables as state variables. It is often desirable to know the state variables and, in control applications, to force them to follow desired trajectories in the state space. State estimation refers to the process of reconstructing the hidden or latent state variables, which cannot be directly measured, from system inputs and outputs in the minimum possible length of time. Filtering algorithms, which are deployed for state estimation, aim at minimizing the error between the estimated and the true values of the state variables.
The first part of the book is dedicated to classic estimation algorithms. A thorough presentation of the notion of observability, which refers to the ability to reconstruct the state variables from measurements, is followed by coverage of a number of observers as state estimators for deterministic systems. For stochastic systems, optimal Bayesian filtering is presented, which provides a conceptual solution to the general state estimation problem. Different Bayesian filtering algorithms have been developed based on computationally tractable approximations of this conceptual Bayesian solution. For the special case of linear systems with Gaussian noise, the Kalman filter provides the optimal Bayesian solution. To extend the Kalman filter to nonlinear systems, two main approaches have been proposed for obtaining suboptimal solutions: approximating the nonlinear functions using power series, and approximating the probability distributions. While the extended Kalman filter, the extended information filter, and the divided-difference filter approximate the nonlinear functions, the unscented Kalman filter, the cubature Kalman filter, and the particle filter approximate the probability distributions. Other Kalman filter variants include the Gaussian-sum filter, which handles non-Gaussianity, and the generalized PID filter. Among the mentioned filters, the particle filter is capable of handling nonlinear and non-Gaussian systems. The smooth variable-structure filter, which has been derived based on a stability theorem, is able to handle model uncertainties. Moreover, it benefits from using a secondary set of performance indicators in addition to the innovation vector.
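To make the linear Gaussian case above concrete, the following minimal sketch shows the predict/correct recursion of the Kalman filter. It is an illustration only, not code from the book: the constant-velocity model and the matrices F, H, Q, R below are assumptions chosen for the example.

    import numpy as np

    # Assumed 1D constant-velocity model: state x = [position, velocity],
    # with noisy scalar position measurements.
    dt = 1.0
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state-transition matrix
    H = np.array([[1.0, 0.0]])              # measurement matrix
    Q = 0.01 * np.eye(2)                    # process-noise covariance
    R = np.array([[0.25]])                  # measurement-noise covariance

    def kalman_step(x, P, z):
        # Predict: propagate the state estimate and its covariance.
        x_pred = F @ x
        P_pred = F @ P @ F.T + Q
        # Correct: update the prediction with the measurement z.
        y = z - H @ x_pred                   # innovation
        S = H @ P_pred @ H.T + R             # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
        x_new = x_pred + K @ y
        P_new = (np.eye(2) - K @ H) @ P_pred
        return x_new, P_new

    # Run the recursion on a few synthetic measurements.
    x, P = np.zeros(2), np.eye(2)
    for z in [np.array([0.9]), np.array([2.1]), np.array([2.8])]:
        x, P = kalman_step(x, P, z)
    print(x)  # filtered [position, velocity] estimate

Under the linear Gaussian assumptions, this recursion is the exact optimal Bayesian filter; the nonlinear variants discussed above replace the predict and correct steps with function or distribution approximations.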
The second part of the book is dedicated to machine learning-based filtering algorithms. Basic learning algorithms, deep learning architectures, and variational inference are reviewed to lay the groundwork for such algorithms. Different deep learning-based filters have been developed that deploy supervised or unsupervised learning. These filters include the deep Kalman filter, the backpropagation Kalman filter, the differentiable particle filter, the deep Rao–Blackwellized particle filter, the deep variational Bayes filter, the Kalman variational autoencoder, and the deep variational information bottleneck. The Wasserstein distributionally robust Kalman filter and hierarchical invertible neural transport are presented in addition to the mentioned filtering algorithms. Expectation maximization allows for joint state and parameter estimation. Different variants of the expectation-maximization algorithm are implemented using particles, Gaussian mixture models, deep neural networks, relational deep neural networks, variational filters, and amortized variational filters. Variational inference and reinforcement learning can be viewed as instances of a generic expectation-maximization problem. As a result, (deep) reinforcement learning methods can be used to develop novel filtering algorithms. Finally, the book covers nonparametric Bayesian models. In addition to reviewing measure-theoretic probability concepts and the notions of exchangeability, posterior computability, and algorithmic sufficiency, guidelines are provided for constructing nonparametric Bayesian models from parametric ones.
This book reviews a wide range of applications of classic and machine learning-based filtering algorithms, including the COVID-19 pandemic, influenza incidence, prediction of drug effects, robotics, information fusion, augmented reality, battery state-of-charge estimation for electric vehicles, autonomous driving, target tracking, urban traffic networks, cybersecurity and optimal power flow in power systems, single-molecule fluorescence microscopy, and finance.
P. Setoodeh, S. Habibi, and S. Haykin
Hamilton, Ontario, Canada
January 2022
Acknowledgments
We would like to express our deepest gratitude to several colleagues who helped us in one form or another while writing this book: Dr. Mehdi Fatemi, Dr. Pouya Dehghani Tafti, Dr. Ehsan Taghavi, Dr. Andrew Gadsden, Dr. Hamed Hossein Afshari, Dr. Mina Attari, Dr. Dhafar Al-Ani, Dr. Ulaş Güntürkün, Dr. Yanbo Xue, Dr. Ienkaran Arasaratnam, Dr. Mohammad Al-Shabi, Dr. Alireza Khayatian, Dr. Ali Akbar Safavi, Dr. Ebrahim Farjah, Dr. Paknoosh Karimaghaee, Dr. Mohammad Ali Masnadi-Shirazi, Dr. Mohammad Eghtesad, Dr. Majid Rostami-Shahrbabaki, Dr. Zahra Kazemi, Dr. Farshid Naseri, Dr. Zahra Khatami, Dr. Mohsen Mohammadi, Dr. Thiagalingam Kirubarajan, Dr. Stergios Roumeliotis, Dr. Magnus Norgaard, Dr. Eric Foxlin, Dr. Maryam Dehghani, Dr. Mohammad Mehdi Arefi, Dr. Mohammad Hassan Asemani, Dr. Mohammad Mohammadi, Dr. Mehdi Allahbakhshi, Dr. Haidar Samet, Dr. Mohammad Rastegar, Dr. Behrooz Zaker, Dr. Ali Reza Seifi, Dr. Mahdi Raoofat, Dr. Jun Luo, and Dr. Steven Hockema.
Last but by no means least, we would like to thank our families. Their endless support, encouragement, and love have always been a source of energy for us.
P. Setoodeh, S. Habibi, and S. Haykin
Acronyms
Backprop KF   backpropagation Kalman filter
BMS           battery management systems
CKF           cubature Kalman filter
CNN           convolutional neural network
CRLB          Cramér–Rao lower bound
DDF           divided-difference filter
DKF           deep Kalman filter
DRBPF         deep Rao–Blackwellized particle filter
DVBF          deep variational Bayes filter
DVIB          deep variational information bottleneck
EKF           extended Kalman filter
ELBO          evidence lower bound
EM            expectation maximization
FATE          fairness, accountability, transparency, and ethics
GAN           generative adversarial network
GRU           gated recurrent unit
HINT          hierarchical invertible neural transport
IB            information bottleneck
IMM           interacting multiple model
IS            importance sampling
KF            Kalman filter
KLD           Kullback–Leibler divergence
KVAE          Kalman variational autoencoder
LSTM          long short-term memory
LTI           linear time-invariant
LTV           linear time-varying
MAP           maximum a posteriori
MCMC          Markov chain Monte Carlo
MDP           Markov decision process
ML            maximum likelihood
MMSE          minimum mean square error
MPF           marginalized particle filter
N-EM          neural expectation maximization
NIB           nonlinear information bottleneck
NLL           negative log likelihood
PCRLB         posterior Cramér–Rao lower bound
PDF           probability distribution function
P-EM          particle expectation maximization
PF            particle filter
PID           proportional-integral-derivative
POMDP         partially observable Markov decision process
RBPF          Rao–Blackwellized particle filter
ReLU          rectified linear unit
R-NEM         relational neural expectation maximization
RNN           recurrent neural network
SGVB          stochastic gradient variational Bayes
SIR           sampling importance resampling
SLAM          simultaneous localization and mapping
SMAUG         single molecule analysis by unsupervised Gibbs sampling
SMC           sequential Monte Carlo
SoC           state of charge
SoH           state of health
SVSF          smooth variable-structure filter
TD learning   temporal-difference learning
UIO           unknown-input observer
UKF           unscented Kalman filter
VAE           variational autoencoder
VFEM          variational filtering expectation maximization
VSC           variable-structure control
wILI          weighted influenza-like illness
1 Introduction
1.1 State of a Dynamic System
In many branches of science and engineering, deriving a probabilistic model for sequential data plays a key role. System theory provides guidelines for studying the underlying dynamics of sequential data (time series). In describing a dynamic system, the notion of state is a key concept [1].
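As a simple illustration of this idea (an assumed example, not one taken from the book), consider a pendulum: its angle and angular velocity form a state, since knowing both at the current time step is enough to predict the entire future trajectory, regardless of how the pendulum arrived at that state. A minimal sketch:

    import numpy as np

    # Assumed discrete-time nonlinear state-space model of a pendulum.
    # State x = [angle, angular velocity]; only the angle is observed.
    g, L_pend, dt = 9.81, 1.0, 0.01

    def f(x):
        # State-transition function: one Euler step of the dynamics.
        theta, omega = x
        return np.array([theta + dt * omega,
                         omega - dt * (g / L_pend) * np.sin(theta)])

    def h(x):
        # Measurement function: the observable output of the system.
        return x[0]

    # The state at step k summarizes the entire past: iterating f from x
    # reproduces the future trajectory without any further history.
    x = np.array([0.5, 0.0])
    for k in range(3):
        x = f(x)
        print(h(x))

The pair of functions f and h is exactly the kind of state-space description that the filters in this book operate on: f propagates the state forward in time, and h relates the hidden state to what can actually be measured.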