Abstract: This book generalizes and extends the available theory in robust and decentralized hypothesis testing. In particular, it presents a robust test for modeling errors which is independent of the assumptions that a sufficiently large number of samples is available and that the distance is the KL-divergence. Here, the distance can be chosen from a much more general model, which includes the KL-divergence as a special case. This framework is then extended in several directions. A minimax robust test that is robust against both outliers and modeling errors is presented. Minimax robustness properties of the given tests are also explicitly proven for fixed sample size and sequential probability ratio tests. The theory of robust detection is extended to robust estimation, and the theory of robust distributed detection is extended to classes of distributions that are not necessarily stochastically bounded. It is shown that the quantization functions for the decision rules can also be chosen to be non-monotone. Finally, the book describes the derivation of theoretical bounds in minimax decentralized hypothesis testing, which were previously unknown. As a timely report on the state of the art in robust hypothesis testing, this book is mainly intended for postgraduates and researchers in the fields of electrical and electronic engineering, statistics, and applied probability. Moreover, it may be of interest to students and researchers working in the fields of classification, pattern recognition, and cognitive radio.
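As an illustrative sketch of the minimax formulation underlying such robust tests (the symbols F_j, G_j, epsilon_j, P_e, and D are notational assumptions for this summary, not taken from the book): the test delta minimizes the worst-case error probability over distance neighborhoods of the nominal distributions,

\[
\min_{\delta}\;\max_{(G_0,\,G_1)\,\in\,\mathcal{F}_0\times\mathcal{F}_1} P_e(\delta;\,G_0,\,G_1),
\qquad
\mathcal{F}_j=\{\,G_j : D(G_j\,\|\,F_j)\le\varepsilon_j\,\},\quad j\in\{0,1\},
\]

where \(F_0, F_1\) are the nominal distributions, \(\varepsilon_j\) are robustness parameters sizing the uncertainty classes, and \(D\) is a distance on distributions that includes the KL-divergence as a special case.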