Opportunistic Detection Rules: Finite and Asymptotic Analysis


Abstract

Opportunistic detection rules (ODRs) are variants of fixed-sample-size detection rules in which the statistician is allowed to opportunistically declare the alternative hypothesis early, based on the sequentially observed samples. From a sequential decision perspective, ODRs can also be viewed as mixtures of one-sided and truncated sequential detection rules. Several results regarding ODRs are established in this paper. In the finite regime, the maximum sample size is modeled either as a fixed finite number or as a geometric random variable with a fixed finite mean, and the corresponding Bayesian formulations are investigated for both cases. The former case is a slight variation of the well-known finite-length sequential hypothesis testing procedure in the literature, whereas the latter case is new; there, the Bayesian optimal ODR is shown to be a sequence of likelihood ratio threshold tests with two different thresholds: a running threshold, determined by solving a stationary state equation, is used while future samples are still available, and a terminal threshold (simply the ratio between the priors, scaled by the decision costs) is used when the statistician reaches the final sample and thus must decide immediately. In the asymptotic regime, the tradeoff among the exponents of the false alarm and miss error probabilities and the normalized expected stopping time under the alternative hypothesis is completely characterized and proved to be tight via an information-theoretic argument. Within the tradeoff region, one noteworthy fact is that the performance promised by the Stein-Chernoff Lemma is attainable by ODRs.
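To make the two-threshold structure concrete, the following is a minimal sketch, not the paper's exact procedure: a cumulative likelihood-ratio test that may declare the alternative hypothesis early against a running threshold, and makes a forced decision against a terminal threshold at the final sample. The Gaussian densities, the numerical running threshold, and the cost/prior labels (pi0, pi1, c_false_alarm, c_miss) are illustrative assumptions; in particular, the abstract only states that the running threshold solves a stationary state equation, so here it is supplied as a placeholder parameter.

```python
import numpy as np


def run_odr(samples, p0_pdf, p1_pdf, running_threshold, terminal_threshold):
    """Two-threshold likelihood-ratio ODR sketch (illustrative only).

    While future samples remain, only an early decision for H1 is allowed,
    using the running threshold; at the final sample a forced decision
    between H0 and H1 is made using the terminal threshold.
    Returns (decision, stopping_time).
    """
    log_lr = 0.0
    n = len(samples)
    for t, x in enumerate(samples, start=1):
        # Cumulative log-likelihood ratio of H1 versus H0.
        log_lr += np.log(p1_pdf(x)) - np.log(p0_pdf(x))
        if t < n and log_lr >= np.log(running_threshold):
            return "H1", t  # opportunistic early decision on the alternative
    # Terminal sample reached: decide immediately via the terminal threshold.
    return ("H1", n) if log_lr >= np.log(terminal_threshold) else ("H0", n)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical Gaussian pair: H0 ~ N(0, 1), H1 ~ N(1, 1).
    p0 = lambda x: np.exp(-0.5 * x ** 2) / np.sqrt(2 * np.pi)
    p1 = lambda x: np.exp(-0.5 * (x - 1.0) ** 2) / np.sqrt(2 * np.pi)
    # Terminal threshold taken as the ratio of priors scaled by costs
    # (the standard Bayesian likelihood-ratio threshold; values assumed here).
    pi0, pi1, c_false_alarm, c_miss = 0.5, 0.5, 1.0, 1.0
    terminal = (pi0 * c_false_alarm) / (pi1 * c_miss)
    running = 20.0  # placeholder for the stationary-state-equation solution
    data = rng.normal(loc=1.0, scale=1.0, size=30)  # samples drawn under H1
    print(run_odr(data, p0, p1, running, terminal))
```

In this sketch the one-sided character of the ODR is visible: before the horizon, the rule can only stop in favor of the alternative, which is what allows an early saving in expected stopping time under that hypothesis.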
