Chao Wang, Qing Zhao, and Kobi Cohen
2017 IEEE International Symposium on Information Theory (ISIT)
The problem of detecting a few anomalous processes among a large number M of processes is considered. At each time, aggregated observations can be taken from a chosen subset of processes, where the chosen subset conforms to a given binary tree structure. The random observations are i.i.d. over time with a general distribution that may depend on the size of the chosen subset and the number of anomalous processes in the subset. The objective is a sequential search strategy that minimizes the sample complexity (i.e., the expected number of observations, which represents detection delay) subject to a reliability constraint. A sequential test that results in a biased random walk on the tree is developed and is shown to be asymptotically optimal in terms of detection accuracy. Furthermore, it achieves the optimal logarithmic-order sample complexity in M, provided that the Kullback-Leibler divergences between aggregated observations in the presence and in the absence of anomalous processes are bounded away from zero at all levels of the tree structure as M approaches infinity. Sufficient conditions on the rate at which the aggregated observations decay to pure noise, under which a sublinear scaling in M is preserved, are also identified for the Bernoulli case.
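The abstract's search strategy can be illustrated with a minimal simulation sketch. It is not the paper's algorithm: it assumes a single anomalous process, a complete binary tree over M = 2^d leaves, and Bernoulli aggregated observations with parameter p1 when the queried subtree contains the anomaly and p0 otherwise (the paper allows general, subset-dependent distributions). Short log-likelihood-ratio tests bias the walk toward the anomalous leaf, negative evidence sends it back up, and a final SPRT-style check at a leaf controls the declaration. The names `search_anomaly`, `n_samples`, and `threshold` are illustrative choices, not the paper's notation.

```python
import math
import random

def search_anomaly(M, anomaly, p0=0.1, p1=0.9, n_samples=8, threshold=5.0, seed=0):
    """Biased random walk on a complete binary tree over M = 2**d leaves.

    Illustrative sketch only: an aggregated observation from a subtree is
    Bernoulli(p1) if the subtree contains the anomalous leaf `anomaly`
    and Bernoulli(p0) otherwise. Returns (declared_leaf, total_samples).
    """
    rng = random.Random(seed)
    d = M.bit_length() - 1
    assert 1 << d == M, "M must be a power of two in this sketch"

    # Log-likelihood-ratio increment for a single Bernoulli observation.
    llr_step = {1: math.log(p1 / p0), 0: math.log((1 - p1) / (1 - p0))}

    def observe(lo, hi):
        # One aggregated sample from the subtree covering leaves [lo, hi).
        p = p1 if lo <= anomaly < hi else p0
        return 1 if rng.random() < p else 0

    # The current node is the leaf interval [lo, hi); start at the root.
    lo, hi, total = 0, M, 0
    path = []  # stack of ancestor intervals, used to walk back up
    while True:
        if hi - lo == 1:
            # At a leaf: run an SPRT-style test until the evidence is strong.
            s = 0.0
            while abs(s) < threshold:
                s += llr_step[observe(lo, hi)]
                total += 1
            if s > 0:
                return lo, total       # declare this leaf anomalous
            lo, hi = path.pop()        # false lead: step back to the parent
            continue
        # Check the current subtree; negative evidence biases the walk upward.
        s = sum(llr_step[observe(lo, hi)] for _ in range(n_samples))
        total += n_samples
        if s <= 0 and path:
            lo, hi = path.pop()
            continue
        # Test the left child and descend toward the side the evidence favors.
        mid = (lo + hi) // 2
        s = sum(llr_step[observe(lo, mid)] for _ in range(n_samples))
        total += n_samples
        path.append((lo, hi))
        lo, hi = (lo, mid) if s > 0 else (mid, hi)
```

Because each descent touches only one level of the tree, a typical run takes on the order of log M tests, matching the logarithmic-order sample complexity described above; wrong turns are self-correcting, since a subtree without the anomaly soon produces negative evidence and pushes the walk back up.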