Pros and Cons of Weight Pruning for Out-of-Distribution Detection: An Empirical Survey

Satoru Koda, Alon Zolfi, Edita Grolman, Asaf Shabtai, Ikuya Morikawa, Yuval Elovici

2023 International Joint Conference on Neural Networks (IJCNN), 1-10, 2023

Deep neural networks (DNNs) perform well on samples from the training distribution. However, DNNs deployed in the real world are exposed to out-of-distribution (OOD) samples, i.e., samples drawn from distributions that differ from the training distribution. OOD detection is indispensable for deployed DNNs, as OOD samples can cause unexpected behavior. This paper empirically explores the effectiveness of weight pruning of DNNs for OOD detection in a post-hoc setting (i.e., performing OOD detection on pretrained DNN models). We conduct experiments on image, text, and tabular datasets to thoroughly evaluate the OOD detection performance of weight-pruned DNNs. Our experimental results yield the following three novel findings: (i) Weight pruning improves OOD detection performance more significantly with a Mahalanobis distance-based detection approach, which performs OOD detection on …
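The Mahalanobis distance-based detection approach mentioned in finding (i) scores a sample by its distance, in a model's feature space, to the nearest class-conditional Gaussian fitted on in-distribution data. The sketch below illustrates the core scoring computation on generic feature vectors; the function names and the NumPy-only setup are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def fit_mahalanobis(features, labels):
    """Fit class-conditional Gaussians with a shared (tied) covariance
    from in-distribution feature vectors, as in Mahalanobis-based OOD detection."""
    classes = np.unique(labels)
    means = {c: features[labels == c].mean(axis=0) for c in classes}
    # Shared covariance: pool class-centered features across all classes.
    centered = np.vstack([features[labels == c] - means[c] for c in classes])
    cov = centered.T @ centered / len(features)
    precision = np.linalg.pinv(cov)  # pseudo-inverse for numerical safety
    return means, precision

def ood_score(x, means, precision):
    """OOD score for one feature vector: squared Mahalanobis distance to the
    closest class mean. Higher scores indicate more OOD-like samples."""
    dists = [float((x - m) @ precision @ (x - m)) for m in means.values()]
    return min(dists)

# Illustrative usage on synthetic 4-D features from two classes.
rng = np.random.default_rng(0)
feats = np.vstack([rng.normal(0.0, 1.0, (50, 4)),
                   rng.normal(5.0, 1.0, (50, 4))])
labels = np.array([0] * 50 + [1] * 50)
means, precision = fit_mahalanobis(feats, labels)
in_dist_score = ood_score(np.zeros(4), means, precision)   # near class 0
ood_sample_score = ood_score(np.full(4, 20.0), means, precision)  # far from both
```

In a post-hoc setting, `features` would be penultimate-layer activations of a (possibly weight-pruned) pretrained DNN; a threshold on the score then separates in-distribution from OOD inputs.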