Research on Image Out-of-Distribution Detection: A Review
GUO Lingyun1,2,3, LI Guohe1,2, GONG Kuangfeng1,2, XUE Zhan'ao3
1. College of Information Science and Engineering, China University of Petroleum, Beijing 102249; 2. Beijing Key Laboratory of Petroleum Data Mining, China University of Petroleum, Beijing 102249; 3. College of Computer and Information Engineering, Henan Normal University, Xinxiang 453007
Abstract: Classifier learning assumes that the training data and the test data are independent and identically distributed. Because this assumption is overly stringent, classifiers often misrecognize out-of-distribution samples. Therefore, thorough research on out-of-distribution (OOD) detection is of great importance. Firstly, the definition of OOD detection and related research are introduced. A comprehensive overview of supervised, semi-supervised, unsupervised and outlier exposure detection methods is provided according to differences in network training strategies. Then, existing OOD detection methods are summarized from the perspective of three key technologies: neural network classifiers, metric learning and deep generative models. Finally, research trends in OOD detection are discussed.