Abstract: The sensitivity of an Adaline to weight perturbation is discussed. Considering the discrete nature of an Adaline's inputs and output, the sensitivity is defined as the probability that the Adaline's output is inverted by a weight perturbation, taken over all possible inputs. Based on a hypersphere model and analytical-geometry techniques, a method is proposed for approximately computing this sensitivity. When the input dimension is sufficiently high, the method outperforms existing approaches: it greatly reduces computational complexity with little sacrifice in precision, making the sensitivity measure more practical.
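The definition above can be illustrated with a minimal Monte Carlo sketch (not the paper's analytical hypersphere method, which computes the probability geometrically): an Adaline here is a hard-limited weighted sum over bipolar inputs, and the sensitivity is estimated as the fraction of sampled inputs whose output flips under a given weight perturbation. All function names and the sampling scheme are illustrative assumptions.

```python
import numpy as np

def adaline_output(x, w):
    # Adaline: hard-limit (sign) activation on the weighted sum
    return 1 if np.dot(w, x) >= 0 else -1

def sensitivity_monte_carlo(w, dw, n_dim, n_samples=10_000, rng=None):
    """Estimate the probability that the Adaline's output is inverted
    by the weight perturbation dw, over bipolar inputs sampled
    uniformly at random (a stand-in for 'all possible inputs')."""
    rng = rng if rng is not None else np.random.default_rng(0)
    flips = 0
    for _ in range(n_samples):
        x = rng.choice([-1.0, 1.0], size=n_dim)  # bipolar input vector
        if adaline_output(x, w) != adaline_output(x, w + dw):
            flips += 1
    return flips / n_samples
```

Enumerating all 2^n bipolar inputs gives the exact probability but costs exponential time, which is precisely the computational burden the paper's approximate geometric method avoids.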