A Model Selection Method for Influence Diagrams Based on the PSEM Algorithm and BP Neural Network
YAO HongLiang, ZHANG YouSheng, WANG Hao, WANG RongGui
Department of Computer Science and Technology, Hefei University of Technology, Hefei 230009, China
Abstract In the model selection of influence diagrams (IDs), three problems are addressed: data dependency, computational complexity, and the non-probabilistic (utility) relations. Based on the structural decomposition of IDs, a PSEM algorithm is presented. A BP neural network is introduced to learn the local utility function of each utility node, and overfitting is avoided by imposing a threshold on the network weights. To reduce data dependency, a new MDL score is presented that incorporates prior knowledge of network structures. Building on the SEM algorithm, PSEM adopts this new MDL score and separates parameter learning from structure scoring to improve computational efficiency. Compared with the SEM algorithm, PSEM improves both computational complexity and data dependency, and model selection for the utility part becomes easy to achieve.
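The abstract does not spell out the new MDL score; as a hedged sketch, a standard MDL score augmented with a structure-prior term (the assumed extension here) for a candidate structure $G$ over a data set $D$ of $N$ cases would take the form

$\mathrm{MDL}(G \mid D) = -\log P(G) - \mathrm{LL}(G \mid D) + \frac{\log N}{2}\,\mathrm{Dim}(G)$

where $-\log P(G)$ encodes the prior knowledge of network structures, $\mathrm{LL}(G \mid D)$ is the log-likelihood of the data under the maximum-likelihood parameters, and $\mathrm{Dim}(G)$ counts the free parameters of $G$. Lower scores are better, and the prior term lets structures favored by domain knowledge survive on less data, which is how such a score reduces data dependency; the paper's exact formulation may differ.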
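The BP-network component can likewise be illustrated with a minimal sketch, assuming a one-hidden-layer network trained by gradient descent on squared error, with near-zero weights pruned after each epoch as the weight-threshold regularizer. The function name train_utility_net, the tanh/linear architecture, and the pruning rule controlled by w_min are illustrative assumptions, not the paper's implementation.

import numpy as np

rng = np.random.default_rng(0)

def train_utility_net(X, y, hidden=8, lr=0.05, epochs=500, w_min=1e-3):
    """X: (n_samples, n_parents) parent configurations; y: observed utilities."""
    n, d = X.shape
    W1 = rng.normal(0.0, 0.5, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)                    # hidden layer
        u = h @ W2 + b2                             # linear utility output
        err = u - y.reshape(-1, 1)                  # residual for squared error
        gW2 = h.T @ err / n; gb2 = err.mean(axis=0)
        dh = (err @ W2.T) * (1.0 - h ** 2)          # backprop through tanh
        gW1 = X.T @ dh / n; gb1 = dh.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
        # weight threshold (assumed scheme): prune near-zero weights
        # after each epoch to limit overfitting
        W1[np.abs(W1) < w_min] = 0.0
        W2[np.abs(W2) < w_min] = 0.0
    return lambda Xq: np.tanh(Xq @ W1 + b1) @ W2 + b2

# toy usage: recover a noisy additive utility over two parent variables
X = rng.uniform(-1.0, 1.0, (200, 2))
y = 0.7 * X[:, 0] - 0.3 * X[:, 1] + rng.normal(0.0, 0.05, 200)
u_hat = train_utility_net(X, y)
print(float(np.mean((u_hat(X).ravel() - y) ** 2)))  # small training MSE

In a PSEM-style loop, one such network per utility node would be refit once per structural iteration, so utility learning stays separated from the structure-scoring step.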
Received: 17 February 2006