Abstract: Traditional learning vector quantization (LVQ) algorithms do not account for differences in the importance of individual dimensions. To address this, a weighted LVQ algorithm is presented. In the proposed algorithm, a set of additional weights is introduced for each neuron to indicate the importance of its respective dimensions. The weights are updated adaptively according to the fitness of their corresponding neuron over the training iterations. The update thresholds and step size are determined from the mean distance over all dimensions; relying on this mean value yields better stability and allows the weights to be updated without normalization. Six well-known datasets from the UCI Machine Learning Repository are used to evaluate the proposed weighted LVQ (WLVQ) algorithm. The experimental results show that the proposed method provides insight into the role of the data dimensions, especially through the local weights, and achieves superior performance in recognition rate, stability, and computational complexity.
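Since the abstract only outlines the idea, the following is a minimal sketch of what a WLVQ training step could look like, assuming an LVQ1-style prototype update, a weighted Euclidean distance, and a heuristic per-dimension weight rule that compares each dimension's distance to the mean distance over all dimensions. The exact fitness measure, thresholds, and step size used in the paper are not given here, so the names and the update rule below (`wlvq_train`, `w_step`, the sign-based weight adjustment) are illustrative assumptions rather than the authors' method.

```python
# Sketch of a weighted LVQ (WLVQ) variant with per-neuron dimension weights.
# Assumptions: LVQ1-style prototype update and a heuristic weight rule based
# on the mean per-dimension distance; not the paper's exact algorithm.
import numpy as np

def wlvq_train(X, y, n_prototypes_per_class=1, lr=0.05, w_step=0.01,
               epochs=30, seed=0):
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    # Initialize prototypes from randomly chosen samples of each class.
    protos, proto_labels = [], []
    for c in classes:
        idx = rng.choice(np.flatnonzero(y == c), n_prototypes_per_class,
                         replace=False)
        protos.append(X[idx])
        proto_labels.extend([c] * n_prototypes_per_class)
    protos = np.vstack(protos).astype(float)
    proto_labels = np.array(proto_labels)
    # One weight vector per prototype (per-neuron, per-dimension importance).
    weights = np.ones_like(protos)

    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            x, label = X[i], y[i]
            diff2 = (protos - x) ** 2                       # per-dimension squared distances
            dists = np.sqrt((weights * diff2).sum(axis=1))  # weighted distances to prototypes
            k = np.argmin(dists)                            # winning prototype (neuron)
            sign = 1.0 if proto_labels[k] == label else -1.0
            # LVQ1-style prototype update: attract on correct class, repel otherwise.
            protos[k] += sign * lr * (x - protos[k])
            # Heuristic weight update (assumption): dimensions whose distance falls
            # below the mean over all dimensions are treated as more relevant and
            # their weights are increased on a correct win, decreased otherwise.
            mean_d = diff2[k].mean()
            weights[k] += sign * w_step * np.where(diff2[k] < mean_d, 1.0, -1.0)
            weights[k] = np.clip(weights[k], 1e-3, None)    # keep weights positive
    return protos, proto_labels, weights

def wlvq_predict(X, protos, proto_labels, weights):
    # Classify each sample by its nearest prototype under the weighted distance.
    d = np.sqrt((weights[None] * (protos[None] - X[:, None]) ** 2).sum(axis=2))
    return proto_labels[np.argmin(d, axis=1)]
```

Keeping a separate weight vector per neuron, rather than one global vector, is what makes the learned relevances local: each prototype can emphasize the dimensions that best separate its own region of the input space.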