Wenzao Ursuline University of Languages W-Portfolio

Research Data Details

Paper title: Constrained-storage variable-branch neural tree for classification
Publication date: 2018-01-05
Indexing category: SCI
All authors: 楊雄斌
Author order: First author
Corresponding author:
Journal name: Neural Computing and Applications
Volume:
Peer-reviewed:
Issue:
Country/region of publication: NATUSA - United States
Publication year: 2011
Publication month: 12
Publication format: Electronic journal
Associated project:
Publicly available document:
Attachment: NCA_revisedII.pdf


[Abstract]:
In this study, the constrained-storage variable-branch neural tree (CSVBNT) is proposed for pattern classification. In the CSVBNT, each internal node is designed as a single-layer neural network (SLNN) that classifies the input samples. A genetic algorithm (GA) is proposed to search for the proper number of output nodes in the output layer of each SLNN. Furthermore, because of the storage constraint, a growing method is proposed to determine which node has the highest priority to split in the CSVBNT. The growing method selects the node to split according to the classification error rate and computing complexity of the CSVBNT. In the experiments, the CSVBNT achieves a lower classification error rate than other NTs when they have the same computing time.
Keywords: neural trees, neural networks, genetic algorithms.
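
[Illustrative code sketch]:
The abstract describes three mechanisms: SLNN internal nodes that route samples to branches, a genetic algorithm that searches for the number of output nodes of each SLNN, and a storage-constrained growing rule that selects the next node to split from its error rate and computing complexity. The Python sketch below is only a minimal illustration of those ideas under assumed names and formulas (SLNNNode, ga_search_n_outputs, choose_node_to_split, and the fitness and priority expressions are all invented here); it is not the paper's implementation.

# Illustrative sketch only (assumed names and formulas), not the authors' code.
import numpy as np

rng = np.random.default_rng(0)

class SLNNNode:
    """Internal node: a single-layer neural network with one output per branch."""
    def __init__(self, n_inputs, n_outputs):
        self.W = rng.normal(scale=0.1, size=(n_outputs, n_inputs))
        self.children = [None] * n_outputs   # branches are grown on demand

    def route(self, x):
        # Winner-take-all: the most active output node selects the branch.
        return int(np.argmax(self.W @ x))

def fitness(n_outputs, error_rate):
    # Assumed GA fitness: trade classification error against branch count
    # (more output nodes mean more storage and computation).
    return -(error_rate + 0.01 * n_outputs)

def ga_search_n_outputs(error_fn, low=2, high=8, pop=6, generations=10):
    """Toy GA over the number of SLNN output nodes (a single integer gene)."""
    population = rng.integers(low, high + 1, size=pop)
    for _ in range(generations):
        scores = np.array([fitness(n, error_fn(n)) for n in population])
        parents = population[np.argsort(scores)[-2:]]                            # selection
        child = int(np.clip(parents.mean() + rng.integers(-1, 2), low, high))    # crossover + mutation
        population[np.argmin(scores)] = child                                    # replace the worst
    return int(population[np.argmax([fitness(n, error_fn(n)) for n in population])])

def choose_node_to_split(leaves, storage_left):
    """Growing step: highest priority = largest error per unit of extra computing
    cost, among leaves whose extra storage still fits the constraint."""
    feasible = [l for l in leaves if l["extra_storage"] <= storage_left]
    if not feasible:
        return None
    return max(feasible, key=lambda l: l["error_rate"] / (1.0 + l["extra_cost"]))["id"]

if __name__ == "__main__":
    node = SLNNNode(n_inputs=4, n_outputs=3)
    print("sample routed to branch", node.route(rng.normal(size=4)))
    # Pretend the error falls slowly as branches are added.
    print("GA-chosen branch count:", ga_search_n_outputs(lambda n: 0.5 / n))
    leaves = [{"id": 0, "error_rate": 0.30, "extra_cost": 2.0, "extra_storage": 5},
              {"id": 1, "error_rate": 0.10, "extra_cost": 0.5, "extra_storage": 3},
              {"id": 2, "error_rate": 0.25, "extra_cost": 4.0, "extra_storage": 8}]
    print("next node to split:", choose_node_to_split(leaves, storage_left=6))

In this toy setting the GA balances residual error against the cost of extra branches, and the growing rule skips the leaf whose split would exceed the remaining storage even if its error rate is high.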

[References]:
[1] Gelfand, S. B., Ravishankar, C. S., and Delp, E. J. (1991). An Iterative Growing and Pruning Algorithm for Classification Tree Design. IEEE Trans. Pattern Analysis and Machine Intell., 13(2), 163-174.
[2] Yildiz, O. T. and Alpaydin, E. (2001). Omnivariate Decision Trees. IEEE Trans. Neural Networks, 12(6), 1539-1546.
[3] Zhao, H. and Ram, S. (2004). Constrained Cascade Generalization of Decision Trees. IEEE Trans. Knowledge and Data Engineering, 16(6), 727-739.
[4] Martínez-Muñoz, G. and Suárez, A. (2004). Using All Data to Generate Decision Tree Ensembles. IEEE Trans. Systems, Man, and Cybernetics, Part C: Applications and Reviews, 34(4), 393-397.
[5] Pedrycz, W. and Sosnowski, Z. A. (2005). C-fuzzy Decision Trees. IEEE Trans. Systems, Man, and Cybernetics, Part C: Applications and Reviews, 35(4), 498-511.
[6] Wang, X. B., Chen, G. Q., and Ye, F. (2000). On the Optimization of Fuzzy Decision Trees. Fuzzy Sets and Systems, 112(3), 117-125.
[7] Deffuant, G., Neural units recruitment algorithm for generation of decision trees, Proceedings of the International Joint Conference on Neural Networks, 1 (1990) 637–642.
[8] Lippmann, R., An introduction to computing with neural nets, IEEE Acoustics, Speech, and Signal Processing Magazine, 4(2) (1987) 4–22.
[9] Sankar, A., & Mammone, R., Neural tree networks. In Neural network: theory and application, San Diego, CA, USA: Academic Press Professional, Inc. 1992, pp. 281–302.
[10] Sethi, I. K. and Yoo, J., Structure-driven induction of decision tree classifiers through neural learning, Pattern Recognition. 30(11) (1997) 1893–1904.
[11] Sirat, J., & Nadal, J., Neural trees: a new tool for classification, Neural Network. 1 (1990) 423–448.
[12] T. Li, Y. Y. Tang, and F. Y. Fang, A structure-parameter-adaptive (SPA) neural tree for the recognition of large character set, Pattern Recognit. 28(3) (1995) 315–329.
[13] M. Zhang and J. Fulcher, Face recognition using artificial neural networks group-based adaptive tolerance (GAT) trees, IEEE Trans. Neural Networks. 7 (1996) 555–567.
[14] G. L. Foresti and G. G. Pieroni, Exploiting neural trees in range image understanding, Pattern Recognit. Lett. 19(9) (1998) 869–878.
[15] H. H. Song and S. W. Lee, A self-organizing neural tree for large-set pattern classification, IEEE Trans. Neural Networks, 9 (1998) 369–380.
[16] G. L. Foresti, Outdoor scene classification by a neural tree based approach, Pattern Anal. Applic. 2 (1999) 129–142.
[17] H. Guo and S. B. Gelfand, Classification trees with neural network feature extraction, IEEE Trans. Neural Networks, (1992) 923–933.
[18] G. L. Foresti, An adaptive high-order neural tree for pattern recognition, IEEE Trans. Systems, Man, and Cybernetics, Part B: Cybernetics, 34 (2004) 988-996.
[19] C. L. Giles and T. Maxwell, Learning, invariance, and generalization in high-order neural networks, Applied Optics, 26 (1987) 4972–4978.
[20] P. Maji, Efficient design of neural network tree using a single splitting criterion, Neurocomputing, 71 (2008) 787–800.
[21] P. E. Utgoff, Perceptron tree: a case study in hybrid concept representation, Proc. VII Nat. Conf. Artificial Intelligence, (1988) 601–605.
[22] J. A. Sirat and J. P. Nadal, Neural tree: a new tool for classification, Network. 1 (1990) 423–438.
[23] G. L. Foresti and C. Micheloni, Generalized neural trees for pattern classification, IEEE Trans. Neural Networks. 13 (2002) 1540–1547.
[24] C. Micheloni, A. Rani, S. Kumar, and G. L. Foresti, A balanced neural tree for pattern classification, Neural Networks, 27 (2012) 81-90.
[25] D. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning. Reading, MA: Addison-Wesley, 1989.
[26] J. Koza, Genetic Programming. Cambridge, MA: MIT Press, 1992.
[27] S. Grossberg, Ed., Neural Networks and Natural Intelligence. Cambridge, MA: MIT Press, 1988.
[28] D. Rumelhart and J. McClelland, Eds., Parallel Distributed Processing: Explorations in Microstructure of Cognition. Cambridge, MA: MIT Press, 1986.
[29] J. M. Zurada, Introduction to Artificial Neural Systems. St. Paul, MN: West, 1992.
[30] P. J. Angeline, G. M. Saunders, and J. B. Pollack, “An evolutionary algorithm that constructs recurrent neural networks,” IEEE Trans. Neural Netw., vol. 5, no. 1, pp. 54–64, Jan. 1994.
[31] Mahmoudabadi H., Izadi M., Menhaj M.B. A hybrid method for grade estimation using genetic algorithm and neural networks. Computational Geosciences. 2009;13:91–101.
[32] Samanta B., Bandopadhyay S., Ganguli R. Data segmentation and genetic algorithms for sparse data division in Nome placer gold grade estimation using neural network and geostatistics. Mining Exploration Geology. 2004;11(1–4):69–76.
[33] Chatterjee S., Bandopadhyay S., Machuca D. Ore grade prediction using a genetic algorithm and clustering based ensemble neural network model. Mathematical Geosciences. 2010;42(3):309–326.
[34] Tahmasebi, P. and Hezarkhani, A. (2009). Application of optimized neural network by genetic algorithm. In Proceedings of IAMG09, Stanford University, California.
[35] J. Stallkamp, M. Schlipsing, J. Salmen, and C. Igel. The German Traffic Sign Recognition Benchmark: A multi-class classification competition. In International Joint Conference on Neural Networks, 2011.
[36] A. Krizhevsky. Learning multiple layers of features from tiny images. Master’s thesis, Computer Science Department, University of Toronto, 2009.
[37] R. C. Gonzalez and R. E. Woods. Digital Image Processing. Addison–Wesley, Boston, MA, 1992.