earticle


Multi Branch Decision Tree: A New Splitting Criterion

Abstract (English)

In this paper, a new splitting criterion for building a decision tree is proposed. The splitting criterion specifies the best splitting variable and its threshold for further splitting at each node of the tree. Borrowing the idea of the classical Forward Selection method and its enhanced versions, the variable having the largest absolute correlation with the target value is chosen as the best splitting variable in each node. Then, the SVM idea of maximizing the margin between classes is used to find the best threshold on the selected variable for classifying the data. This procedure executes recursively in each node until the leaf nodes are reached. The final decision tree has a comparably shorter height than those of previous methods, which effectively removes more useless variables and reduces the classification time for future data. Unclassified regions are also generated, which can be interpreted as either an advantage or a disadvantage of the proposed method. Simulation results demonstrate this improvement in the proposed decision tree.
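The two-step criterion described in the abstract (correlation-based variable selection, then an SVM-derived threshold) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `choose_split`, the use of scikit-learn's `SVC`, and the toy data are all assumptions made for the example; for a separable one-dimensional feature, the linear SVM's maximum-margin boundary `w*x + b = 0` gives the threshold `x = -b/w`.

```python
import numpy as np
from sklearn.svm import SVC

def choose_split(X, y):
    """Illustrative node-splitting rule (not the paper's exact code):
    pick the feature with the largest absolute Pearson correlation with
    the class label, then place the threshold at the maximum-margin
    boundary of a linear SVM fitted on that single feature."""
    # Absolute correlation of each feature with the target
    corrs = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]
    best = int(np.argmax(corrs))
    # 1-D linear SVM: decision function w*x + b = 0  ->  threshold x = -b/w
    svm = SVC(kernel="linear", C=1e3).fit(X[:, [best]], y)
    threshold = -svm.intercept_[0] / svm.coef_[0, 0]
    return best, threshold

# Toy data: feature 0 separates the two classes, feature 1 is pure noise
rng = np.random.default_rng(0)
X = np.column_stack([
    np.concatenate([rng.normal(0, 0.3, 50), rng.normal(3, 0.3, 50)]),
    rng.normal(0, 1, 100),
])
y = np.array([0] * 50 + [1] * 50)
var, thr = choose_split(X, y)
```

On this toy data the noisy feature is rejected and the threshold falls in the margin between the two class clusters; in the proposed tree the same rule would be applied recursively to the data routed to each child node.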

Table of Contents

Abstract
 1. Introduction
  1.1. Commonly Used Splitting Criteria
 2. Background
  2.1 Correlation Based Feature Selection
  2.2 Support Vector Machines
 3. The Proposed Method
  3.1 DT Generation Using Correlation Based Feature Selection and SVM Threshold Function
  3.2 Rule Extraction from Decision Tree
 4. Discussion
 5. Conclusion
 6. Future Work
 References

Author Information

  • Hadi Sadoghi Yazdi, Department of Computer Engineering, Ferdowsi University of Mashhad, Mashhad, Iran; Center of Excellence on Soft Computing and Intelligent Information Processing, Ferdowsi University of Mashhad, Mashhad, Iran
  • Nima Salehi-Moghaddami, Department of Computer Engineering, Ferdowsi University of Mashhad, Mashhad, Iran

