https://machinelearning1.quora.com/Decision-Tree-Classifier
• In a decision tree, every decision rule occurs at a decision node, with the rule creating branches leading to new nodes. One reason for the popularity of tree-based models is their interpretability. In fact, decision trees can literally be drawn out in their complete form to create a highly intuitive model.
• Decision tree learners attempt to find the decision rule that produces the greatest decrease in impurity at a node. While there are a number of impurity measures, by default `DecisionTreeClassifier` uses Gini impurity.
• This process of finding the decision rules that create splits to decrease impurity is repeated recursively until all leaf nodes are pure or some arbitrary cut-off is reached.
• One of the advantages of decision tree classifiers is that we can visualize the entire trained model, making decision trees one of the most interpretable models in machine learning.
• While this solution visualized a decision tree classifier, it can just as easily be used to visualize a decision tree regressor.
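The ideas above can be sketched with scikit-learn (assumed here, since the bullets reference `DecisionTreeClassifier`): Gini impurity for a node is 1 − Σ p_k², where p_k is the fraction of samples in class k, and the trained tree's rules can be printed in full with `export_text`.

```python
# Minimal sketch, assuming scikit-learn and its bundled iris dataset.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text


def gini(class_proportions):
    """Gini impurity: 1 - sum of squared class proportions.
    0.0 means the node is pure; higher means more mixed."""
    return 1.0 - sum(p ** 2 for p in class_proportions)


# A pure node has zero impurity; a 50/50 node has impurity 0.5.
print(gini([1.0, 0.0]))   # pure node
print(gini([0.5, 0.5]))   # maximally mixed two-class node

# Train a tree; "gini" is already the default criterion.
iris = load_iris()
clf = DecisionTreeClassifier(criterion="gini", max_depth=3, random_state=0)
clf.fit(iris.data, iris.target)

# Every decision rule lives at an internal node; leaves hold predictions.
# The whole trained model can be printed out, which is what makes
# decision trees so interpretable.
print(export_text(clf, feature_names=list(iris.feature_names)))
```

The same approach works for regression: swap in `DecisionTreeRegressor`, which splits on an impurity measure suited to continuous targets (mean squared error by default) rather than Gini.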