20.17 Build a Decision Tree Model

As seen from Rattle’s Log tab, the decision tree model is built using rpart::rpart(). Once the template variables (form, ds, tr, and vars) have been defined as in Section @ref(dtrees:sec:model_setup), we can use this template call to build the model:

library(rpart)
model <- rpart(formula=form, data=ds[tr, vars], model=TRUE)

This is essentially the same as the command used by Rattle except that some parameter settings are removed. These will be explored later.
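If the template variables are not already in place, the following sketch suggests one way they might be set up, assuming the weatherAUS dataset from rattle with variable names normalised to the forms seen in the output below (e.g., humidity_3pm); the specific names for the target, identifier, and risk variables here are assumptions, and the definitive setup is the one described in Section @ref(dtrees:sec:model_setup).

library(rattle)                            # Provides the weatherAUS dataset.

ds     <- weatherAUS                       # Dataset template variable.
names(ds) <- normVarNames(names(ds))       # Normalise the variable names.
target <- "rain_tomorrow"                  # Variable to be predicted (assumed name).
id     <- c("date", "location")            # Identifiers, not model inputs.
risk   <- "risk_mm"                        # Risk variable, not a model input (assumed name).
vars   <- setdiff(names(ds), c(id, risk))  # Modelling variables, including the target.
form   <- formula(paste(target, "~ ."))    # Model formula template.
nobs   <- nrow(ds)                         # Number of observations.
tr     <- sample(nobs, 0.70*nobs)          # Random 70% training sample.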

In the above call to rpart::rpart() we have named each of the arguments. If we have a look at the structure of rpart::rpart() we see that the arguments are in their expected order, and hence the use of the argument names formula= and data= is optional.

str(rpart)
## function (formula, data, weights, subset, na.action=na.rpart, method, 
##     model=FALSE, x=FALSE, y=TRUE, parms, control, cost, ...)

Whilst the argument names are optional, they can assist in reading the code, and so their use in function calls is encouraged.
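
For example, since formula and data are the first two arguments, the following positional call builds the same model, though it is less readable. Note that model= must still be named since it does not occupy one of the leading positions.

model <- rpart(form, ds[tr, vars], model=TRUE)   # Positional formula and data arguments.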

A textual presentation of the model is concise and informative, once we learn how to read it. Note that this tree is different from the one we saw previously, since here we use the much larger, full weather dataset, which includes multiple years of daily observations from many weather stations across Australia.

model
## n= 158807 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##  1) root 158807 34195 No (0.7846757 0.2153243)  
##    2) humidity_3pm< 71.5 132754 18543 No (0.8603206 0.1396794) *
##    3) humidity_3pm>=71.5 26053 10401 Yes (0.3992247 0.6007753)  
##      6) humidity_3pm< 82.5 14235  6521 No (0.5419038 0.4580962)  
##       12) rainfall< 0.85 7884  2748 No (0.6514460 0.3485540) *
##       13) rainfall>=0.85 6351  2578 Yes (0.4059203 0.5940797) *
##      7) humidity_3pm>=82.5 11818  2687 Yes (0.2273650 0.7726350) *

Refer to Section @ref(dtrees:sec:explain_read_tree) for an explanation of the format of the textual presentation of the decision tree.
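
As a quick check of the format, the class probabilities reported in parentheses (yprob) are simply the class counts at a node divided by the node's n. For the root node, with 158807 observations of which 34195 belong to the Yes class:

c(No=(158807-34195)/158807, Yes=34195/158807)    # Reproduce the root node's yprob values.
##        No       Yes 
## 0.7846757 0.2153243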


