21.17 Build a Decision Tree Model

As seen from Rattle’s Log tab, the decision tree model is built using rpart::rpart(). Once the template variables (form, ds, tr, and vars) have been defined as in Section @ref(dtrees:sec:model_setup), we can use this template call to build the model:

library(rpart)
model <- rpart(formula=form, data=ds[tr, vars], model=TRUE)

This is essentially the same as the command used by Rattle except that some parameter settings are removed. These will be explored later.
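To give a flavour of the parameters that are available, a more fully specified call might look like the following sketch. The particular settings here (a classification method, the information-gain splitting criterion, and the minsplit= and cp= control values, which happen to be rpart's own defaults) are illustrative choices rather than the exact settings Rattle records in its Log:

model <- rpart(formula=form,
               data=ds[tr, vars],
               method="class",
               parms=list(split="information"),
               control=rpart.control(minsplit=20, cp=0.01),
               model=TRUE)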

In the above call to rpart::rpart() we have named each of the arguments. If we have a look at the structure of rpart::rpart() we see that the arguments are supplied in their expected order, and hence the use of the argument names is optional.

str(rpart)
## function (formula, data, weights, subset, na.action = na.rpart, method, 
##     model = FALSE, x = FALSE, y = TRUE, parms, control, cost, ...)

Whilst the argument names are optional, they can assist in reading the code, and so the use of argument names in function calls is encouraged.
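For instance, since formula and data are the first two arguments of rpart::rpart(), the same model could be built with those two supplied positionally. The model= argument still needs to be named since it is not the third positional argument. This sketch simply illustrates the point; the fully named call above remains the preferred style:

model <- rpart(form, ds[tr, vars], model=TRUE)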

A textual presentation of the model is concise and informative, once we learn how to read it. Note that this tree differs from the previous one we have seen, since we are now using a much larger (the full) weather dataset, which includes multiple years of daily observations from many different weather stations across Australia.

model
## n= 134001 
## 
## node), split, n, loss, yval, (yprob)
##       * denotes terminal node
## 
##  1) root 134001 28166 No (0.7898075 0.2101925)  
##    2) humidity_3pm< 71.5 112640 15454 No (0.8628018 0.1371982) *
##    3) humidity_3pm>=71.5 21361  8649 Yes (0.4048968 0.5951032)  
##      6) humidity_3pm< 82.5 11741  5313 No (0.5474832 0.4525168)  
##       12) rainfall< 2.05 7728  2831 No (0.6336698 0.3663302) *
##       13) rainfall>=2.05 4013  1531 Yes (0.3815101 0.6184899) *
##      7) humidity_3pm>=82.5 9620  2221 Yes (0.2308732 0.7691268) *
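As a quick check on how to read these numbers, the root node's loss of 28166 (the observations that are not of the majority class No) out of 134001 is exactly the Yes proportion reported in the yprob pair:

28166/134001
## [1] 0.2101925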

Refer to Section @ref(dtrees:sec:explain_read_tree) for an explanation of the format of the textual presentation of the decision tree.


