```r
# Download the image
rosalba = image_read("https://upload.wikimedia.org/wikipedia/commons/a/aa/Rembrandt_Peale_-_Portrait_of_Rosalba_Peale_-_Google_Art_Project.jpg")
```
Example 1: Peale’s “Portrait of Rosalba Peale”
With our cropped image in hand, let’s walk through the 4-step recipe
from above.
Step 1. Convert the image into a data frame.
```r
# Coerce to data frame
rosalba_df = as.data.frame(rosalba)
```
Step 2. Split the image by RGB colour channel. This
is the cc column above, where 1=Red, 2=Green, and
3=Blue.
```r
rosalba_ccs = split(rosalba_df, rosalba_df$cc)
# We have a list of three DFs by colour channel. Uncomment if you want to see:
```
predictions) and trimming each tree to a maximum depth of 30 nodes. The
next code chunk takes about 15 seconds to run on my laptop, but should
be much quicker if you downloaded a lower-res image.
```r
## Start creating a regression tree for each color channel. We'll adjust some
## control parameters to give us the "right" amount of resolution in the final
## plots.
```
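The fitting call itself is elided in this excerpt. As a hedged sketch of what this step could look like, assuming the `rosalba_ccs` list from Step 2 and illustrative `rpart.control()` values (the surrounding text only specifies the maximum depth of 30; the `cp` value here is a guess to get fine-grained partitions):

```r
library(rpart)

# Fit one regression tree per colour channel, predicting pixel value from
# the x/y coordinates. maxdepth = 30 matches the cap mentioned in the text;
# the tiny cp value is illustrative, chosen to allow many fine splits.
trees = lapply(
  rosalba_ccs,
  function(dat) {
    rpart(
      value ~ x + y,
      data    = dat,
      control = rpart.control(cp = 2e-5, maxdepth = 30)
    )
  }
)
```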
construct our abstracted art piece. I was a bit glib about it earlier,
since it really involves a few sub-steps. First, let’s grab the
predictions for each of our trees.
```r
# Get predictions for each tree
pred = lapply(trees, predict)

# The pred object is a list, so we convert it to a vector before overwriting
# the value column of the original data frame
rosalba_df$value = do.call("c", pred)
```
parttree will enter the fray, since this is what we’ll
be using to highlight the partitioned areas of the downscaled pixels.
Here’s how we can do it using base R graphics.
```r
# Get a list of parttree data frames (one for each tree)
pts = lapply(trees, parttree)
```
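The base R plotting code itself is not reproduced in this excerpt. As a minimal sketch of the overlay step only, assuming the recoloured image has already been drawn on the current graphics device, and relying on the `xmin`/`xmax`/`ymin`/`ymax` columns that `parttree` data frames provide:

```r
# Trace each tree's partition rectangles on top of the existing image plot.
# Border colour and line width are illustrative choices.
for (p in pts) {
  rect(p$xmin, p$ymin, p$xmax, p$ymax, border = "white", lwd = 0.5)
}
```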
We can achieve the same effect with ggplot2 if you
prefer to use that.
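As a hedged sketch of the ggplot2 route, using the package's `geom_parttree()` geom; the single-channel raster layer and aesthetic choices below are illustrative rather than the author's exact code:

```r
library(ggplot2)
library(parttree)

# Single-channel sketch: greyscale raster of the red channel's predictions,
# with that tree's partition outlines layered on top via geom_parttree().
# Reverse the y-axis so the image isn't drawn upside down.
ggplot(subset(rosalba_df, cc == 1), aes(x, y)) +
  geom_raster(aes(fill = value)) +
  scale_fill_gradient(low = "black", high = "white") +
  geom_parttree(data = trees[[1]], colour = "white", alpha = 0.2) +
  scale_y_reverse() +
  theme_void()
```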
Speaking of visualization, underneath the hood
plot.parttree calls the powerful tinyplot
package. All of the latter’s various customization arguments can be
passed on to our parttree plot to make it look a bit nicer.
For example:
Alongside the rpart
model objects that we have been working with thus far,
parttree also supports decision trees created by the partykit
package. Here we see how the latter’s ctree (conditional
inference tree) algorithm yields a slightly more sophisticated
partitioning than the former’s default.
Supported model classes

```r
ctree(species ~ flipper_length_mm + bill_length_mm, data = penguins) |>
  parttree() |>
  plot(pch = 19, palette = "classic", alpha = 0.5)
```
Quickstart

The parttree homepage includes an introductory vignette and detailed documentation. But here’s a quickstart example using the “kyphosis” dataset that comes bundled with the rpart package. In this case, we are interested in predicting kyphosis recovery after spinal surgery, as a function of 1) the number of topmost vertebrae that were operated on, and 2) patient age.
The key function is parttree(), which comes with its own plotting method.
```r
library(rpart)    # For the dataset and fitting decision trees
library(parttree) # This package

fit = rpart(Kyphosis ~ Start + Age, data = kyphosis)
```
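The plotting call that follows is cut off in this excerpt. Given that `parttree()` objects come with their own plot method (as noted above), the usage is presumably along these lines; treat it as a sketch rather than the README's exact code:

```r
# Convert the fitted rpart model to its partition representation and plot it
fit |>
  parttree() |>
  plot()
```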