I'm pretty sure my question doesn't totally make sense, but I'm trying to create a classification tree in R using 'rpart' and initially had the fit as something like:
fit <- rpart(success ~ A + B + C)
I have now realised that 'success' might also be measured by another value, so I was going to amend it to:
fit <- rpart(success + new_option ~ A + B + C)
But when I run these lines:
plot(fit, uniform=TRUE, main="Success plot")
text(fit, use.n = TRUE, all=TRUE, cex=.8)
post(fit, file = "tree.ps", title="Success plot")
I get this error:
Error in plot.rpart(fit, uniform = TRUE, main = "Success plot") :
fit is not a tree, just a root
So just wondering - is this even possible? Or do I need to tackle this in a totally different way?
This error means that rpart has not made any splits, so the fitted tree consists of only the root node. You can lower the cp (complexity parameter) to allow more splits and increase the complexity of the tree: the default is cp = 0.01, so you could try cp = 0.001. Note, though, that lowering cp too far may mean you are overfitting your model.
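For example, here is a minimal sketch of lowering cp (I'm assuming your data frame is called df and that you're fitting a classification tree; adjust to your own data):

library(rpart)

# Lower the complexity parameter so rpart is allowed to make splits
# that improve the fit by less than the default cp = 0.01.
fit <- rpart(success ~ A + B + C,
             data    = df,                         # df is a placeholder for your data frame
             method  = "class",                    # classification tree
             control = rpart.control(cp = 0.001))

# Check whether any splits were actually made before trying to plot
printcp(fit)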
P.S. You can only have one response variable, not var1 + var2. If you need to combine both, do it before you pass the data into your modeling function.
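For instance, a sketch of combining the two outcomes into one column first (the column names and the combination rule here are only placeholders, since it depends on how 'success' and your new value relate):

# Build a single response column before modeling, e.g. count it as a
# success only when both criteria are met -- change the rule as needed.
df$combined_success <- ifelse(df$success == 1 & df$new_option == 1, 1, 0)

fit <- rpart(combined_success ~ A + B + C, data = df, method = "class")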