Tags: r, machine-learning, r-caret, mlr

Recursive feature elimination with mlr


Is it possible to conduct recursive feature elimination (RFE) with mlr? I know this is possible with caret (here), but even though there is some documentation on feature selection with mlr, I did not find an equivalent to rfe.
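
For reference, a minimal caret rfe call looks roughly like this (a sketch only; the rfFuncs helper set, the fold count, the candidate sizes, and the Sonar data from mlbench are illustrative choices, not from the original question):

    library(caret)
    library(mlbench)
    data(Sonar)

    # Backward elimination with random-forest helper functions,
    # evaluating candidate subset sizes by 5-fold cross-validation
    rfe_ctrl <- rfeControl(functions = rfFuncs, method = "cv", number = 5)
    rfe_res  <- rfe(x = Sonar[, 1:60], y = Sonar$Class,
                    sizes = c(10, 20, 30, 40, 50),
                    rfeControl = rfe_ctrl)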


Solution

  • To perform recursive feature elimination in mlr you can use the function makeFeatSelControlSequential with the argument method = "sbs" (sequential backward selection). Here is an example using the lda learner:

    library(mlr)

    # Sequential backward selection: start from the full feature set and
    # drop features one at a time; beta is the minimal improvement in
    # performance required to accept a removal step.
    ctrl <- makeFeatSelControlSequential(method = "sbs",
                                         beta = 0.005)

    # 3-fold cross-validation to evaluate each candidate feature set
    rdesc <- makeResampleDesc("CV", iters = 3)

    sfeats <- selectFeatures(learner = "classif.lda",
                             task = sonar.task,
                             resampling = rdesc,
                             control = ctrl,
                             show.info = FALSE)
    
    
    FeatSel result:
    Features (57): V1, V2, V3, V4, V5, V6, V7, V8, V9, V11, V12, V13, V14, V15, V16, V17, V18, V19, V21, V22, V23, V24, V25, V26, V27, V28, V29, V30, V31, V32, V33, V34, V35, V36, V37, V38, V39, V40, V41, V42, V43, V44, V45, V46, V47, V48, V49, V50, V51, V52, V53, V54, V55, V56, V57, V58, V60
    mmce.test.mean=0.2066943
    

    Here, 57 of the 60 variables were selected.
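
    If you then want to fit a model on just the selected features, one option is to subset the task and train the learner again (a minimal sketch; this follow-up step is not part of the original answer, but subsetTask and train are standard mlr functions):

    library(mlr)

    # Restrict the task to the features chosen by selectFeatures()
    task_sel <- subsetTask(sonar.task, features = sfeats$x)

    # Refit the learner on the reduced feature set
    mod <- train("classif.lda", task_sel)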

    You can use:

    analyzeFeatSelResult(sfeats)
    

    to get hold of the selection path:

    # Output
    Path to optimum:
    - Features:   60  Init   :                       Perf = 0.26936  Diff: NA  *
    - Features:   59  Remove : V59                   Perf = 0.2403  Diff: 0.029055  *
    - Features:   58  Remove : V10                   Perf = 0.22588  Diff: 0.014424  *
    - Features:   57  Remove : V20                   Perf = 0.20669  Diff: 0.019186  *
    
    Stopped, because no improving feature was found.
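
    If you additionally want the backward selection to happen inside resampling (so that performance estimates account for the selection step), mlr also provides makeFeatSelWrapper. A minimal sketch, reusing ctrl and rdesc from above; the outer 3-fold CV is an illustrative choice:

    library(mlr)

    # Fuse the learner with sequential backward selection; the wrapped
    # learner re-runs the selection on its training data each time it is fit
    lrn <- makeFeatSelWrapper("classif.lda",
                              resampling = rdesc,   # inner resampling, from above
                              control = ctrl,       # "sbs" control, from above
                              show.info = FALSE)

    # Outer cross-validation of the whole pipeline (selection + lda)
    outer <- makeResampleDesc("CV", iters = 3)
    res <- resample(lrn, sonar.task, resampling = outer, show.info = FALSE)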