r, machine-learning, regression, glmnet, lasso-regression

Plotting lasso beta coefficients


I wrote this lasso code in R, and I got some beta values:

#Lasso
library(MASS)
library(glmnet)
Boston = na.omit(Boston)
x = model.matrix(crim ~ ., Boston)[, -1]
rownames(x) = c()
y = as.matrix(Boston$crim)
lasso.mod = glmnet(x, y, alpha = 1, lambda = 0.1)
beta = as.matrix(rep(0, 19))
beta = coef(lasso.mod)
matplot(beta, type = "l", lty = 1, xlab = "L1", ylab = "beta", main = "lasso")

I want to plot the betas in a plot like this:

[target plot: lasso coefficient paths, one line per coefficient, plotted against the L1 norm]

but I don't know which plotting function in R I can use to produce it.


Solution

  • I don't understand why you don't want to use the built-in glmnet plotting method, but you can certainly reproduce its results (here with ggplot2).
    You still need the model object to extract the lambda values.

    Edit: added coefficients vs. L1 norm

    Reproduce your minimal example

    library(MASS)
    library(glmnet)
    #> Loading required package: Matrix
    #> Loading required package: foreach
    #> Loaded glmnet 2.0-13
    library(ggplot2)
    library(reshape)
    #> 
    #> Attaching package: 'reshape'
    #> The following object is masked from 'package:Matrix':
    #> 
    #>     expand
    
    Boston = na.omit(Boston)
    x = model.matrix(crim ~ ., Boston)[, -1]
    y = as.matrix(Boston$crim)
    lasso.mod = glmnet(x, y, alpha = 1)  # fit the whole lambda path (no fixed lambda)
    beta = coef(lasso.mod)               # coefficient matrix: one column per lambda value
    
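
    A quick, optional check for orientation (nothing below depends on it): beta is a sparse matrix with one row per coefficient, including the intercept, and one column per lambda value on the automatically chosen path.

    class(beta)  # a sparse Matrix class ("dgCMatrix")
    dim(beta)    # rows = intercept + predictors, columns = length(lasso.mod$lambda)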

    Extract the coefficient values and reshape them into a long, tidy format suitable for ggplot

    tmp <- as.data.frame(as.matrix(beta))
    tmp$coef <- row.names(tmp)
    tmp <- reshape::melt(tmp, id = "coef")
    tmp$variable <- as.numeric(gsub("s", "", tmp$variable)) # column names are s0, s1, ...: strip the "s" to get the solution index
    tmp$lambda <- lasso.mod$lambda[tmp$variable+1] # extract the lambda values (+1 because the index is 0-based)
    tmp$norm <- apply(abs(beta[-1,]), 2, sum)[tmp$variable+1] # compute the L1 norm, excluding the intercept
    
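
    To sanity-check the reshaped data before plotting, you can peek at the first rows; each row is one coefficient at one point on the path, together with its lambda and the L1 norm of that solution (optional, nothing below depends on it):

    head(tmp)  # columns: coef, variable (0-based solution index), value, lambda, norm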

    Plot with ggplot: coefficients vs. lambda

    # x11(width = 13/2.54, height = 9/2.54)
    ggplot(tmp[tmp$coef != "(Intercept)",], aes(lambda, value, color = coef, linetype = coef)) + 
        geom_line() + 
        scale_x_log10() + 
        xlab("Lambda (log scale)") + 
        guides(color = guide_legend(title = ""), 
               linetype = guide_legend(title = "")) +
        theme_bw() + 
        theme(legend.key.width = unit(3,"lines"))
    

    The same with glmnet's built-in plot method:

    # x11(width = 9/2.54, height = 8/2.54)
    par(mfrow = c(1,1), mar = c(3.5,3.5,2,1), mgp = c(2, 0.6, 0), cex = 0.8, las = 1)
    plot(lasso.mod, "lambda", label = TRUE)
    

    Plot with ggplot: coefficients vs. L1 norm

    # x11(width = 13/2.54, height = 9/2.54)
    ggplot(tmp[tmp$coef != "(Intercept)",], aes(norm, value, color = coef, linetype = coef)) + 
        geom_line() + 
        xlab("L1 norm") + 
        guides(color = guide_legend(title = ""), 
               linetype = guide_legend(title = "")) +
        theme_bw() + 
        theme(legend.key.width = unit(3,"lines"))
    

    The same with glmnet's built-in plot method:

    # x11(width = 9/2.54, height = 8/2.54)
    par(mfrow = c(1,1), mar = c(3.5,3.5,2,1), mgp = c(2, 0.6, 0), cex = 0.8, las = 1)
    plot(lasso.mod, "norm", label = TRUE)
    

    Created on 2018-02-26 by the reprex package (v0.2.0).
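
    A possible extra (a sketch, not part of the reprex above): if you prefer the coefficient names written at the end of each path, like plot(lasso.mod, "norm", label = TRUE) does, you can drop the legend and place the labels with geom_text(). The subsetting and label placement below are assumptions, so adjust to taste.

    # label each path at its right-hand end (largest L1 norm) instead of using a legend;
    # labels of near-zero coefficients may overlap
    ends <- tmp[tmp$coef != "(Intercept)" & tmp$norm == max(tmp$norm), ]
    ggplot(tmp[tmp$coef != "(Intercept)",], aes(norm, value, color = coef)) + 
        geom_line(show.legend = FALSE) + 
        geom_text(data = ends, aes(label = coef), hjust = -0.2, size = 3, show.legend = FALSE) + 
        xlim(0, max(tmp$norm) * 1.1) +   # leave room on the right for the labels
        xlab("L1 norm") + 
        theme_bw()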