The Graphical Least Absolute Shrinkage and Selection Operator (Graphical LASSO) was introduced by Jerome Friedman, Trevor Hastie and Robert Tibshirani ("Sparse inverse covariance estimation with the graphical lasso", Biostatistics, 2008). They propose a block coordinate-descent algorithm to solve the problem (see "The Graphical Lasso: New Insights and Alternatives" by Rahul Mazumder and Trevor Hastie). I wrote this simple MATLAB code using CVX, given a regressor matrix X of size m-by-n:
S = cov(X,0);   % sample covariance of X
cvx_begin
    variable theta(n,n) semidefinite
    % lambda > 0 is the regularization parameter. Note norm(theta(:),1) is the
    % elementwise l1 norm used by the graphical lasso; norm(theta,1) on a
    % matrix would instead be the induced 1-norm (maximum column sum).
    minimize( trace(S*theta) - log_det(theta) + lambda*norm(theta(:),1) )
cvx_end
What is the difference between the block coordinate-descent solution and the CVX solution? Can CVX be configured to give exactly the same solution? The question refers to the Graphical LASSO algorithm, but it extends to other problems for which the authors propose a specific algorithm (e.g. ADMM) even though a solution can also be found with general-purpose optimization packages.
Provided both solvers converge, both implementations should return the same solution: the problem is a convex semidefinite program, and the log-det term makes the objective strictly convex, so the minimizer is unique. The difference you should observe is in runtime and memory usage.
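To make "same solution" concrete: any two correct solvers must agree on the optimal objective value, so it helps to have the objective written out explicitly. A minimal sketch in Python/NumPy (the function name `graphical_lasso_objective` is mine, not from any library):

```python
import numpy as np

def graphical_lasso_objective(theta, S, lam):
    """Penalized negative log-likelihood:
    tr(S @ theta) - log det(theta) + lam * sum_ij |theta_ij|."""
    sign, logdet = np.linalg.slogdet(theta)
    assert sign > 0, "theta must be positive definite"
    return np.trace(S @ theta) - logdet + lam * np.abs(theta).sum()

# Toy check: with S = I and lam = 0, the unpenalized optimum is
# theta = inv(S) = I, where tr(I) = 3 and log det I = 0.
S = np.eye(3)
print(graphical_lasso_objective(np.eye(3), S, lam=0.0))  # -> 3.0
```

Two solvers may return precision matrices that differ in the last few digits (different stopping rules), but their objective values should agree to within solver tolerance.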
For small, toy problems this is not a big deal (which is usually the case if you are an academic). If you are trying to do something useful in the real world, runtime and memory usage tend to be extremely important, as they control what size of problem you can tackle with your approach.
The only way to know the relative limitations of each approach is to implement and try both! At the very least, I would implement and run both approaches as a sanity check that both implementations are likely correct (the chance of two independent implementations both being wrong yet reporting the same results across a range of inputs is very low).
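This sanity check is easy to script. A sketch in Python, assuming scikit-learn is available: its `graphical_lasso` function implements the Friedman et al. coordinate-descent algorithm and also offers a LARS-based mode, so the two modes play the roles of "specialized algorithm" and "alternative solver" here. The specific data and `lam` value are made up for illustration.

```python
import numpy as np
from sklearn.covariance import graphical_lasso

# Synthetic data: 500 samples, 4 variables
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 4))
S = np.cov(X, rowvar=False, bias=True)  # MLE sample covariance

lam = 0.1
# Solve the same convex program with two different algorithms
_, theta_cd = graphical_lasso(S, alpha=lam, mode="cd", tol=1e-8, max_iter=500)
_, theta_lars = graphical_lasso(S, alpha=lam, mode="lars", tol=1e-8, max_iter=500)

# The estimated precision matrices should agree up to solver tolerance
print(np.max(np.abs(theta_cd - theta_lars)))
```

If the two solutions disagree beyond tolerance on some input, at least one implementation (or one formulation of the penalty) is wrong, which is exactly the kind of bug this check is designed to catch.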