Tags: julia, linear-algebra, convex-optimization, convex

Julia error using Convex package with diagind function


I'm trying to solve the problem

d = 0.5 * ||X - \Sigma||_F + 0.01 * ||XX||_1,

where X is a symmetric positive definite matrix whose diagonal elements should all be 1, and XX is the same as X except that its diagonal is 0. \Sigma is known; I want to minimize d over X.

My code is as follows:

using Convex
m = 5;
A = randn(m, m); 
x = Semidefinite(5);
xx=x;
xx[diagind(xx)].=0;
obj=vecnorm(A-x,2)+sumabs(xx)*0.01;
pro= minimize(obj, [x >= 0]);
pro.constraints+=[x[diagind(x)].=1];
solve!(pro)

MethodError: no method matching diagind(::Convex.Variable)

I am just trying to solve the optimization problem by constraining the diagonal elements of the matrix, but it seems the diagind function does not work here. How can I solve this problem?


Solution

  • I think the following does what you want (the vecnorm of the original answer was removed in Julia 1.0, so norm(vec(...)) is used instead; SCS is just one example of an SDP-capable solver):

    using Convex, LinearAlgebra
    using SCS

    m = 5
    Σ = randn(m, m)
    X = Semidefinite(m)
    XX = X - diagm(diag(X))  # X with its diagonal zeroed out
    obj = 0.5 * norm(vec(X - Σ), 2) + 0.01 * sum(abs(XX))
    # Semidefinite already enforces PSD; X >= 0 additionally
    # constrains the entries elementwise, as in your code
    constraints = [X >= 0, diag(X) == 1]
    pro = minimize(obj, constraints)
    solve!(pro, SCS.Optimizer)
    

    Regarding the individual operations: to get XX, i.e. X with a zero diagonal, we subtract the diagonal part of X from X itself. And to constrain the diagonal of X to be 1, we compare its diagonal with 1 using ==.

    In general, it is a good idea to treat values as immutable wherever possible, instead of trying to modify them in place; I don't know whether Convex even supports that.
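    The zero-diagonal construction can be checked on a plain numeric matrix, without Convex at all; this is just an illustrative sketch using diag and diagm from the standard-library LinearAlgebra module:

    ```julia
    using LinearAlgebra

    A = [1.0 2.0; 3.0 4.0]
    # Subtract the diagonal part of A from A, leaving zeros on the diagonal.
    AA = A - diagm(diag(A))
    # AA == [0.0 2.0; 3.0 0.0]
    ```

    Convex.jl overloads diag and diagm for its expression types, which is why the same construction works on a Semidefinite variable.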