I am using Optim.jl in Julia. This page provides some tips for writing code.
To avoid repeating computation, I want to optimize a cost function while also providing its gradient. If there is no constant parameter in the cost function, the code below works.
using LinearAlgebra
using Optim

function cost!(F, G, x)
    if G !== nothing
        G .= x               # gradient of dot(x, x) / 2
    end
    if F !== nothing
        return dot(x, x) / 2 # objective value
    end
end

sol = optimize(Optim.only_fg!(cost!), randn(5), LBFGS())
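(As a quick sanity check of the result, using the standard Optim accessors; the minimizer of dot(x, x) / 2 should be close to the zero vector.)

Optim.minimizer(sol)  # should be approximately zeros(5)
Optim.minimum(sol)    # should be approximately 0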
But what if the cost function has a constant parameter C?
using LinearAlgebra
using Optim

function cost!(F, G, x, C)
    if G !== nothing
        G .= C * x               # gradient of C * dot(x, x) / 2
    end
    if F !== nothing
        return C * dot(x, x) / 2 # objective value
    end
end

# This call no longer works: only_fg! will call cost!(F, G, x),
# but cost! now requires the extra argument C.
sol = optimize(Optim.only_fg!(cost!), randn(5), LBFGS())
I think I have to combine this with the tip from that page that writes res = optimize(b -> sqerror(b, x, y), [0.0, 0.0]), but I do not know how to do that.
I figured it out. What I wanted to do is probably the following:
C = 1
sol = optimize(Optim.only_fg!((F, G, x) -> cost!(F, G, x, C)), randn(5), LBFGS())
My confusion was about how the arguments of the anonymous function work.
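For anyone with the same confusion: Optim.only_fg! expects a function of exactly three arguments (F, G, x). The anonymous function (F, G, x) -> cost!(F, G, x, C) has that signature, and C is not one of its arguments but comes from the enclosing scope (a closure). A small illustration with throwaway values, just to show the capture (not part of the optimization):

C = 2.0
# f takes only (F, G, x); C comes from the enclosing scope via the closure
f = (F, G, x) -> cost!(F, G, x, C)
f(true, nothing, [1.0, 2.0])  # returns C * dot(x, x) / 2 == 5.0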