Tags: r, non-linear-regression, singular

R: investigating singular gradient in non-linear regression


library(nls2)
# Investigate singular gradient.
# Note that this cannot be done with nls since the singular gradient at
# the initial conditions would stop it with an error.
DF1 <- data.frame(y=1:9, one=rep(1,9))
xx <- nls2(y~(a+2*b)*one, DF1, start = c(a=1, b=1), algorithm = "brute-force")
svd(xx$m$Rmat())[-2]

I am playing around with the nls2 package, which computes non-linear least squares estimates for non-linear regression. In the documentation, one of the examples (shown above) notes that it investigates a singular gradient. I see that xx is an nls object with no parameter estimates. Does this mean the algorithm did not converge? Why is this? And what exactly is svd(xx$m$Rmat())[-2] doing?


Solution

  • There is no notion of convergence for brute-force. It simply evaluates the objective function at the starting value or values given and returns an nls object or objects depending on the particular arguments. See ?nls2 for details.

    Normally it is used for getting starting values to feed into nls or another optimization function, for investigating singular gradients (nls stops with an error on them but nls2 does not), or simply for evaluating the nls objective function at known values. A sketch of the starting-value workflow appears after the coef output below.

    Since the example from the documentation reproduced in the question gave nls2 one starting value, it evaluated the objective at that single value and returned. The parameter estimates are just the parameter values at which the objective was evaluated, namely the starting values:

    > coef(xx)
    a b 
    1 1 
    
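    Here is a sketch of that more typical starting-value workflow. The exponential model, data, and grid below are made up for illustration (they are not from the nls2 documentation): brute-force picks the best of the candidate starts, and coef of the result feeds nls.

    library(nls2)
    set.seed(1)
    DF2 <- data.frame(x = 1:9)
    DF2$y <- 2 * exp(0.3 * DF2$x) + rnorm(9, sd = 0.1)
    # grid of candidate starting values; brute-force evaluates each row
    grid <- expand.grid(a = seq(0.5, 5, length.out = 10),
                        b = seq(0.1, 1, length.out = 10))
    st <- nls2(y ~ a * exp(b * x), DF2, start = grid, algorithm = "brute-force")
    # refine with nls, which now has workable starting values
    fit <- nls(y ~ a * exp(b * x), DF2, start = coef(st))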

    xx$m$Rmat() is a matrix such that its vector of singular values contains at least one zero if the problem is singular at the point evaluated. The R function svd(...) returns a list whose component d is the vector of singular values; the next two components are u and v, with the columns of v being the eigenvectors (right singular vectors). We are not interested in u here, so [-2] was used to omit it.

    For this particular case, we see that the second singular value is zero:

    s <- svd(xx$m$Rmat())
    s$d
    ## [1] 6.708204 0.000000
    

    and it corresponds to the eigenvector

    v2 <- s$v[, 2]; v2
    ## [1] -0.8944272  0.4472136
    

    and since eigenvectors are only determined up to a scalar multiple, that is the same as:

    v2/v2[2]
    ## [1] -2  1
    

    which is the direction of singularity at the current point of evaluation. In this case, adding any multiple of (-2, 1) to (1, 1) gives a right-hand side identical in value to the right-hand side at (1, 1), so it is clearly singular in that direction. This case is simpler than the general one due to the linearity of the right-hand side, but it works analogously, relative to the tangent space (i.e. infinitesimally), for nonlinear objective functions.
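
    As a quick check of that claim (a sketch using DF1 from the question), moving from (1, 1) along the singular direction (-2, 1) leaves the right-hand side unchanged:

    rhs <- function(a, b) with(DF1, (a + 2*b) * one)
    # shift (1, 1) by t * (-2, 1) with, say, t = 3: (a, b) = (-5, 4)
    all.equal(rhs(1, 1), rhs(1 - 2*3, 1 + 3))
    ## [1] TRUE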