I am not an expert in non-linear regression, and I am struggling to make it work. I would like to fit the polynomial f(x) = ax^3 + bx^2 + cx + d with the following constraint on the parameters:
- f must be increasing, i.e. its derivative f'(x) = 3ax^2 + 2bx + c must be positive for all x, which amounts to requiring b^2 - 3ac < 0 and a >= 0.
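As a sanity check (toy coefficients of my own choosing, not the data below), the condition can be verified numerically:

```r
# Hypothetical coefficients with a > 0 and b^2 - 3ac < 0
a <- 0.4; b <- -12; c <- 121
b^2 - 3*a*c                      # -1.2, so the constraint holds

# The derivative of ax^3 + bx^2 + cx + d is 3ax^2 + 2bx + c
xs <- seq(-50, 50, by = 0.01)
all(3*a*xs^2 + 2*b*xs + c > 0)   # TRUE: the cubic is increasing
```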
To do so, I have implemented several functions:

- the `polynomial` function implements this polynomial
- the `linear_model` function returns the sum of squared residuals
- the `grad_linear_model` function returns the gradient with respect to each parameter of the `par` vector
- the `constraints` function implements the constraint above (I've added a `1e-10` term in the constraint to make it strictly negative)
- the `jacobian` function implements the gradient of the constraint above, with respect to each element of the `par` vector
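To debug such gradients, one can compare the analytical values against finite differences; `num_grad` below is a hypothetical helper of my own in base R, not part of `nloptr`:

```r
# Central finite-difference approximation of the gradient of f at par
num_grad <- function(f, par, eps = 1e-6){
  sapply(seq_along(par), function(i){
    e <- replace(numeric(length(par)), i, eps)
    (f(par + e) - f(par - e)) / (2 * eps)
  })
}

# Check on a function with a known gradient: d/dp sum(p^2) = 2p
num_grad(function(p) sum(p^2), c(1, -2, 3, 0.5))  # ~ c(2, -4, 6, 1)
```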
However, depending on the optimization method, I get different error codes/solver statuses. With the method below, I get solver status 4, and my parameters have barely changed from their starting values. With some other methods, I get a "Code failure" status. I can't figure out what is wrong in my code.
Thanks for your help.
library(nloptr)
x <- seq(0, 20, 0.1)
y <- 500 + 0.4 * (x-10)^3 + rnorm(length(x), 10, 80)
fit_polyreg <- function(x, y){

  polynomial <- function(par, x){
    return(par[1]*x^3 + par[2]*x^2 + par[3]*x + par[4])
  }

  linear_model <- function(par, x, y){
    x <- as.numeric(x)
    return(sum((y - polynomial(par, x))^2))
  }

  grad_linear_model <- function(par, x, y){
    x <- as.numeric(x)
    return(c(sum((y - polynomial(par, x))*x^3),
             sum((y - polynomial(par, x))*x^2),
             sum((y - polynomial(par, x))*x),
             sum((y - polynomial(par, x)))))
  }

  constraints <- function(par, x, y){
    return(par[2]^2 - 3*par[1]*par[3] + 1e-10)
  }

  jacobian <- function(par, x, y){
    return(c(-3*par[3], 2*par[2], -3*par[1], 0))
  }

  pars <- nloptr(rep(0, 4),
                 function(par) linear_model(par, x, y),
                 function(par) grad_linear_model(par, x, y),
                 lb = c(0, -Inf, -Inf, -Inf),
                 eval_g_ineq = function(par) constraints(par, x, y),
                 eval_jac_g_ineq = function(par) jacobian(par, x, y),
                 opts = list("algorithm" = "NLOPT_LD_MMA",
                             "xtol_abs" = 1.0e-10,
                             "maxeval" = 500))

  return(pars)
}
Note that the data are noisy because of the `rnorm` draw, so results vary between runs. In addition, if you fit the polynomial without the constraint, it returns a result that satisfies the constraint, which is surprising.
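For comparison, the unconstrained fit can be obtained with ordinary least squares; this is a sketch with an arbitrary fixed seed (the code above does not set one):

```r
set.seed(42)                     # hypothetical seed for reproducibility
x <- seq(0, 20, 0.1)
y <- 500 + 0.4 * (x - 10)^3 + rnorm(length(x), 10, 80)

# Unconstrained least squares on the polynomial basis
ols <- lm(y ~ I(x^3) + I(x^2) + x)
par_hat <- unname(coef(ols)[c(2, 3, 4, 1)])  # reorder to (a, b, c, d)

# Evaluate the constraint b^2 - 3ac on the unconstrained solution
par_hat[2]^2 - 3 * par_hat[1] * par_hat[3]
```

Note that the true coefficients here are a = 0.4, b = -12, c = 120, for which b^2 - 3ac = 144 - 144 = 0 exactly, so the unconstrained estimate lands very close to the constraint boundary.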