0

I am not an expert with non-linear regression, but I struggle to make it work. I would like to fit the polynomial f(x) = ax^3 + bx^2 + cx + d, with the following constraints on my parameters :

  • I want f to be increasing, meaning that the derivative of f with respect to x should be positive, that is, to have b^2 - 3ac < 0 and a >= 0.

To do so, I have implemented several functions :

  • the polynomial function implements this polynomial
  • the linear_model function returns the sum of squared residuals
  • the grad_linear_model function returns the gradient with respect to each of the parameters of the par vector
  • the constraint function implements the constraint above (I've added a 1e-10 term in the constraint to make it strictly negative)
  • the jacobian function implements the gradient of the constraint above, with respect to each of the par vector

However, depending on the optimization method I use, I get different error codes/solver status. On the method below, I get the solver status 4, and my parameters haven't slightly changed. If I use some other methods, I may get "Code failure" solver status. I can't figure out what is wrong in my code.

Thanks for your help.

library(nloptr)

x <- seq(0, 20, 0.1)
y <- 500 + 0.4 * (x-10)^3 + rnorm(length(x), 10, 80)

fit_polyreg <- function(x, y){
  
  polynomial <- function(par, x){
    return(par[1]*x^3 + par[2]*x^2 + par[3]*x + par[4])
  }
  
  linear_model <- function(par, x, y){
    x <- as.numeric(x)
    return(sum((y - polynomial(par, x))^2))
  }
  
  grad_linear_model <- function(par, x, y){
    x <- as.numeric(x)
    return(c(sum((y - polynomial(par, x))*x^3),
             sum((y - polynomial(par, x))*x^2),
             sum((y - polynomial(par, x))*x),
             sum((y - polynomial(par, x)))))
  }
  
  constraints <- function(par, x, y){
    return(par[2]^2 - 3*par[1]*par[3] + 1e-10)
  }
  
  jacobian <- function(par, x, y){
    return(c(-3*par[3], 2*par[2], -3*par[1], 0))
  }
  
  pars <- nloptr(rep(0, 4),
                 function(par) linear_model(par, x, y),
                 function(par) grad_linear_model(par, x, y),
                 lb = c(0, -Inf, -Inf, -Inf),
                 eval_g_ineq = function(par) constraints(par, x, y),
                 eval_jac_g_ineq = function(par) jacobian(par, x, y),
                 opts = list("algorithm" = "NLOPT_LD_MMA", "xtol_abs" = 1.0e-10, "maxeval" = 500))
  
  return(pars)
2
  • your errors do not have a mean of 0. Also have a very huge standard deviation. Plot the data and might end up not depicting the pattern Commented Apr 3, 2024 at 17:55
  • @Onyambu, actually the data is just for the reproducible example, but I get the same error with my own data. On the reproducible example, unless I'm wrong, the error term is 0 (if you take out the 10 of the rnorm function. In addition, if you fit the polynomial without the constraint, it returns a result that fits the constraint, which is surprising. Commented Apr 3, 2024 at 22:24

0

Your Answer

By clicking “Post Your Answer”, you agree to our terms of service and acknowledge you have read our privacy policy.

Start asking to get answers

Find the answer to your question by asking.

Ask question

Explore related questions

See similar questions with these tags.