
I have a special non-convex optimization problem:

$\min / \max \ f(x) + g(x) + h(x)$,

subject to $| g(x) - h(x)| < \varepsilon$,

where $f(x)$ is non-convex, but both $g(x)$ and $h(x)$ are convex.

Is there any algorithm I can implement to find a local solution? I know there may be no global optimum, so a local optimum is enough.

If possible, can I have some pseudo code?

I have tried the Lagrange multiplier approach,

$L(x, \lambda) = f(x) + g(x) + h(x) + \lambda\big(|g(x) - h(x)| - \varepsilon\big)$,

or, splitting the absolute value to get a smooth formulation,

$L(x, \lambda_1, \lambda_2) = f(x) + g(x) + h(x) + \lambda_1\big(g(x) - h(x) - \varepsilon\big) + \lambda_2\big(h(x) - g(x) - \varepsilon\big)$.

Then, in some convex cases, we can find an analytical solution for $\lambda$ from the gradient (stationarity) conditions.
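Explicitly, what I mean by "according to the gradients" is the KKT system of the smooth Lagrangian above (written here for the minimization case):

$\nabla f(x) + \nabla g(x) + \nabla h(x) + (\lambda_1 - \lambda_2)\big(\nabla g(x) - \nabla h(x)\big) = 0$,

with $\lambda_1, \lambda_2 \ge 0$ and complementary slackness $\lambda_1\big(g(x) - h(x) - \varepsilon\big) = 0$, $\lambda_2\big(h(x) - g(x) - \varepsilon\big) = 0$.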

So how can I obtain $\lambda$ and solve the problem numerically when there is no closed-form solution?
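To make the question concrete, here is a rough sketch of the kind of iteration I have in mind: an augmented-Lagrangian (method of multipliers) loop that alternates an unconstrained local minimization in $x$ with an update of $\lambda_1, \lambda_2$. The $f$, $g$, $h$ and all constants below are toy placeholders, not my actual ELBO terms.

```python
# Sketch: augmented-Lagrangian loop for  min f(x)+g(x)+h(x)  s.t. |g(x)-h(x)| <= eps.
# f, g, h are TOY placeholders, only to make the scheme concrete.
import numpy as np
from scipy.optimize import minimize

def f(x):  # toy non-convex term
    return np.sin(x[0]) + 0.1 * np.sum(x ** 2)

def g(x):  # toy convex term
    return np.sum((x - 1.0) ** 2)

def h(x):  # toy convex term
    return np.sum((x + 1.0) ** 2)

eps = 0.5

def constraints(x):
    # Rewrite |g(x) - h(x)| <= eps as two inequalities c_i(x) <= 0.
    d = g(x) - h(x)
    return np.array([d - eps, -d - eps])

def augmented_lagrangian(x, lam, rho):
    # Augmented Lagrangian for inequality constraints c_i(x) <= 0.
    c = constraints(x)
    penalty = np.sum(np.maximum(0.0, lam + rho * c) ** 2 - lam ** 2) / (2.0 * rho)
    return f(x) + g(x) + h(x) + penalty

x = np.zeros(2)      # initial point
lam = np.zeros(2)    # multiplier estimates (lambda_1, lambda_2)
rho = 1.0            # penalty parameter

for it in range(50):
    # Inner step: local unconstrained minimization of the augmented Lagrangian in x.
    res = minimize(augmented_lagrangian, x, args=(lam, rho), method="L-BFGS-B")
    x = res.x
    c = constraints(x)
    # Outer step: multiplier update, projected onto lambda >= 0.
    lam = np.maximum(0.0, lam + rho * c)
    if np.all(c <= 1e-6):   # feasible -> roughly a local KKT point
        break
    rho *= 2.0              # tighten the penalty while still infeasible

print("x* =", x, "lambda* =", lam, "|g-h| =", abs(g(x) - h(x)))
```

Is something along these lines a reasonable way to obtain $\lambda$ numerically, or is there a better scheme for this structure?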

In fact, here $f$ is the sum of the entropy and the cross entropy of a Gaussian mixture distribution. $g$ is one part of the expected log likelihood (where the likelihood is Gaussian or logistic) and $h$ is the other part, so $g + h$ is the whole expected log likelihood. In other words, this is a constrained optimization of the ELBO (evidence lower bound) in variational inference.
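To spell out the structure (my notation, up to sign conventions depending on whether one minimizes or maximizes), the objective is the usual ELBO,

$\mathrm{ELBO}(q) = \underbrace{\mathbb{E}_{q(z)}[\log p(z)] - \mathbb{E}_{q(z)}[\log q(z)]}_{f} \;+\; \underbrace{\mathbb{E}_{q(z)}[\log p(\text{data} \mid z)]}_{g + h}$,

where $q$ is a Gaussian mixture (which is what makes the entropy/cross-entropy term $f$ non-convex), and the expected log likelihood splits into the two convex pieces $g$ and $h$.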

Comments:

  • Perhaps it would be useful to give $f$, $g$ and $h$ concretely; there might be more structure to be exploited. (May 1, 2018)
  • I added the explanation of $f$, $g$, $h$. It is a constrained optimization of the ELBO (evidence lower bound) in variational inference. (May 8, 2018)
