
I'm investigating a particular topic and I'd like to get some references on it.

The idea is as follows: pick a natural number $d$ and let $\mathcal{F}_d$ be a Gaussian process on $\mathbb{R}^d$ with mean $0$ and covariance function:

$K(x, x') = \exp\left(-\|x - x'\|^2 / 2 \right)$

Now consider the problem of maximizing a sample function $f \sim \mathcal{F}_d$ with gradient methods. I'll consider the following class of methods: fix a strictly lower triangular matrix $A$ of size $n \times n$ and define a random sequence $X_0, X_1, \dots, X_n$ by:

$X_0 = 0, \qquad X_k = A_{k,0}\,\nabla f(X_0) + \dots + A_{k,k-1}\,\nabla f(X_{k-1}) \quad \text{for } k = 1, 2, \dots, n$

so that $X_0, X_1, \dots, X_n$ are the evaluation points of the method. Note that $f(X_n)$ is a random variable whose distribution depends only on $d$ and $A$. Here is what I know: as $d \rightarrow \infty$, the variable $f(X_n)$ converges in distribution to a Dirac measure $\delta_c$, where $c$ is a function of the matrix $A$ alone.
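
For concreteness, here is a small Python sketch of this construction (not part of the original question). It approximates the GP sample with random Fourier features: since the kernel $\exp(-\|x - x'\|^2/2)$ has spectral density $N(0, I_d)$, a sample $f$ can be approximated by $\sqrt{2/m}\sum_i c_i \cos(w_i \cdot x + b_i)$ with $w_i \sim N(0, I_d)$, $b_i \sim U(0, 2\pi)$, $c_i \sim N(0, 1)$, and the approximation improves as $m \to \infty$. The helper names `sample_gp_and_grad` and `run_method` are made up for illustration, and the coefficients $A_{k,0}, \dots, A_{k,k-1}$ are assumed to sit in row $k-1$ of a 0-indexed NumPy array.

```python
import numpy as np

def sample_gp_and_grad(d, m=2000, rng=None):
    """Approximate sample f ~ GP(0, K) on R^d via m random Fourier features,
    together with its gradient, where K(x, x') = exp(-||x - x'||^2 / 2)."""
    rng = rng if rng is not None else np.random.default_rng()
    W = rng.standard_normal((m, d))             # frequencies w_i ~ N(0, I_d)
    b = rng.uniform(0.0, 2.0 * np.pi, size=m)   # phases b_i ~ U(0, 2*pi)
    c = rng.standard_normal(m)                  # feature weights c_i ~ N(0, 1)

    def f(x):
        return np.sqrt(2.0 / m) * c @ np.cos(W @ x + b)

    def grad_f(x):
        # d/dx of cos(w . x + b) is -sin(w . x + b) * w
        return -np.sqrt(2.0 / m) * (c * np.sin(W @ x + b)) @ W

    return f, grad_f

def run_method(A, d, m=2000, rng=None):
    """Run X_0 = 0, X_k = sum_{j < k} A[k-1, j] * grad f(X_j), and return f(X_n)."""
    n = A.shape[0]
    f, grad_f = sample_gp_and_grad(d, m=m, rng=rng)
    pts = [np.zeros(d)]                          # X_0 = 0
    grads = []
    for k in range(1, n + 1):
        grads.append(grad_f(pts[-1]))
        pts.append(sum(A[k - 1, j] * grads[j] for j in range(k)))
    return f(pts[-1])

# Example: A = step * (lower-triangular ones) recovers plain gradient ascent,
# i.e. X_k = X_{k-1} + step * grad f(X_{k-1}).
rng = np.random.default_rng(0)
n = 5
A = 0.1 * np.tril(np.ones((n, n)))
for d in (10, 100, 1000):
    vals = [run_method(A, d, rng=rng) for _ in range(50)]
    print(d, np.mean(vals), np.std(vals))
```

If the concentration claim above holds, the printed standard deviation of $f(X_n)$ should shrink as $d$ grows (for fixed $A$), which is one quick way to sanity-check the limit numerically.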


Has such a setup been considered anywhere before? I suppose it's rather unusual to take $d$ to infinity; nonetheless, I'd be grateful for any references. Thank you!
