
I noticed an interesting numerical phenomenon: \begin{equation} \dfrac{1}{n}\sum_{k=0}^{n-1} w_k \alpha_n^{km} \rightarrow 0 \text{ uniformly for all $m\in \{1, \dots, n-1\}$ as }n\rightarrow \infty, \quad w_k\overset{\mathrm{iid}}{\sim}\mathrm{Exp}(1), \end{equation} where $\alpha_n = e^{2\pi i/n}$.

The convergence presumably holds in probability, or maybe even almost surely. Ideally I would like to know rates; at the very least I would like to understand how to give reasonable bounds.

Below I attach simulations for $n=100$ and $n=1000$, sampling $N=10000$ times. From these data I look at the distribution of the maximum over $m$ of the absolute value of the sum. The plots clearly show a simultaneous contraction for all $m$.
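A minimal simulation sketch along these lines (Python/NumPy; a sketch, not necessarily the exact code behind the plots):

```python
import numpy as np

rng = np.random.default_rng(0)

def max_abs_sum(n, N, rng):
    """For N independent draws of (w_0, ..., w_{n-1}) iid Exp(1), return
    max over m = 1..n-1 of |(1/n) * sum_k w_k * alpha_n^{k m}| for each draw."""
    w = rng.exponential(1.0, size=(N, n))      # N x n matrix of Exp(1) weights
    dft = np.fft.fft(w, axis=1) / n            # np.fft.fft uses e^{-2*pi*i*k*m/n}; |.| is unaffected by the sign
    return np.abs(dft[:, 1:]).max(axis=1)      # drop m = 0, maximize over m = 1, ..., n-1

for n in (100, 1000):
    samples = max_abs_sum(n, N=10000, rng=rng)
    print(n, samples.mean(), np.quantile(samples, [0.5, 0.99]))
```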

[Plots: empirical distributions of the maximum over $m$ of the absolute value of the sum, for $n=100$ and $n=1000$.]

  • I have just realized that if one considers the real or imaginary part, then one gets a sum of independent random variables. It is easy to see that the variance of the real/imaginary part of the sum equals $(2n)^{-1}$ (see the worked check after these comments). This is for fixed $m$; I wonder whether it is difficult to obtain a joint bound. Commented Jan 13 at 9:18
  • What are your plots, exactly -- formally described, as, say, (possibly pseudo-random) sets? Commented Jan 13 at 12:48
  • The plots show the distribution of the maximum of the absolute value of the above sum, where the maximum is taken over $m$. I also tried scaling with $n^{-1/2}$ instead of $n^{-1}$, and it looked like 'almost the correct rate' of contraction. Commented Jan 13 at 14:10
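For reference, the $(2n)^{-1}$ variance claimed in the first comment can be checked directly for $m\notin\{0,n/2\}$, where $\sum_{k=0}^{n-1}\cos\frac{4\pi km}n=0$ and hence $\sum_{k=0}^{n-1}\cos^2\frac{2\pi km}n=\frac n2$: since $\operatorname{Var}(w_k)=1$ for $w_k\sim\mathrm{Exp}(1)$,
$$\operatorname{Var}\Big(\frac1n\sum_{k=0}^{n-1}w_k\cos\frac{2\pi km}n\Big)=\frac1{n^2}\sum_{k=0}^{n-1}\cos^2\frac{2\pi km}n=\frac1{n^2}\cdot\frac n2=\frac1{2n},$$
and likewise for the imaginary part with $\cos$ replaced by $\sin$.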

1 Answer


$\newcommand\ep\varepsilon$Yes, these random variables (r.v.'s) converge to $0$ uniformly in $m\in[n-1]:=\{1,\dots,n-1\}$ almost surely (a.s.).

Indeed, the r.v. in question is a special case of $C_{n,m}+i S_{n,m}$, where $$C_{n,m}:=\frac1n\,\sum_{k=0}^{n-1}X_k\cos\frac{2\pi km}n,\quad S_{n,m}:=\frac1n\,\sum_{k=0}^{n-1}X_k\sin\frac{2\pi km}n,$$ with $X_0,X_1,\dots$ i.i.d. r.v.'s with a finite absolute fifth moment. Let also $Y_k:=X_k-EX_k$. Then $$C_{n,m}=\frac1n\,\sum_{k=0}^{n-1}Y_k\cos\frac{2\pi km}n,$$ because $\sum_{k=0}^{n-1}\cos\frac{2\pi km}n=0$ for $m\in[n-1]$. So, $$EC_{n,m}^2=\frac{EY_1^2}{n^2}\,\sum_{k=0}^{n-1}\cos^2\frac{2\pi km}n=O(1/n);$$ the constants in $O(\cdot)$ here and in what follows depend only on $E|Y_1|^5$.
So, by Rosenthal's inequality (see e.g. this), $$E|C_{n,m}|^5=O(1/n^{5/2}).$$ So, by the union and Markov inequalities, for all real $\ep>0$, $$P(\max_{m\in[n-1]}|C_{n,m}|>\ep) \le\sum_{m\in[n-1]} P(|C_{n,m}|>\ep) \le\sum_{m\in[n-1]} \frac{E|C_{n,m}|^5}{\ep^5} =O\Big(\frac{n^{-3/2}}{\ep^5}\Big).$$ So, by the Borel--Cantelli lemma, a.s. the events $\{\max_{m\in[n-1]}|C_{n,m}|>\ep\}$ occur only for finitely many $n$. That is, a.s. $$\max_{m\in[n-1]}|C_{n,m}|\to0$$ (as $n\to\infty$).

Quite similarly, a.s. $$\max_{m\in[n-1]}|S_{n,m}|\to0.$$

So, a.s. $$\max_{m\in[n-1]}|C_{n,m}+i S_{n,m}|\to0.\quad\Box$$
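A quick Monte Carlo illustration of the fifth-moment bound $E|C_{n,m}|^5=O(n^{-5/2})$ used above (a sketch with $X_k\sim\mathrm{Exp}(1)$ and $m=1$; the rescaled column should stay roughly bounded, and no sharp constant is claimed):

```python
import numpy as np

rng = np.random.default_rng(1)

def fifth_moment_estimate(n, m, N, rng):
    """Monte Carlo estimate of E|C_{n,m}|^5 for X_k ~ Exp(1)."""
    k = np.arange(n)
    x = rng.exponential(1.0, size=(N, n))                       # N draws of (X_0, ..., X_{n-1})
    c = (x * np.cos(2 * np.pi * k * m / n)).sum(axis=1) / n     # C_{n,m}; centering is irrelevant since sum_k cos = 0
    return np.mean(np.abs(c) ** 5)

for n in (100, 400, 1600):
    est = fifth_moment_estimate(n, m=1, N=20000, rng=rng)
    print(n, est, est * n ** 2.5)   # last column stays roughly constant if E|C_{n,m}|^5 scales like n^{-5/2}
```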

Remark: Actually, it would be enough to assume that $E|X_1|^p<\infty$ for some real $p>4$.

  • Great, thanks! I had never heard of Rosenthal's inequality before. Could you maybe give a link to a book where I could find it? I like your proof, as it also gives an upper bound on the rate of convergence to zero, though the exact rate seems more difficult to get. Commented Jan 13 at 14:08
  • @FedorGoncharov : The best link on the state of the art concerning Rosenthal's inequality for sums of independent zero-mean real-valued random variables that I can give is the second link given in this previous answer; see also references there. Strangely enough, I don't know a book reference to it. Commented Jan 13 at 14:17
