It is known that if $x_1, x_2, \ldots, x_n$ are distinct positive real numbers, then the matrix
$$ \begin{pmatrix} x_1^{a_1} & x_1^{a_2} & \cdots & x_1^{a_n} \\ x_2^{a_1} & x_2^{a_2} & \cdots & x_2^{a_n} \\ x_3^{a_1} & x_3^{a_2} & \cdots & x_3^{a_n} \\ \vdots & \vdots & \ddots & \vdots \\ x_n^{a_1} & x_n^{a_2} & \cdots & x_n^{a_n} \end{pmatrix}$$
has rank $n$, where $a_1, a_2, \ldots, a_n$ are distinct natural numbers (zero included). (Apart from the proof in the link, I think it is a consequence of Descartes' rule of signs.)
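As a quick numerical sanity check of this fact (a sketch assuming `numpy`; the sizes and ranges below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 5
# Distinct positive x's (almost surely distinct for a continuous draw)
# and distinct natural exponents (zero allowed).
xs = rng.uniform(0.5, 3.0, size=n)
exps = rng.choice(np.arange(10), size=n, replace=False)

# M[i, j] = xs[i] ** exps[j]: a generalized Vandermonde matrix.
M = xs[:, None] ** exps[None, :]
print(np.linalg.matrix_rank(M))  # full rank: n
```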
Therefore, assuming $x_1, x_2, \ldots, x_n$ are distinct positive real numbers, if we divide the columns of the matrix $$\begin{pmatrix} 1 & x_1 & x_1^2 & \cdots & x_1^{2n-2} \\ 1 & x_2 & x_2^2 & \cdots & x_2^{2n-2} \\ 1 & x_3 & x_3^2 & \cdots & x_3^{2n-2} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & x_n & x_n^2 & \cdots & x_n^{2n-2} \end{pmatrix}$$ into two subsets, then the vectors (or elements) of one of the subsets span the whole $\mathbb R^n$: the matrix has $2n-1$ columns, so one of the two subsets contains at least $n$ of them, and these are independent by the fact above.
Now let's assume the nonzero numbers $x_1, x_2, \ldots, x_n$ are not necessarily positive, but their absolute values are all distinct. If we divide the columns of the matrix $$\begin{pmatrix} 1 & x_1 & x_1^2 & \cdots & x_1^{4n-4} \\ 1 & x_2 & x_2^2 & \cdots & x_2^{4n-4} \\ 1 & x_3 & x_3^2 & \cdots & x_3^{4n-4} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & x_n & x_n^2 & \cdots & x_n^{4n-4} \end{pmatrix}$$ into two subsets, then the vectors of one of the subsets span the whole $\mathbb R^n$: among the exponents $0, 1, \ldots, 4n-4$ there are $2n-1$ even ones, so one of the two subsets has at least $n$ vectors with even powers, and these are independent (just compare with the initial fact, applied to the distinct positive numbers $x_i^2$).
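The reduction to the positive case can be checked numerically: the even-power columns $x_i^{2k} = (x_i^2)^k$ form a generalized Vandermonde matrix in the distinct positive numbers $x_i^2$. A small sketch, assuming `numpy` and arbitrary sample values:

```python
import numpy as np

# Nonzero x's with distinct absolute values but mixed signs
# (arbitrary sample values).
xs = np.array([1.3, -1.7, 2.1, -0.6])
n = len(xs)

# Among the exponents 0, 1, ..., 4n-4 there are 2n-1 even ones; any n
# of them give columns x_i^{2k} = (x_i^2)^k, a generalized Vandermonde
# matrix in the distinct positive numbers x_i^2.
even_exps = np.arange(0, 4 * n - 3, 2)
M = xs[:, None] ** even_exps[:n][None, :]
print(np.linalg.matrix_rank(M))  # full rank: n
```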
However, when working through some examples, it seems that having $4n-3$ vectors (or columns) is not sharp. For example, when $n=2$, $2n=4$ vectors do the job, and I think $2n=6$ vectors suffice when $n=3$ as well: I investigated most of the possible cases for $n=3$ and couldn't find any counterexamples. So my goal is to figure out whether the following statement is true:
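The $n=3$ case with $2n=6$ columns can also be searched by brute force. The following sketch (assuming `numpy`, with numerical rank standing in for exact rank) checks every partition of the six columns for random magnitudes and sign patterns:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)

def conjecture_holds(xs):
    """Check that for every partition of the columns 1, x, ..., x^(2n-1)
    into two subsets, at least one subset has (numerical) rank n."""
    xs = np.asarray(xs, dtype=float)
    n = len(xs)
    V = xs[:, None] ** np.arange(2 * n)[None, :]
    cols = range(2 * n)
    for r in range(2 * n + 1):
        for S in combinations(cols, r):
            T = [c for c in cols if c not in S]
            rank_S = np.linalg.matrix_rank(V[:, list(S)]) if S else 0
            rank_T = np.linalg.matrix_rank(V[:, T]) if T else 0
            if rank_S < n and rank_T < n:
                return False
    return True

# Random nonzero x's with distinct absolute values and random signs.
for _ in range(20):
    mags = np.sort(rng.uniform(0.3, 2.0, size=3)) + np.array([0.0, 0.1, 0.2])
    signs = rng.choice([-1.0, 1.0], size=3)
    assert conjecture_holds(signs * mags)
print("no counterexample found in these trials")
```

Random trials like this can of course only fail to find a counterexample; they say nothing about degenerate configurations such as the reciprocal-sum coincidences discussed below.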
Assume $x_1, x_2, \ldots, x_n$ are nonzero real numbers with distinct absolute values. If we divide the columns of the matrix $$\begin{pmatrix} 1 & x_1 & x_1^2 & \cdots & x_1^{2n-1} \\ 1 & x_2 & x_2^2 & \cdots & x_2^{2n-1} \\ 1 & x_3 & x_3^2 & \cdots & x_3^{2n-1} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & x_n & x_n^2 & \cdots & x_n^{2n-1} \end{pmatrix}$$ into two subsets, at least one of them spans $\mathbb R^n.$
If this statement is not true, any sort of counterexample would be a great help.
I know that this problem can be transformed into one involving polynomials and their real roots. For example, when $n=3$, assume that $\{1, x^2, x^3\}$ and $\{x, x^4, x^5\}$ are our subsets (we can identify the vectors with their powers). If $\{1, x^2, x^3\}$ is not independent, there are $c_0, c_2, c_3$, not all zero, such that $x_1, x_2, x_3$ are roots of $c_0 + c_2X^2 + c_3X^3$; since the coefficient of $X$ vanishes, $x_1x_2 + x_1x_3 + x_2x_3 = 0$, and dividing by $x_1x_2x_3$ gives $\frac{1}{x_1}+\frac{1}{x_2}+\frac{1}{x_3}=0$. On the other hand, if $\{x, x^4, x^5\}$ is not independent, there are $d_1, d_4, d_5$ such that $x_1, x_2, x_3$ are roots of $d_1X + d_4X^4 + d_5X^5 = X(d_1 + d_4X^3 + d_5X^4)$. The quartic $d_1 + d_4X^3 + d_5X^4$ has real coefficients and three distinct real roots (so in particular $d_5 \neq 0$), hence its fourth root, say $x_4$, is also real; moreover $x_4 \neq 0$, since $d_1 = 0$ would force the polynomial $X^3(d_4 + d_5X)$ to have three distinct nonzero roots, which is impossible. Comparing the coefficient of $X$ again gives $\frac{1}{x_1}+\frac{1}{x_2}+\frac{1}{x_3}+\frac{1}{x_4}=0$, so $\frac{1}{x_4}=0$, a contradiction. Similarly, we can rule out the other cases.
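The reciprocal-sum criterion above can be illustrated numerically: choosing $x_1 = 1$, $x_2 = 2$, $x_3 = -2/3$ makes $\frac{1}{x_1}+\frac{1}{x_2}+\frac{1}{x_3}=0$, so $\{1, x^2, x^3\}$ becomes dependent, while the complementary subset still spans $\mathbb R^3$, consistent with the argument. A sketch assuming `numpy`:

```python
import numpy as np

# x's chosen so that 1/x1 + 1/x2 + 1/x3 = 0 (here 1 + 1/2 - 3/2 = 0),
# equivalently x1*x2 + x1*x3 + x2*x3 = 0.
xs = np.array([1.0, 2.0, -2.0 / 3.0])

V = xs[:, None] ** np.arange(6)[None, :]
dep = V[:, [0, 2, 3]]   # columns 1, x^2, x^3
comp = V[:, [1, 4, 5]]  # columns x, x^4, x^5

print(np.linalg.matrix_rank(dep))   # 2: this subset is dependent
print(np.linalg.matrix_rank(comp))  # 3: the complement spans R^3
```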
EDIT: I included the "Number Theory" tag since any proof or counterexample for integers would also be interesting.
