I came across an interesting problem in Post 1 and Post 2. Suppose we have $ M $ quadratic equations $$ \underline{x}^T A_i \underline{x} + \underline{b}^T \underline{x} = c \quad i = 1,...,M $$ with $ \underline{x} \in \mathbb{R}^N $ and $ N < M $ (this should be the most interesting case). The goal is to identify the conditions under which nonzero solutions exist. First, we can absorb the linear terms into augmented quadratic forms and work with those alone:

$$ \underline{\hat{x}}^T B_i \underline{\hat{x}} = c \quad i = 1,...,M \quad \underline{\hat{x}} = [1 \quad \underline{x}^T]^T \quad B_i = \begin{bmatrix} 0 & \underline{b}^T/2 \\ \underline{b}/2 & A_i \end{bmatrix} $$
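Just to fix the notation with the smallest case: for $ N = 1 $, a single equation $ a x^2 + b x = c $ (with scalar $ a $ and $ b $) becomes

$$ \underline{\hat{x}} = \begin{bmatrix} 1 \\ x \end{bmatrix}, \qquad B_1 = \begin{bmatrix} 0 & b/2 \\ b/2 & a \end{bmatrix}, \qquad \underline{\hat{x}}^T B_1 \underline{\hat{x}} = \frac{b}{2}\,x + x\left(\frac{b}{2} + a x\right) = a x^2 + b x, $$

so the constraint $ \underline{\hat{x}}^T B_1 \underline{\hat{x}} = c $ is exactly the original equation.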

I have doubts about what happens as $ c $ varies:

1) $c = 0$. Suppose there exists $ \underline{\hat{x}}^* \in \mathbb{R}^{N+1} $ that satisfies the $ M $ quadratic equations, namely $ {\underline{\hat{x}}^*}^T B_i \underline{\hat{x}}^* = 0, i = 1,...,M $. These can also be written as inner products $ \langle \underline{\hat{x}}^* , B_i \underline{\hat{x}}^* \rangle = 0 $, so $ \underline{\hat{x}}^* $ is orthogonal to the space spanned by the vectors $ B_i \underline{\hat{x}}^* $; let us call this space $ \mathcal{T} $. Suppose that among the vectors spanning $ \mathcal{T} $ we can find $ r $ linearly independent vectors $ B_j \underline{\hat{x}}^* $, with $ M \geq r $. If $ r = N+1 $, these $ r $ vectors span all of $ \mathbb{R}^{N+1} $. Since $ \underline{\hat{x}}^* \in \mathbb{R}^{N+1} $ is orthogonal to $ \mathcal{T} $, we conclude that $ \underline{\hat{x}}^* = \underline{0} $ is the only possibility (I spell this step out in the first block of formulas after point 3). Regrettably, I cannot understand what this implies about the properties of the matrices $ B_j $ (positive definiteness, positive semidefiniteness, ...). Namely, what properties must the matrices $ B_j $ have so that the vectors $ B_j \underline{\hat{x}} $ are linearly independent? In the posts I cited it is said that the matrices $ B_j $ must be invertible, but I cannot see how invertibility alone guarantees that the vectors $ B_j \underline{\hat{x}} $ are linearly independent.

2) Again $ c = 0 $, but this time with $ r < N + 1 $. In this case the space $ \mathcal{T} $ has dimension lower than $ N+1 $, so, by the orthogonality condition of the previous point, a nonzero solution $ \underline{\hat{x}}^* \in \mathbb{R}^{N+1} $ satisfying all $ M $ quadratic equations can exist (a minimal example is given after point 3). Regrettably, as in the previous point, I am not able to pin down the properties of the matrices $ B_i $ that make this happen. Moreover, I think it is necessary to take into account that $ \underline{\hat{x}} $ has a fixed dummy component, since it is defined as $ \underline{\hat{x}} = [1 \quad \underline{x}^T]^T $.

3) $ c > 0 $. In this case, I think it should be sufficient that a single matrix among the $ B_i $ is negative definite for the system to be incompatible. Indeed, given a negative definite matrix $ B_j^* $, we have $ \underline{\hat{x}}^T B_j^* \underline{\hat{x}} < 0 $ for all $ \underline{\hat{x}} \in \mathbb{R}^{N+1} \setminus \{ \underline{0} \} $, so it cannot satisfy $ \underline{\hat{x}}^T B_j^* \underline{\hat{x}} = c $ with $ c > 0 $. Analogously, if $ c < 0 $, a single positive definite matrix should be sufficient to make the system incompatible. But I cannot understand what happens if the matrices are only semidefinite and/or indefinite (a toy indefinite case is given right below). Do you have any hints?
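To make explicit the step of point 1) that I am relying on (this is just the spanning argument spelled out): if $ r = N+1 $, the independent vectors $ B_j \underline{\hat{x}}^* $ form a basis of $ \mathbb{R}^{N+1} $, so $ \underline{\hat{x}}^* $ can be expanded in that basis and the orthogonality conditions force it to vanish:

$$ \underline{\hat{x}}^* = \sum_{j} \alpha_j \, B_j \underline{\hat{x}}^* \quad \Longrightarrow \quad \| \underline{\hat{x}}^* \|^2 = \sum_{j} \alpha_j \, \langle \underline{\hat{x}}^* , B_j \underline{\hat{x}}^* \rangle = 0 . $$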
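As a minimal illustration of point 2) (toy numbers chosen by hand, and ignoring for a moment the requirement $ N < M $): take $ N = 1 $, $ M = 1 $, $ A_1 = 1 $, $ b = -2 $, $ c = 0 $, i.e. the equation $ x^2 - 2x = 0 $. Then

$$ B_1 = \begin{bmatrix} 0 & -1 \\ -1 & 1 \end{bmatrix}, \qquad \underline{\hat{x}}^* = \begin{bmatrix} 1 \\ 2 \end{bmatrix}, \qquad B_1 \underline{\hat{x}}^* = \begin{bmatrix} -2 \\ 1 \end{bmatrix}, \qquad {\underline{\hat{x}}^*}^T B_1 \underline{\hat{x}}^* = 0 . $$

Here $ \mathcal{T} = \operatorname{span} \{ [-2 \quad 1]^T \} $ has dimension $ r = 1 < N+1 = 2 $, the solution $ \underline{\hat{x}}^* $ is orthogonal to $ \mathcal{T} $, and its first component is indeed the dummy $ 1 $.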
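Finally, a toy case for point 3) showing why I am unsure about indefinite matrices: with $ N = 1 $, $ A_1 = 0 $, $ b = 2 $ we get

$$ B_1 = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}, \qquad \underline{\hat{x}}^T B_1 \underline{\hat{x}} = 2x, $$

which is indefinite (eigenvalues $ \pm 1 $) and admits the nonzero solution $ x = c/2 $ for every $ c \neq 0 $, so the sign argument that works for definite matrices gives no information here.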

Thanks a lot!
