
I have the following two linear systems:

$$\begin{bmatrix} u_{11} & u_{12} \end{bmatrix} A = 0$$ $$\begin{bmatrix} u_{21} & u_{22} \end{bmatrix} B = 0$$

Both $A,B$ are $2 \times 2$ matrices with a non-trivial kernel.

Obviously I can treat $(u_{11}, u_{12}, u_{21}, u_{22})$ as a four-dimensional vector. However, I'm interested in solutions for which the matrix $\begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix}$ is unitary. In the vector case, if a linear system has a non-trivial kernel, then there is at least one solution with norm $1$.

More generally, if I consider a linear system: $$U T = 0$$ with $U$ an $n \times n$ matrix and $T$ an $n \times n \times n$ tensor, what are the conditions for a unitary solution to exist?
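To make the row-wise reading of $UT = 0$ concrete (the component form clarified in the comments below, where row $i$ of $U$ must annihilate the matrix slice $T_i$), here is a minimal numerical sketch. The rank-1 test tensor and all variable names are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# Hypothetical instance: each slice T[i] is a random rank-1 (hence
# singular) n x n matrix, so each row of U has a non-trivial left
# kernel to live in -- a necessary condition for any solution at all.
T = np.stack([np.outer(rng.standard_normal(n), rng.standard_normal(n))
              for _ in range(n)])

# Row i of U must satisfy  sum_j U[i, j] * T[i, j, k] = 0  for all k,
# i.e. U[i] lies in the left kernel of the matrix slice T[i].
def left_kernel(M, tol=1e-10):
    """Orthonormal basis (as rows) of ker(M^T), via SVD."""
    _, s, vh = np.linalg.svd(M.T)
    return vh[int(np.sum(s > tol)):]

rows = [left_kernel(T[i])[0] for i in range(n)]  # one kernel vector each
U = np.stack(rows)

# The linear part U T = 0 holds row-wise by construction ...
linear_residual = max(abs(U[i] @ T[i]).max() for i in range(n))
print(linear_residual < 1e-10)

# ... but U is generally not unitary; making the rows orthonormal is
# the extra quadratic constraint U U^T = I that the question asks about.
```

Each row solves its linear system exactly, but nothing forces the rows to be orthonormal; that gap is precisely the question.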

I'm not familiar with tensors and tensor linear systems, so references/notes on them are welcome.

  • Can you write this out in components? Is $A$ a $2\times n\times n$ tensor? Do you then mean that $u_{11} A_{1ij}+u_{12}A_{2ij}=0$ for all $i,j\in\{1,2,\ldots,n\}$? Commented Sep 8, 2023 at 14:54
  • @CarloBeenakker In the end I was writing about the most general case, where we have $u_{11}, \dots, u_{1n}$ and $n$ linear systems constraining each of the rows of $U$. Hope it is clearer now. Commented Sep 9, 2023 at 15:10
  • When you write "I can treat this as a four-dimensional vector", what is "this"? Is $A$ a vector or a matrix? What are its dimensions? Commented Sep 10, 2023 at 14:21
  • @CarloBeenakker Edited. Commented Sep 11, 2023 at 7:26
  • So you want two orthogonal vectors $a$ and $b$ such that $A^\top a=0$ and $B^\top b=0$; these will not exist in general, even if $A$ and $B$ are singular. Commented Sep 11, 2023 at 8:59

1 Answer


In general dimension this problem does not look easy: once you add the conditions coming from $U^T U = I$ to your linear system in the matrix entries, the problem is no longer linear. Systems of quadratic equations are, in general, not easier than systems of polynomial equations, and if the kernels have large dimension I don't see any meaningful simplification (but of course I could be wrong). The problem looks general enough to suggest that any system of polynomial equations could be reduced to it.

In dimension $2$ it's easy because you can solve everything by hand: assume $A, B$ are singular but not the zero matrix (otherwise the problem is trivial); then the kernel of $A^T$ is the span of a single vector $v$ and the kernel of $B^T$ is the span of a single vector $w$, hence a unitary solution exists iff $v^T w = 0$.

EDIT: fixed $A, B$ -> $A^T, B^T$ as noted in the comments.
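The $2 \times 2$ criterion above is easy to check numerically. A minimal sketch in NumPy; the singular matrices $A$, $B$ below are made-up rank-1 examples, chosen so that a solution exists:

```python
import numpy as np

def left_null_vector(M, tol=1e-10):
    """A unit vector u with u @ M = 0, i.e. u in ker(M^T), via SVD."""
    _, s, vh = np.linalg.svd(M.T)
    return vh[int(np.sum(s > tol))]  # first row past the numerical rank

# Made-up singular (rank-1) matrices for illustration.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])    # left kernel spanned by (2, -1)
B = np.array([[2.0, 4.0],
              [-1.0, -2.0]])  # left kernel spanned by (1, 2)

v = left_null_vector(A)  # candidate first row of U
w = left_null_vector(B)  # candidate second row of U

# A unitary (here: real orthogonal) solution exists iff v and w are
# orthogonal; then U with rows v, w solves both systems and U U^T = I.
print(abs(v @ w) < 1e-10)  # True for this pair
```

If instead $v^T w \ne 0$, the two kernel lines are not orthogonal and no unitary $U$ exists, matching the criterion in the answer.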

  • Don't you need to replace $A,B$ by $A^T,B^T$, as the kernels may change? Besides, I guess describing when there is a solution is easy: for every $k\le n-1$, with $A_i^T u_i=0$ for $i\le k$, $A_{k+1}^T$ should be singular on $\{u_1,\ldots,u_k\}^{\perp}$, with $\{u_1,\ldots,u_{k+1}\}$ an orthogonal basis. Commented Sep 16, 2023 at 5:52
  • @ToniMhax You are correct on the first part; those should be the kernels of $A^T$ and $B^T$. On the second part, I don't follow you: in higher dimensions those kernels may have dimension larger than $1$, so the choice of each $u_i$ is non-unique; some choices may work and some may cause obstructions for a later $i$. Commented Sep 16, 2023 at 7:45
  • Yes, I only mean the condition is plain, as in my comment: if it cannot be satisfied, then there is no solution. Commented Sep 16, 2023 at 8:30
