
Let $A \in \mathbb{F}^{n \times n}$ be a square matrix, and let $(i,j)$ denote the position in the $i$-th row and $j$-th column of $A$.

We say that the position $(i,j)$ is unreachable if the $(i,j)$ entry of $A^k$ is zero for every integer $k \ge 1$, i.e., $(A^k)_{ij} = 0, \quad \forall k \ge 1.$

In this case, the matrix $A$ is said to be elementwise unreachable at position $(i,j)$. Has this notion been studied? How can one determine whether a given position of a matrix is unreachable, and what properties do such matrices have?
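For concreteness, a small numerical illustration of the definition (the matrix below is just a hypothetical example):

```python
import numpy as np

# Hypothetical example: for this matrix, position (1, 2) (row 1, column 2, i.e.
# index [0, 1] in 0-based numpy indexing) is unreachable: the corresponding
# entry of A^k is zero for every k >= 1, although A has nonzero entries elsewhere.
A = np.array([
    [1, 0, 0],
    [0, 0, 1],
    [0, 0, 0],
])

X = A.copy()
for k in range(1, 6):
    print(f"(A^{k})[1, 2] = {X[0, 1]}")   # prints 0 for every k
    X = X @ A
```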

  • Please could you edit the question to tighten things up. I cannot tell if 'unreachable' means the same thing as 'elementwise unreachable'. And you say $(i,j)$ is an element of a matrix, but then call it a position ... – Commented Sep 23 at 16:26
  • @MarkWildon Imagine a matrix $A$ having a zero at some position in the first row, such that the first row of $A^2$ is identically zero. – Commented Sep 23 at 18:31

2 Answers


Ilya's answer shows that all powers of $A$ lie in the span of the first $n$ powers $A, A^2, \dots, A^n$; hence it suffices to compute a single random polynomial $p(A)=\sum_{k=1}^{n} c_k A^k$ with uniformly random coefficients $c_k \in \mathbb{F}$, and any reachable position will be nonzero in $p(A)$ with probability $1-\frac1{|\mathbb{F}|}$ (an unreachable position is, of course, always zero). Using the Paterson–Stockmeyer algorithm this can be done in time $O(n^3)$.
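A minimal sketch of this randomized test over a prime field $\mathbb{F}_p$ (the function name is illustrative, and it uses a plain Horner evaluation with $n$ matrix products, hence $O(n^4)$ with classical multiplication, rather than Paterson–Stockmeyer):

```python
import numpy as np

def reachable_mask(A: np.ndarray, p: int, rng=None) -> np.ndarray:
    """Randomized reachability test over F_p.  Returns a boolean matrix whose
    (i, j) entry is True iff (i, j) is nonzero in a random combination
    sum_{k=1}^{n} c_k A^k (mod p).  Each reachable position is flagged with
    probability 1 - 1/p; unreachable positions are never flagged.
    Assumes integer entries and n * p**2 small enough to fit in int64."""
    rng = rng or np.random.default_rng()
    n = A.shape[0]
    A = A % p
    c = rng.integers(0, p, size=n + 1)     # random coefficients; c[1..n] are used
    X = np.zeros_like(A)
    # Horner scheme: X ends up equal to c_1 A + c_2 A^2 + ... + c_n A^n (mod p).
    for k in range(n, 0, -1):
        X = (A @ X + c[k] * A) % p
    return X != 0
```

Running it a few times with fresh coefficients drives the failure probability of each reachable position down geometrically.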

However, we can actually do better. Consider the following algorithm: start with $X = A$, and for $i$ from $0$ to $\log(n)-1$ set $X = (I + c A^{2^i}) X$ for a fresh random scalar $c$; expanding the product shows that $X$ is a random linear combination of the powers $A, A^2, \dots$ up to at least $A^n$. This can be computed in time $O(M(n) \log(n))$, where $M(n)$ is the complexity of matrix multiplication.

To bound the success probability, consider the reverse view of the algorithm: start from the list $v = (A, A^2, \dots, A^{2^i})$ with $i=\log(n)$, and repeatedly fold the list in half by setting $v_j' = v_j + c\, v_{j+2^{i-1}}$ (for a fresh random $c$) and decreasing $i$. If the position in question was nonzero in some matrix of the list, then after a fold it remains nonzero with probability at least $1-\frac1{|\mathbb{F}|}$. We fold $\log(n)$ times, so the success probability is at least $(1-\frac1{|\mathbb{F}|})^{\log(n)}$. This can be bad (for $\mathbb{F}=\mathbb{F}_2$ it is around $\frac1n$), but instead of working over $\mathbb{F}$ we can choose the scalars from a degree-$d$ extension field, which gives success probability at least $1-\log(n)/|\mathbb{F}|^d$. To achieve total probability $1-\varepsilon$ we set $d=\log_{|\mathbb{F}|}(n / \varepsilon)$. The overall complexity is $O(M(n) \log(n) \log_{|\mathbb{F}|}(n / \varepsilon) \log(\log_{|\mathbb{F}|}(n / \varepsilon)))$.
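A sketch of the doubling procedure, over a prime field $\mathbb{F}_p$ rather than an extension field (for a tiny base field one would draw the scalars from a degree-$d$ extension exactly as described above); the name is illustrative:

```python
import numpy as np

def reachable_mask_fast(A: np.ndarray, p: int, rng=None) -> np.ndarray:
    """Doubling sketch over F_p: computes
        X = (I + c_m A^(2^(m-1))) ... (I + c_2 A^2)(I + c_1 A) A   (mod p)
    with random scalars c_i and m = ceil(log2 n), i.e. O(log n) matrix products.
    Expanded, X is a random combination of A^1, ..., A^(2^m), so each reachable
    position is nonzero in X with probability at least (1 - 1/p)^m; unreachable
    positions are never flagged.  Assumes n * p**2 fits in int64."""
    rng = rng or np.random.default_rng()
    n = A.shape[0]
    A = A % p
    X = A.copy()
    B = A.copy()                          # B holds A^(2^i), starting at i = 0
    for _ in range((n - 1).bit_length()): # ceil(log2 n) rounds
        c = int(rng.integers(0, p))
        X = (X + c * ((B @ X) % p)) % p   # X <- (I + c A^(2^i)) X
        B = (B @ B) % p                   # advance A^(2^i) -> A^(2^(i+1))
    return X != 0
```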

Comparing this to the complexity of computing the transitive closure, which is a special case of what you require and is known to be equivalent to Boolean matrix multiplication, we can see that this is close to optimal.


This answer is more about how to check whether a given position $(i,j)$ is unreachable.

First of all, by Cayley–Hamilton all matrices of the form $A^k$ lie in the linear span $\langle I,A,\dots,A^{n-1}\rangle$; hence, for $i\neq j$, it suffices to check whether the $(i,j)$ entries of $A,A^2,\dots,A^{n-1}$ all vanish. For $i=j$, we also need to check that the constant term of the characteristic polynomial (equivalently, $|A|$) is zero, which guarantees that $I$ is never used; otherwise, even if the position vanishes in $A,A^2,\dots,A^{n-1}$, it will not vanish in $A^n$.
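A sketch of this deterministic check over $\mathbb{Q}$, with exact `Fraction` arithmetic (names are illustrative; for another field, substitute the appropriate arithmetic):

```python
from fractions import Fraction

def is_unreachable(A, i, j):
    """Position (i, j) is unreachable iff the (i, j) entry of A^k vanishes for
    k = 1, ..., n-1 and, when i == j, additionally det(A) = 0 (the constant
    term of the characteristic polynomial is +/- det(A))."""
    n = len(A)
    A = [[Fraction(x) for x in row] for row in A]

    def matmul(X, Y):
        return [[sum(X[r][t] * Y[t][c] for t in range(n)) for c in range(n)]
                for r in range(n)]

    def det(X):
        # Fraction-exact Gaussian elimination.
        X = [row[:] for row in X]
        d = Fraction(1)
        for col in range(n):
            pivot = next((r for r in range(col, n) if X[r][col] != 0), None)
            if pivot is None:
                return Fraction(0)
            if pivot != col:
                X[col], X[pivot] = X[pivot], X[col]
                d = -d
            d *= X[col][col]
            for r in range(col + 1, n):
                f = X[r][col] / X[col][col]
                X[r] = [X[r][c] - f * X[col][c] for c in range(n)]
        return d

    P = A
    for _ in range(1, n):                 # inspect A^1, ..., A^(n-1)
        if P[i][j] != 0:
            return False
        P = matmul(P, A)
    if i == j and det(A) != 0:
        return False
    return True

# e.g. is_unreachable([[1, 0, 0], [0, 0, 1], [0, 0, 0]], 0, 1) returns True
```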

Another approach: the condition says that $A^ke_j$ has zero $i$-th coordinate for each $k\ge 1$. This means that the cyclic subspace generated by $Ae_j$ lies entirely in $V_i=\langle e_1,e_2,\dots,e_{i-1}, e_{i+1},\dots,e_n\rangle$. If $Ae_j=r_1+\dots+r_t$ is the decomposition of $Ae_j$ into a sum of root vectors (with distinct eigenvalues, over some algebraic extension of $\mathbb F$), then that cyclic subspace is the direct sum of the cyclic subspaces generated by the individual $r_s$. So it suffices to check that each of those subspaces lies in $V_i$. That may occasionally turn out to be faster (or more appealing from a theoretical point of view).
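The first reformulation can also be tested directly with matrix-vector products, without forming any matrix powers: the cyclic subspace generated by $Ae_j$ is spanned by $Ae_j, A^2e_j, \dots, A^ne_j$, so it suffices to check that these $n$ vectors all have zero $i$-th coordinate. A sketch (illustrative name, exact arithmetic over $\mathbb{Q}$):

```python
from fractions import Fraction

def is_unreachable_krylov(A, i, j):
    """Check that the cyclic (Krylov) subspace generated by A e_j lies in
    V_i = {x : x_i = 0}, by verifying that the i-th coordinate of A^k e_j
    vanishes for k = 1, ..., n.  Only matrix-vector products are used,
    so the total cost is O(n^3) field operations."""
    n = len(A)
    A = [[Fraction(x) for x in row] for row in A]
    v = [Fraction(int(c == j)) for c in range(n)]   # v = e_j
    for _ in range(n):                              # v runs through A e_j, ..., A^n e_j
        v = [sum(A[r][c] * v[c] for c in range(n)) for r in range(n)]
        if v[i] != 0:
            return False
    return True
```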

I doubt that much more can be said about this.
