
I'm working on a project that requires quickly calculating the resultant of two polynomials in $\mathbb{Z}[x]$ of large degrees $d$ and $e$. The Wikipedia article on polynomial resultants says:

The use of fast multiplication of integers and polynomials allows algorithms for resultants and greatest common divisors that have a better time complexity, which is of the order of the complexity of the multiplication, multiplied by the logarithm of the size of the input ($\log(s(d+e))$, where $s$ is an upper bound on the number of digits of the input polynomials).

Does anyone know of a reference for such a result? A 2021 paper seems to claim that the current fastest method is $d(\log d)^{O(1)}$ (for $d \geq e$), which is obviously much slower than $\log(s(d+e))$. Am I misunderstanding the Wikipedia result?
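
For concreteness, the naive baseline I have in mind is taking the determinant of the $(d+e)\times(d+e)$ Sylvester matrix, which costs roughly $(d+e)^3$ arithmetic operations on large integers. A minimal Python sketch (sympy is used only for the exact determinant, and the inputs are toy-sized):

```python
# Naive baseline: Res(f, g) as the determinant of the Sylvester matrix.
# Roughly (d+e)^3 big-integer arithmetic operations, so this is only meant
# to make the object concrete, not to be fast.
from sympy import Matrix

def sylvester_resultant(f, g):
    """f, g: integer coefficient lists, highest degree first (e.g. x^2 - 1 -> [1, 0, -1])."""
    d, e = len(f) - 1, len(g) - 1
    rows = [[0] * i + f + [0] * (e - 1 - i) for i in range(e)]   # e shifted copies of f
    rows += [[0] * j + g + [0] * (d - 1 - j) for j in range(d)]  # d shifted copies of g
    return Matrix(rows).det()

print(sylvester_resultant([1, 0, -1], [1, -2]))  # Res(x^2 - 1, x - 2) = (1-2)*(-1-2) = 3
```

(sympy also exposes `resultant(f, g, x)`, which, as far as I can tell, uses a subresultant remainder sequence and is already much faster than the determinant route.)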

  • Are you asking from a theoretical or practical perspective? From a practical perspective, the best available algorithm is probably implemented in computer algebra systems like SageMath (see the SageMath sketch after these comments). (Commented Oct 29 at 2:07)
  • Flint (flintlib.org/doc/fmpz_poly.html) refers to "Cohen, Henri: A course in computational algebraic number theory (1996)" and "Collins, George E.: The Calculation of Multivariate Polynomial Resultants, SYMSAC ‘71, ACM 1971, 212–222"; are either of those useful? (Commented Oct 29 at 2:46)
  • Clearly, the complexity in terms of bit operations has to depend on the size of the input, not just on the degree, hence the only possibility is that the claimed $d(\log d)^{O(1)}$ bound only counts the number of arithmetical operations (which is not really adequate for working with arbitrary-size integers). The “complexity of multiplication” as mentioned by Wikipedia is itself $sd(\log sd)^{O(1)}$ bit operations, or $d(\log d)^{O(1)}$ arithmetic operations. (For theoretical purposes, there is an algorithm that makes the exponent $1$, but for practically usable algorithms it is a bit higher.) (Commented Oct 29 at 6:21)
  • ... So anyway, up to the unspecified value of the exponent, the bound on Wikipedia is the same as quoted in that paper: $d(\log d)^{O(1)}$ arithmetical operations, or $sd(\log sd)^{O(1)}$ bit operations. From the wording of the question, it seems to me that what you are misunderstanding is that you are for some reason completely ignoring the most crucial “the order of the complexity of the multiplication multiplied by ...” part of the description of the complexity. (Commented Oct 29 at 6:23)
  • I agree. If you want to copy and paste this into an answer, I’ll accept it! (Commented Oct 29 at 11:10)
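
Following up on the practical point in the first two comments, here is a minimal SageMath sketch (run inside a Sage session, or as plain Python with `from sage.all import *`); the statement that Sage dispatches this to FLINT is my assumption, not something I have verified:

```python
# Practical route from the comments: call an existing implementation.
# Which backend (FLINT, PARI, NTL) Sage actually uses here is an
# assumption, not verified against its source.
from sage.all import ZZ, PolynomialRing

R = PolynomialRing(ZZ, 'x')
f = R.random_element(degree=2000)   # toy stand-ins for the large-degree inputs
g = R.random_element(degree=1500)

r = f.resultant(g)                  # exact resultant in ZZ
print(r.nbits())                    # bit size of the (typically huge) result
```

Timing this at the degrees and coefficient sizes you actually need should tell you quickly whether the asymptotic question matters in practice.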
