adamant's blog

By adamant, 16 months ago, In English

Hi everyone!

Two problems were recently added to Library Checker, both dealing with polynomials on a geometric progression:

  • evaluation of a polynomial at the points $$$a r^k$$$;
  • interpolation of a polynomial from its values at the points $$$a r^k$$$.

Note: We can divide or multiply the $$$k$$$-th coefficient of the initial/resulting polynomial by $$$a^k$$$, so we may assume $$$a=1$$$.

Today we'll learn how to solve both of them in $$$O(n \log n)$$$.


Evaluation at $$$r^k$$$

Well, the first one is also known as the chirp Z-transform, and it is by now standard (see here or here). It uses the fact that

$$$ \binom{a+b}{2} - \binom{a}{2} - \binom{b}{2} = ab, $$$

which is also important e.g. for graphic generating functions; both sides count the pairs formed by one element from a group of size $$$a$$$ and one from a group of size $$$b$$$. Using this identity, we can represent

$$$ f(r^k) = \sum\limits_{j=0}^{n} f_j r^{kj} = r^{-\binom{k}{2}} \sum\limits_{j=0}^{n} \left(f_j r^{-\binom{j}{2}}\right) r^{\binom{k+j}{2}}. $$$

In other words, the sequence $$$c_k = r^{\binom{k}{2}}f(r^k)$$$ is the cross-correlation $$$c_k = \sum_j a_j b_{k+j}$$$ of the sequences $$$a_i = f_i r^{-\binom{i}{2}}$$$ and $$$b_j = r^{\binom{j}{2}}$$$.

This approach, commonly known as Bluestein's algorithm, lets us solve the first task in $$$O(n \log n)$$$, as a cross-correlation reduces to a convolution. But what about the second?
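
To make this concrete, here is a minimal Python sketch of the evaluation, under some assumptions of mine: coefficients live modulo an NTT-friendly prime such as 998244353, $$$r$$$ is nonzero, and `convolve` is a naive quadratic placeholder that you would swap for an NTT-based convolution to actually get $$$O(n \log n)$$$. The function names are mine, not from the Library Checker problems.

```python
MOD = 998244353  # assumption: a typical NTT-friendly prime

def convolve(a, b):
    # naive O(n^2) placeholder; replace with an NTT for O(n log n)
    res = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            res[i + j] = (res[i + j] + x * y) % MOD
    return res

def pows_binom2(base, cnt):
    # returns [base^binom(k, 2) for k in range(cnt)]
    res, cur, step = [1] * cnt, 1, 1
    for k in range(1, cnt):
        cur = cur * step % MOD   # binom(k, 2) = binom(k-1, 2) + (k-1)
        step = step * base % MOD
        res[k] = cur
    return res

def chirpz(f, r, m):
    # evaluates f at r^0, r^1, ..., r^(m-1); assumes r is invertible mod MOD
    n = len(f)
    pw = pows_binom2(r, n + m)                       # b_j = r^binom(j, 2)
    ipw = pows_binom2(pow(r, MOD - 2, MOD), n + m)   # r^(-binom(j, 2))
    a = [f[i] * ipw[i] % MOD for i in range(n)]      # a_i = f_i * r^(-binom(i, 2))
    # cross-correlation c_k = sum_j a_j b_{k+j}, computed as a convolution
    # with a reversed: conv(a[::-1], pw)[n - 1 + k] = c_k
    c = convolve(a[::-1], pw)
    return [c[n - 1 + k] * ipw[k] % MOD for k in range(m)]
```

For instance, `chirpz([1, 2, 3], 5, 4)` evaluates $$$1 + 2x + 3x^2$$$ at $$$1, 5, 25, 125$$$.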

Interpolation at $$$r^k$$$

It gets suspiciously difficult at this point! Naturally, we may try to invert Bluestein's algorithm, but we will fail. Why? Well, if we know $$$f(r^k)$$$, we can recover $$$c_k$$$ directly. We also know the values of $$$b_j$$$. So it shouldn't be difficult to recover $$$a_i$$$, right? Let's write it as a linear system:

$$$ \begin{pmatrix} c_0 \\ c_1 \\ \dots \\ c_n \end{pmatrix} = \begin{pmatrix} b_0 & b_1 & \dots & b_n \\ b_1 & b_2 & \dots & b_{n+1} \\ \vdots & \vdots & \ddots & \vdots \\ b_n & b_{n+1} & \dots & b_{2n} \end{pmatrix} \begin{pmatrix} a_0 \\ a_1 \\ \dots \\ a_n \end{pmatrix} $$$

It would be natural to expect that we can reverse a cross-correlation in a similar manner to how we invert a regular convolution. Well, it doesn't work this way. Why? Let's look at the system for the convolution of $$$a_i$$$ and $$$b_j$$$ instead of the cross-correlation:

$$$ \begin{pmatrix} c_0 \\ c_1 \\ \dots \\ c_n \end{pmatrix} = \begin{pmatrix} b_0 & 0 & \dots & 0 \\ b_1 & b_0 & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ b_n & b_{n-1} & \dots & b_{0} \end{pmatrix} \begin{pmatrix} a_0 \\ a_1 \\ \dots \\ a_n \end{pmatrix} $$$

Notice anything? Yep. When we invert a convolution, we implicitly find the inverse of a lower-triangular matrix. With a cross-correlation, that is no longer the case. So, what matrix do we actually work with? Generally, matrices like the one we have for the cross-correlation are known as Hankel matrices or, if we reverse the rows, as Toeplitz matrices. From their definition, it seems that inverting a cross-correlation is, in the general case, equivalent to multiplying a vector by the inverse of a general Hankel/Toeplitz matrix.

Okay. How fast can we do that in the general case? Wikipedia points to the so-called Levinson recursion, which runs in $$$O(n^2)$$$, and searching for anything faster doesn't turn up anything fruitful enough: the fastest known algorithms all seem to be $$$O(n \log^2 n)$$$ and based on divide and conquer, similar to the general solution for polynomial interpolation. If we want an $$$O(n \log n)$$$ algorithm, we must do something beyond inverting the generic cross-correlation from Bluestein's algorithm, and we must use the fact that we're working with a geometric progression.

Lagrange interpolation

Let's recall how we could interpolate a polynomial from its values with Lagrange interpolation:

$$$ f(x) = \sum\limits_{i=0}^n y_i \prod\limits_{j \neq i} \frac{x-x_j}{x_i-x_j}. $$$

For the geometric progression $$$1, z, z^2, \dots, z^n$$$, we have the special case $$$x_i = z^i$$$, which allows us to greatly simplify this expression. First of all, we should notice that we can now compute the denominators attached to all the $$$y_i$$$ in $$$O(n)$$$ by representing them as

$$$ \prod\limits_{j \neq i} (z^i - z^j) = z^{ni} \prod\limits_{j \neq i} (1-z^{j-i}) = z^{ni} \prod\limits_{j=1}^i (1-z^{-j}) \prod\limits_{j=1}^{n-i} (1-z^{j}). $$$

So, the denominator attached to $$$y_i$$$ is $$$z^{ni}$$$, times a prefix product of the terms $$$1-z^{-k}$$$, times a prefix product of the terms $$$1-z^k$$$. Now let $$$y'_i$$$ be $$$y_i$$$ divided by its denominator. We can find all $$$y'_i$$$ in $$$O(n \log n)$$$ with the decomposition above.
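
A sketch of this step, reusing `MOD` from the evaluation snippet (per-element modular inversion is used for brevity; batch inversion would make it $$$O(n)$$$ plus a single inverse):

```python
def normalized_values(y, z):
    # y'_i = y_i / prod_{j != i} (z^i - z^j), computed via the decomposition
    # above; assumes the powers z^0, ..., z^n are pairwise distinct
    n = len(y) - 1
    zinv = pow(z, MOD - 2, MOD)
    pref_pos = [1] * (n + 1)   # pref_pos[i] = prod_{j=1..i} (1 - z^j)
    pref_neg = [1] * (n + 1)   # pref_neg[i] = prod_{j=1..i} (1 - z^(-j))
    zp = zn = 1
    for j in range(1, n + 1):
        zp, zn = zp * z % MOD, zn * zinv % MOD
        pref_pos[j] = pref_pos[j - 1] * (1 - zp) % MOD
        pref_neg[j] = pref_neg[j - 1] * (1 - zn) % MOD
    res, zni, znp = [], 1, pow(z, n, MOD)   # zni = z^(n*i)
    for i in range(n + 1):
        denom = zni * pref_neg[i] % MOD * pref_pos[n - i] % MOD
        res.append(y[i] * pow(denom, MOD - 2, MOD) % MOD)
        zni = zni * znp % MOD
    return res
```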

After that, what remains is to compute the sum

$$$ f(x) = \sum\limits_{i=0}^n y'_i \prod\limits_{j \neq i} (x-z^j). $$$

Partial fraction decomposition

Let's divide both sides by $$$Q_n(x) = \prod\limits_{j=0}^n (x-z^j)$$$; then the expression rewrites as

$$$ \frac{f(x)}{Q_n(x)} = \sum\limits_{j=0}^n \frac{y'_j}{x-z^j}. $$$

Hmm, so, if we could find the right-hand side as a formal power series, we might recover $$$f(x)$$$ by multiplying the RHS by $$$Q_n(x)$$$ and then keeping only the first $$$n+1$$$ coefficients. Of course, two questions arise immediately when we think about it:

  • How to find first $$$n+1$$$ coefficients of the right-hand side?
  • How to compute $$$Q_n(x)$$$ quickly enough?

How to expand the right-hand side?

To answer both questions, let's consider the reversed polynomial

$$$ Q_n^R(x) = \prod\limits_{j=0}^n (1-z^j x). $$$

With this, substituting $$$x \mapsto x^{-1}$$$ and multiplying both sides by $$$x^{-1}$$$ (note that $$$f^R(x) = x^n f(x^{-1})$$$ and $$$Q_n^R(x) = x^{n+1} Q_n(x^{-1})$$$), the expression above rewrites as

$$$ \frac{f^R(x)}{Q_n^R(x)} = \sum\limits_{j=0}^n \frac{y_j'}{1-z^j x}. $$$

That's much better! Now we can expand the right-hand side using $$$\frac{1}{1-x} = 1 + x + x^2 + x^3 + \dots$$$, and get...

$$$ \sum\limits_{j=0}^n \frac{y_j'}{1-z^j x} = \sum\limits_{j=0}^n y_j' \sum\limits_{i=0}^\infty z^{ji} x^i = \sum\limits_{i=0}^\infty x^i \sum\limits_{j=0}^n y_j' z^{ij}. $$$

Huh?! The coefficient of $$$x^i$$$ in this expression is the value of $$$g(x) = \sum\limits_{j=0}^n y'_j x^j$$$ at the point $$$x=z^i$$$!

So, we can find the first $$$n+1$$$ coefficients of the right-hand side via a regular chirp Z-transform in $$$O(n \log n)$$$!
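
In code this is just a reuse of the evaluation sketch from the first part, applied to $$$g$$$ (with `y_prime` the values $$$y'_j$$$ from before):

```python
# first n+1 power series coefficients of sum_j y'_j / (1 - z^j x)
rhs = chirpz(y_prime, z, n + 1)   # = [g(z^0), g(z^1), ..., g(z^n)]
```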

How to compute $$$Q_n(x)$$$ quickly?

And what about $$$Q_n^R(x)$$$? Turns out, we can also find it in $$$O(n \log n)$$$ (and $$$Q_n(x)$$$ is simply its reversal)! For this, we notice that

$$$ Q_n^R(x) = Q_{t-1}^R(x) \, Q_{n-t}^R(z^t x) $$$

for any $$$1 \leq t \leq n$$$: the first factor covers the roots $$$z^0, \dots, z^{t-1}$$$ and the second covers $$$z^t, \dots, z^n$$$. If we know $$$Q_{n-t}^R(x)$$$, finding $$$Q_{n-t}^R(z^t x)$$$ can be done by multiplying the $$$k$$$-th coefficient of $$$Q_{n-t}^R(x)$$$ by $$$z^{tk}$$$. Choosing $$$t \approx n/2$$$ so that $$$t-1$$$ and $$$n-t$$$ are (nearly) equal makes both factors come from the same half-size polynomial, so we can compute $$$Q_n^R(x)$$$ with a process similar to binary exponentiation. The procedure then works in

$$$ T(n) = T\left(\frac{n}{2} \right) + O(n \log n) = O(n \log n). $$$
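
Here is a sketch of that procedure (reusing `convolve` and `MOD` from before): when $$$n$$$ is odd, say $$$n = 2m+1$$$, we double from $$$Q_m^R$$$; when $$$n$$$ is even, we peel off the single factor $$$1 - z^n x$$$ and recurse.

```python
def q_reversed(n, z):
    # Q_n^R(x) = prod_{j=0..n} (1 - z^j x), via the doubling identity above
    if n == 0:
        return [1, MOD - 1]                    # 1 - x
    if n % 2 == 0:                             # multiply Q_{n-1}^R by (1 - z^n x)
        q, zn = q_reversed(n - 1, z), pow(z, n, MOD)
        res = [0] * (len(q) + 1)
        for k, c in enumerate(q):
            res[k] = (res[k] + c) % MOD
            res[k + 1] = (res[k + 1] - zn * c) % MOD
        return res
    m = (n - 1) // 2                           # n = 2m+1: roots 0..m and m+1..n
    half = q_reversed(m, z)                    # Q_m^R(x)
    zs, cur, scaled = pow(z, m + 1, MOD), 1, []
    for c in half:                             # Q_m^R(z^(m+1) x): k-th coeff times z^((m+1)k)
        scaled.append(c * cur % MOD)
        cur = cur * zs % MOD
    return convolve(half, scaled)
```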

Altogether, this solves the whole interpolation problem in $$$O(n \log n)$$$. Please refer to my submission on Library Checker for the implementation.
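
Putting the sketches together, one possible assembly (mine, not the linked submission) looks like this:

```python
def interpolate_geometric(y, z):
    # given y[k] = f(z^k) for k = 0..n, recovers the n+1 coefficients of f
    n = len(y) - 1
    yp = normalized_values(y, z)                    # y'_j
    rhs = chirpz(yp, z, n + 1)                      # expansion of sum_j y'_j / (1 - z^j x)
    fr = convolve(rhs, q_reversed(n, z))[:n + 1]    # f^R(x) = RHS(x) * Q_n^R(x) mod x^(n+1)
    return fr[::-1]                                 # reverse back to f(x)
```

As a quick check, `interpolate_geometric([3, 5], 2)` should return `[1, 2]`: the polynomial $$$1 + 2x$$$ indeed takes values $$$3$$$ and $$$5$$$ at $$$z^0 = 1$$$ and $$$z^1 = 2$$$.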


»
16 months ago

Polynomial Hamiltonian path in arbitrary graphs when?

  • »
    »
    16 months ago

    When you make a generating function out of the problem

»
16 months ago

Today we'll learn

who "we"? you are alone here

»
16 months ago

This is very interesting. Thank you for putting in the time to write it.