Let $P$ be the class of all polynomial-time computable functions $f:\{0,1\}^*\rightarrow \{0,1\}$. For any $f\in P$, define the function $f^A:\mathbb{N}\rightarrow \{0,1\}^*$ by $$f^A(n)=(f(x_1),\cdots,f(x_{2^n})),$$ where $x_1,\dots,x_{2^n}$ is an enumeration of all strings in $\{0,1\}^n$. If the time complexity of $f$ on inputs of size $n$ is $n^k$, then $f^A(n)$ can be computed in time $n^k 2^n$ by simply computing $f$ on each $x_i$.
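For concreteness, here is a minimal sketch of this brute-force computation; the predicate `f` below (parity of the bits) is just a placeholder for an arbitrary polynomial-time function.

```python
from itertools import product

def f(x: str) -> int:
    # Placeholder for an arbitrary polynomial-time predicate;
    # parity of the bits is used purely as an illustration.
    return sum(int(b) for b in x) % 2

def f_A(n: int) -> str:
    # Evaluate f on every string in {0,1}^n, in lexicographic order.
    # Total time is (time of f on length-n inputs) * 2^n.
    return "".join(str(f("".join(bits))) for bits in product("01", repeat=n))

print(f_A(3))  # one bit per string of length 3, here "01101001"
```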
The amortized complexity of $f$ is defined to be $\text{AC}(f^A,n)=\frac{\text{Complexity}(f^A(n))}{2^n}$, where $\text{Complexity}(f^A(n))$ is the minimum time needed to compute $f^A(n)$ expressed in terms of $n$.
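Note that the brute-force bound above already shows that, for $f\in P$ running in time $n^k$,
$$\text{AC}(f^A,n)\le\frac{n^k\,2^n}{2^n}=n^k,$$
so the amortized complexity is never worse than the worst-case complexity; the questions below ask whether it is always asymptotically smaller.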
My questions:
- Do we always have $\text{AC}(f^A,n)\ll C(f,n)$ for every $f\in P$, where $C(f,n)$ is the worst-case complexity of computing $f$ on inputs of size $n$?
- Does there exist a constant $c$ such that $\text{AC}(f^A,n)\ll n^c$ for all $f\in P$?
Any references to the literature would be appreciated.
Edit: Here is a simple example where the amortized complexity is asymptotically smaller than the worst-case complexity. Let $f(x)=n_x\bmod 3$, where $n_x$ is the integer represented by the binary string $x$. Computing $f$ on a single input requires $\Omega(|x|)$ time (by an adversary argument), but the amortized complexity is $O(1)$: enumerating $x\in\{0,1\}^n$ in lexicographic order, $n_x$ increases by one at each step, so each successive value of $n_x\bmod 3$ is obtained from the previous one in constant time.
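A sketch of this amortized computation (the incremental update is the whole point; the function name is just for illustration):

```python
def f_A_mod3(n: int) -> list:
    # Enumerate x in lexicographic order, i.e. n_x = 0, 1, ..., 2^n - 1.
    # Each successive value of n_x mod 3 is obtained from the previous one
    # by a single increment mod 3, so the whole table takes O(2^n) time,
    # i.e. O(1) amortized per entry.
    out = []
    r = 0  # n_x mod 3 for the current x
    for _ in range(2 ** n):
        out.append(r)
        r = (r + 1) % 3
    return out

print(f_A_mod3(3))  # [0, 1, 2, 0, 1, 2, 0, 1]
```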
One can, of course, consider other examples, such as NP-complete problems that can be solved by dynamic programming, but there the gap between amortized and worst-case complexity is only conditional (it relies on assumptions such as $\mathrm{P}\neq\mathrm{NP}$).