
We consider Boolean circuits defined as follows: inner nodes are AND and OR gates (both of fan-in 2) or NOT gates (fan-in 1). Each gate has fan-out 2. The size of a circuit is the number of inner nodes (gates) in it.

How can I show that there are Boolean functions $f \colon \{0,1\}^* \to \{0,1\}$ for which the following hold:

  1. $f$ depends only on the first $\lceil 2.1 \cdot \log n \rceil$ bits of the input (i.e., the output of $f$ on inputs of length $n$ depends only on a prefix of this logarithmic length of the input), and
  2. $f$ cannot be recognized (decided) by any circuit family $\{C_n\}_{n \in \mathbb{N}}$ of size $O(n^2)$.

How can I prove this? Should I use Shannon's counting argument? Could anybody give me some idea?

Would we then conclude that $\text{SIZE}(n^2) \subsetneq \text{SIZE}(n^3)$? That is, how do we prove that there are languages that can be decided by circuit families of size $O(n^3)$ but cannot be decided by circuit families of size $O(n^2)$?

  • Are you sure the fan-out is $2$? This is unusual. Commented Sep 20, 2024 at 16:31

1 Answer


To use Shannon's argument, you first need the following statement: if a function $f$ depends only on the first $m$ bits of the input, then every optimal circuit for $f$ (any circuit of minimal size) uses only the first $m$ input bits.

To prove this, consider a circuit computing $f$ which uses some input bits beyond the first $m$. Since $f(x_1, \dots, x_n) = f(x_1, \dots, x_m, 0, \dots, 0)$, we can replace all inputs except the first $m$ by constant $0$ inputs, and then repeatedly apply the identities $0 \wedge x = 0$, $0 \vee x = x$, $\neg 0 = 1$ (and their duals $1 \wedge x = x$, $1 \vee x = 1$, $\neg 1 = 0$) to remove at least one gate and obtain a new circuit which

  • Still computes the function $f$
  • Has fewer gates than the original one
  • Uses only inputs $x_1, \dots, x_m$

So for every circuit which uses bits after $x_m$ there is a smaller circuit which does not use these bits; it follows that an optimal circuit only uses $x_1, \dots, x_m$.
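To make the constant-propagation step concrete, here is a minimal sketch of my own (the tuple representation of circuits is hypothetical, not from the answer): a node is an input index, a Boolean constant, or a `("AND"/"OR"/"NOT", children...)` tuple, and `simplify` fixes all inputs from $x_m$ onward to $0$ and applies the identities above.

```python
def simplify(node, m):
    """Replace inputs x_i with i >= m by 0, then apply the identities
    0 AND x = 0, 1 AND x = x, 1 OR x = 1, 0 OR x = x, NOT 0 = 1."""
    if isinstance(node, bool):          # constant (check before int:
        return node                     # bool is a subclass of int)
    if isinstance(node, int):
        return node if node < m else False  # inputs beyond x_m become 0
    op = node[0]
    if op == "NOT":
        a = simplify(node[1], m)
        return (not a) if isinstance(a, bool) else ("NOT", a)
    a, b = simplify(node[1], m), simplify(node[2], m)
    if op == "AND":
        if a is False or b is False:
            return False                # 0 AND x = 0
        if a is True:
            return b                    # 1 AND x = x
        if b is True:
            return a
        return ("AND", a, b)
    # op == "OR"
    if a is True or b is True:
        return True                     # 1 OR x = 1
    if a is False:
        return b                        # 0 OR x = x
    if b is False:
        return a
    return ("OR", a, b)

# Example: (x0 AND x5) OR (NOT x5) with m = 3 collapses entirely.
print(simplify(("OR", ("AND", 0, 5), ("NOT", 5)), 3))  # -> True
```

Every simplification step deletes at least one gate, which is exactly why a circuit that touches bits beyond $x_m$ cannot be optimal.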

Now we can use Shannon's argument. There exists a function of $m = \lceil 2.1 \log n \rceil$ bits which requires circuits of size $$\frac{2^m}{m}(1 - o(1)) = \frac{n^{2.1}}{2.1 \log n}(1 - o(1)),$$ which is asymptotically more than $n^2$.
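For completeness, the counting behind this existence claim can be sketched as follows (a standard Shannon-style estimate; the constants are mine, not from the answer). Each of the $s$ gates picks one of $3$ types and at most $2$ predecessors among the $s + m + 2$ available nodes (gates, inputs, constants), so

```latex
\#\{\text{circuits of size } s \text{ on } m \text{ inputs}\}
  \;\le\; \bigl(3\,(s+m+2)^2\bigr)^{s} \;=\; 2^{O(s \log s)},
\qquad
\#\{\text{functions on } m \text{ bits}\} \;=\; 2^{2^m}.
```

Whenever $2^{O(s \log s)} < 2^{2^m}$, some function on $m$ bits has no circuit of size $s$; this inequality already holds for $s = c \cdot 2^m / m$ with a small enough constant $c > 0$.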

On the other hand, Shannon's or Lupanov's upper bound says that every function of $m = \lceil 2.1\log n\rceil$ variables is computed by a circuit of size $$O\!\left(\frac{2^m}{m}\right) = O\!\left(\frac{n^{2.1}}{2.1 \log n}\right) = O(n^3).$$
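As a quick numeric sanity check (my own illustration, not part of the answer): the lower bound $2^m/m$ with $m = \lceil 2.1 \log_2 n \rceil$ does eventually exceed $n^2$ while staying below $n^3$, but note the crossover happens only at very large $n$, since $n^{0.1}$ must overtake $2.1 \log_2 n$.

```python
import math

# Compare the Shannon bound 2^m / m against n^2 and n^3 for n = 2^k.
for k in (20, 80, 120):
    n = 2**k
    m = math.ceil(2.1 * k)        # log2(n) = k exactly for n = 2^k
    shannon = 2**m // m           # exact integer arithmetic
    print(f"n=2^{k}: 2^m/m > n^2? {shannon > n**2}, "
          f"2^m/m < n^3? {shannon < n**3}")
```

For $n = 2^{20}$ the bound is still below $n^2$; by $n = 2^{80}$ it has crossed over, consistent with the asymptotic claim.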

  • Thank you for your answer, but what about the second condition: $f$ cannot be recognized (decided) by any circuit family $\{C_n\}_{n \in \mathbb{N}}$ of size $O(n^2)$? Commented Sep 20, 2024 at 16:39
  • $(n^{2.1}/2.1\log n)(1 - o(1))$ is not $O(n^2)$. Commented Sep 20, 2024 at 16:39
  • Okay, understood; my mistake was thinking that $n^{2.1}/\log n = O(n^2)$. Commented Sep 20, 2024 at 16:44
  • By Shannon's or Lupanov's upper bound, any function of $2.1\log n$ bits can be computed in size $O(\frac{n^{2.1}}{2.1\log n}) = O(n^3)$, but we proved that there exist such functions which cannot be computed by circuits of size $O(n^2)$. Commented Sep 20, 2024 at 16:53
  • Thank you, you cleared up my confusion completely. Would you put your last comment into your answer if I move my last question (from the comments) into the question? Commented Sep 20, 2024 at 16:58
