
The answers to the following question:

Hierarchy theorem for circuit size

give a "circuit hierarchy theorem" for Boolean circuits. Does a similar hierarchy theorem exist for arithmetic circuits? In particular, for $f(n) \gg g(n)$ (for some notion of $\gg$), does there exist a family of polynomials $h(x_1, \ldots, x_n) \in \mathbb{F}[x_1, \ldots, x_n]$, $n \geq 1$, computable by arithmetic circuits of size $f(n)$ but not by arithmetic circuits of size $g(n)$?


This question has a somewhat trivial answer because the polynomial $x^{2^s}$ requires $s$ multiplications, so you can just take $h = x_1^{2^{f(n)}}$. This is one of the reasons why in algebraic complexity we almost always talk about families with polynomially bounded degree.
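As a concrete illustration (a small sketch, not part of the original answer): the polynomial $x^{2^s}$ is computed by $s$ repeated squarings, one multiplication per step, which matches the stated multiplication count.

```python
def repeated_squaring_power(x, s):
    """Return x**(2**s) using exactly s multiplications (squarings)."""
    result = x
    for _ in range(s):
        result = result * result  # one multiplication per squaring step
    return result

print(repeated_squaring_power(3, 4))  # 3**16 = 43046721
```

A size-$s$ arithmetic circuit built the same way (a chain of $s$ multiplication gates) computes $x^{2^s}$, while the degree argument shows no smaller circuit can.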

For a less trivial statement, we can use the same strategy as in the Boolean setting, but instead of simple counting, in algebraic complexity we use dimension counting.

Main idea: if a circuit computing a non-constant polynomial has $s$ gates, then it uses at most $s$ constants. By fixing the structure of the circuit and varying the constants, we see that the set of all polynomials computed by circuits with that fixed structure (which is a Zariski-constructible set by Chevalley's theorem) has dimension at most $s$. There are only finitely many circuit structures of complexity $s$, so the set of all polynomials of complexity $s$ has dimension at most $s$. Comparing $s$ with the dimension of the space of polynomials of a given degree and number of variables, we get a statement about the existence of hard polynomials.
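To make the final comparison concrete (a hypothetical numerical sketch, not from the original answer): the space of polynomials of degree at most $d$ in $n$ variables has dimension $\binom{n+d}{d}$, so whenever $\binom{n+d}{d} > s$, some such polynomial is not computable by any circuit of size $s$.

```python
from math import comb

def hard_polynomial_exists(n, d, s):
    """Dimension-counting criterion: if the dimension C(n+d, d) of the
    space of degree-<=d polynomials in n variables exceeds the dimension
    bound s for the set of size-s-computable polynomials, a hard
    polynomial of that degree and arity must exist."""
    return comb(n + d, d) > s

# Example: degree-10 polynomials in 10 variables vs. circuits of size 10^5.
# C(20, 10) = 184756 > 100000, so a hard polynomial exists.
print(hard_polynomial_exists(10, 10, 10**5))
```

Since $\binom{n+d}{d}$ grows exponentially in $\min(n, d)$ while any polynomial size bound $s(n)$ grows polynomially, this criterion is eventually satisfied, giving hard polynomial families.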

Vladimir Lysikov