Seminar on Computational Complexity

Pavel Pudlák, Jiří Sgall

Previous program of the seminar, spring semester 2008

13. 5. 2008 (Tuesday, 14:00, MÚ Žitná 25, large lecture hall on the ground floor)

Harry Buhrman and John M. Hitchcock: NP-Hard Sets are Exponentially Dense Unless coNP ⊆ NP/poly. CCC 2008.
(presented by Michal Koucký)

Abstract. We show that hard sets S for NP must have exponential density, i.e., |S^{=n}| ≥ 2^{n^ε} for some ε > 0 and infinitely many n, unless coNP ⊆ NP/poly and the polynomial-time hierarchy collapses. This result holds for Turing reductions that make n^{1−ε} queries.
In addition we study the instance complexity of NP-hard problems and show that hard sets also have an exponential number of instances that have instance complexity n^δ for some δ > 0. This result also holds for Turing reductions that make n^{1−ε} queries.

6. 5., 20. 5. 2008 (Tuesday, 14:00, MÚ Žitná 25, large lecture hall on the ground floor)

S. Aaronson and A. Wigderson. Algebrization: A New Barrier in Complexity Theory. STOC'2008. Conference version.
(presented by Jiří Sgall)

Abstract. Any proof of P ≠ NP will have to overcome two barriers: relativization and natural proofs. Yet over the last decade, we have seen circuit lower bounds (for example, that PP does not have linear-size circuits) that overcome both barriers simultaneously. So the question arises of whether there is a third barrier to progress on the central questions in complexity theory. In this paper we present such a barrier, which we call algebraic relativization or algebrization. The idea is that, when we relativize some complexity class inclusion, we should give the simulating machine access not only to an oracle A, but also to a low-degree extension of A over a finite field or ring. We systematically go through basic results and open problems in complexity theory to delineate the power of the new algebrization barrier. First, we show that all known non-relativizing results based on arithmetization do indeed algebrize. Second, we show that almost all of the major open problems, including P versus NP, P versus RP, and NEXP versus P/poly, will require non-algebrizing techniques. In some cases algebrization seems to explain exactly why progress stopped where it did: for example, why we have superlinear circuit lower bounds for PromiseMA but not for NP.

22. 4., 29. 4. 2008 (Tuesday, 14:00, MÚ Žitná 25, large lecture hall on the ground floor)

A. A. Sherstov: Separating AC^0 from depth-2 majority circuits. STOC 2007.
(presented by Tomáš Ebenlendr)

Abstract. We construct a function in AC^0 that cannot be computed by a depth-2 majority circuit of size less than exp(Ω(n^{1/5})). This solves an open problem due to Krause and Pudlák (1994) and matches Allender's classic result (1989) that AC^0 can be efficiently simulated by depth-3 majority circuits. To obtain our result, we develop a novel technique for proving lower bounds on communication complexity. This technique, the Degree/Discrepancy Theorem, is of independent interest. It translates lower bounds on the threshold degree of a Boolean function into upper bounds on the discrepancy of a related function. Upper bounds on the discrepancy, in turn, immediately imply lower bounds on communication and circuit size. In particular, our work yields the first known function in AC^0 with exponentially small discrepancy, exp(−Ω(n^{1/5})).

15. 4. 2008 (Tuesday, 14:00, MÚ Žitná 25, large lecture hall on the ground floor)

L. Fortnow and R. Santhanam. Infeasibility of instance compression and succinct PCPs for NP. STOC 2008.
(presented by Ondřej Zajíček)

Abstract: The OR-SAT problem asks, given Boolean formulae phi_1, ..., phi_m, each of size at most n, whether at least one of the phi_i's is satisfiable. We show that there is no reduction from OR-SAT to any set A where the length of the output is bounded by a polynomial in n, unless NP ⊆ coNP/poly and the polynomial-time hierarchy collapses. This result settles an open problem proposed by Bodlaender et al. [4] and Harnik and Naor [15] and has a number of implications.
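As a toy illustration of the OR-SAT problem defined above, here is a brute-force sketch in Python (the CNF encoding and function names are ours, not from the paper); it only makes sense for tiny formulae, since it enumerates all assignments:

```python
from itertools import product

# A formula is given in CNF: a list of clauses, each clause a list of
# signed variable indices (positive = the variable, negative = its negation).
def satisfiable(cnf, num_vars):
    """Brute-force SAT check over all 2^num_vars assignments."""
    for bits in product([False, True], repeat=num_vars):
        if all(any(bits[v - 1] if v > 0 else not bits[-v - 1] for v in clause)
               for clause in cnf):
            return True
    return False

def or_sat(formulas, num_vars):
    """OR-SAT: is at least one of the given formulas satisfiable?"""
    return any(satisfiable(f, num_vars) for f in formulas)

# phi_1 = (x1) AND (not x1) is unsatisfiable; phi_2 = (x1 OR x2) is satisfiable.
phi_1 = [[1], [-1]]
phi_2 = [[1, 2]]
print(or_sat([phi_1, phi_2], num_vars=2))  # True
```

The point of the paper is that no polynomial-time reduction can compress such an instance (of total size m·n) down to poly(n) bits, unless the polynomial-time hierarchy collapses.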

8. 4. 2008
(Tuesday, 14:00, MÚ Žitná 25, large lecture hall on the ground floor)

Sergey Yekhanin: Locally Decodable Codes From Nice Subsets of Finite Fields and Prime Factors of Mersenne Numbers. ECCC TR07-040
(presented by Tomáš Dzetkulič)

Abstract: A k-query Locally Decodable Code (LDC) encodes an n-bit message x as an N-bit codeword C(x), such that one can probabilistically recover any bit x_i of the message by querying only k bits of the codeword C(x), even after some constant fraction of codeword bits has been corrupted. The major goal of LDC-related research is to establish the optimal trade-off between length and query complexity of such codes. Recently [Y] introduced a novel technique for constructing locally decodable codes and vastly improved the upper bounds for code length. The technique is based on Mersenne primes. In this paper we extend the work of [Y] and argue that further progress via these methods is tied to progress on an old number theory question regarding the size of the largest prime factors of Mersenne numbers. Specifically, we show that every Mersenne number m = 2^t - 1 that has a prime factor p > m^gamma yields a family of k(gamma)-query locally decodable codes of length exp(n^{1/t}). Conversely, if for some fixed k and all epsilon > 0 one can use the technique of [Y] to obtain a family of k-query LDCs of length exp(n^epsilon), then infinitely many Mersenne numbers have prime factors larger than currently known.
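The number-theoretic condition in the abstract, a Mersenne number m = 2^t - 1 with a prime factor p > m^gamma, is easy to explore for small t. A quick sketch of ours (not from the paper), using naive trial division:

```python
from math import log

def largest_prime_factor(m):
    """Largest prime factor by trial division (fine for small Mersenne numbers)."""
    largest = 1
    d = 2
    while d * d <= m:
        while m % d == 0:
            largest = d
            m //= d
        d += 1
    return max(largest, m) if m > 1 else largest

# For each Mersenne number m = 2^t - 1, report gamma with p = m^gamma,
# where p is its largest prime factor.
for t in range(2, 16):
    m = 2**t - 1
    p = largest_prime_factor(m)
    gamma = log(p) / log(m)
    print(f"t={t:2d}  m={m:6d}  largest prime factor={p:5d}  gamma={gamma:.3f}")
    # e.g. t=11: 2047 = 23 * 89, so gamma ~ 0.59
```

When 2^t - 1 is itself prime (a Mersenne prime), gamma = 1, which is exactly the case exploited by Yekhanin's construction.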

25. 3., 1. 4. 2008 (Tuesday, 14:00, MÚ Žitná 25, large lecture hall on the ground floor)

Sergey Yekhanin: New Locally Decodable Codes and Private Information Retrieval Schemes. ECCC TR06-127
Prasad Raghavendra: A Note on Yekhanin's Locally Decodable Codes. ECCC TR07-016
(presented by Tomáš Ebenlendr)

Abstract: Locally Decodable Codes (LDCs) support decoding of any particular symbol of the input message by reading a constant number of symbols of the codeword, even in the presence of a constant fraction of errors. In a recent breakthrough, Yekhanin designed $3$-query LDCs that hugely improve over earlier constructions. Specifically, for a Mersenne prime $p = 2^t - 1$, binary LDCs of length $2^{O(n^{1/t})}$ for infinitely many $n$ were obtained. Using the largest known Mersenne prime, this implies LDCs of length less than $2^{O(n^{10^{-7}})}$. Assuming the infinitude of Mersenne primes, the construction yields LDCs of length $2^{O(n^{1/\log\log n})}$ for infinitely many $n$. Inspired by the breakthrough, we construct $3$-query binary LDCs with the same parameters from Mersenne primes. While all the main technical tools are borrowed from Yekhanin's construction, we give a self-contained simple construction of LDCs. Our bounds do not improve over Yekhanin's, and have worse soundness of the decoder. However, the LDCs are simple and generalize naturally to prime fields other than $F_2 = \{0,1\}$. The LDCs presented also translate directly into three-server Private Information Retrieval (PIR) protocols with communication complexity $O(n^{1/t})$ for a database of size $n$, starting with a Mersenne prime $p = 2^{t} - 1$.
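The constructions above start from a Mersenne prime $p = 2^t - 1$. Whether $2^t - 1$ is prime (for a prime exponent $t > 2$) can be checked with the classical Lucas–Lehmer test; a minimal sketch, not part of either paper:

```python
def lucas_lehmer(t):
    """Lucas-Lehmer test: for a prime exponent t > 2, the Mersenne number
    M_t = 2^t - 1 is prime iff s_{t-2} == 0, where s_0 = 4 and
    s_{i+1} = s_i^2 - 2 (mod M_t)."""
    m = 2**t - 1
    s = 4
    for _ in range(t - 2):
        s = (s * s - 2) % m
    return s == 0

# 2^13 - 1 = 8191 is prime, while 2^11 - 1 = 2047 = 23 * 89 is not.
print(lucas_lehmer(13))  # True
print(lucas_lehmer(11))  # False
```

The exponent $t$ of the Mersenne prime directly controls the code length $2^{O(n^{1/t})}$, which is why the largest known Mersenne prime gives the $n^{10^{-7}}$ exponent quoted above.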

18. 3. 2008 (Tuesday, 14:00, MÚ Žitná 25, large lecture hall on the ground floor)

Laplante, Lee, Szegedy: The quantum adversary method and classical formula size lower bounds
(presented by Pavel Pudlák)

Don't be put off by the "quantum method". The talk will be about classical bounds on formula complexity, based on certain algebraic norms on rectangles. This is interesting in connection with the norms we have been studying, because their method reportedly covers all known lower bounds on formula complexity.

(joint work with D. Gavinski; conclusion of the talk from 26. 2. and 11. 3.)


26. 2., 11. 3. 2008 (Tuesday, 14:00, MÚ Žitná 25, large lecture hall on the ground floor)

(joint work with D. Gavinski)

Abstract: We give the first exponential separation between quantum and classical multiparty communication complexity in the (non-interactive) one-way and simultaneous message passing settings. For every $k \le \sqrt{\log n}$, we demonstrate a relational communication problem between $k$ parties that can be solved exactly by a quantum simultaneous message passing protocol of cost $O(\log n)$ and requires protocols of cost $n^{c/k^2}$, where $c > 0$ is a constant, in the classical non-interactive one-way message passing model with shared randomness and bounded error.