Article

How Hard Is It to Approximate the Best Nash Equilibrium?

SIAM Journal on Computing (Impact Factor: 0.76). 01/2011; 40:79-91. DOI: 10.1137/090766991
Source: DBLP

ABSTRACT The quest for a PTAS for Nash equilibrium in a two-player game seeks to circumvent the PPAD-completeness of an (exact) Nash equilibrium by finding an approximate equilibrium, and has emerged as a major open question in Algorithmic Game Theory. A closely related problem is that of finding an equilibrium maximizing a certain objective, such as the social welfare. This optimization problem was shown to be NP-hard by Gilboa and Zemel [Games and Economic Behavior, 1989]. However, this NP-hardness is unlikely to extend to finding an approximate equilibrium, since the latter admits a quasi-polynomial time algorithm, as proved by Lipton, Markakis and Mehta [Proc. of 4th EC, 2003]. We show that this optimization problem, namely, finding in a two-player game an approximate equilibrium achieving large social welfare, is unlikely to have a polynomial time algorithm. One interpretation of our results is that the quest for a PTAS for Nash equilibrium should not extend to a PTAS for finding the best Nash equilibrium, which stands in contrast to certain algorithmic techniques used so far (e.g., sampling and enumeration). Technically, our result is a reduction from a notoriously difficult problem in modern combinatorics, that of finding a planted (but hidden) clique in a random graph G(n, 1/2). Our reduction starts from an instance with planted clique size k = O(log n). For comparison, the currently known algorithms, due to Alon, Krivelevich and Sudakov [Random Struct. & Algorithms, 1998] and Feige and Krauthgamer [Random Struct. & Algorithms, 2000], are effective for a much larger clique size k = Ω(√n).
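To make the optimization problem concrete, here is a minimal Python sketch (numpy assumed; the function names are ours, not the paper's) of the two standard definitions the abstract combines: checking that a pair of mixed strategies is an ε-approximate Nash equilibrium of a bimatrix game, and computing its social welfare.

```python
import numpy as np

def is_eps_nash(R, C, x, y, eps):
    """True if mixed strategies (x, y) form an eps-approximate Nash equilibrium
    of the bimatrix game with row/column payoff matrices R and C: neither
    player can gain more than eps by switching to any pure strategy."""
    row_payoff = x @ R @ y
    col_payoff = x @ C @ y
    best_row_deviation = np.max(R @ y)   # best pure response of the row player to y
    best_col_deviation = np.max(x @ C)   # best pure response of the column player to x
    return (best_row_deviation - row_payoff <= eps
            and best_col_deviation - col_payoff <= eps)

def social_welfare(R, C, x, y):
    """Sum of the two players' expected payoffs under (x, y)."""
    return x @ R @ y + x @ C @ y
```

The quasi-polynomial algorithm mentioned in the abstract enumerates strategy pairs whose supports have size O(log n / ε²); by Lipton, Markakis and Mehta such supports always contain an ε-equilibrium that approximately preserves the welfare, which is why the NP-hardness of the exact problem does not directly transfer to the approximate one.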

Full-text available from: Elad Hazan, Mar 09, 2015
ABSTRACT: We consider the problem of approximating the minmax value of a multiplayer game in strategic form. We argue that in 3-player games with 0-1 payoffs, approximating the minmax value within an additive constant smaller than ξ/2, where ξ = (3 − √5)/2 ≈ 0.382, is not possible by a polynomial time algorithm. This is based on assuming hardness of a version of the so-called planted clique problem in Erdős–Rényi random graphs, namely that of detecting a planted clique. Our results are stated as reductions from a promise graph problem to the problem of approximating the minmax value, and we use the detection problem for planted cliques to argue for its hardness. We present two reductions: a randomized many-one reduction and a deterministic Turing reduction. The latter, which may be seen as a derandomization of the former, may be used to argue for hardness of approximating the minmax value based on a hardness assumption about deterministic algorithms. Our technique for derandomization is general enough to also apply to related work about equilibria.
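For reference, a small sketch (numpy assumed; the function name is ours) of the instances behind this hardness assumption: an adjacency matrix of G(n, 1/2), with or without a clique planted on k random vertices, which a detection algorithm must distinguish.

```python
import numpy as np

def planted_clique_instance(n, k, plant, rng=None):
    """Adjacency matrix of G(n, 1/2); if `plant` is True, a clique is
    additionally planted on k vertices chosen uniformly at random.
    The detection problem asks to distinguish the two cases."""
    rng = np.random.default_rng(rng)
    A = rng.integers(0, 2, size=(n, n))
    A = np.triu(A, 1)               # keep the strict upper triangle (no self-loops)
    A = A + A.T                     # symmetrize
    if plant:
        S = rng.choice(n, size=k, replace=False)
        A[np.ix_(S, S)] = 1         # connect every pair inside S
        A[S, S] = 0                 # restore zero diagonal
    return A
```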
ABSTRACT: We consider the following detection problem: given a realization of a symmetric matrix ${\mathbf{X}}$ of dimension $n$, distinguish between the hypothesis that all upper triangular variables are i.i.d. Gaussian variables with mean 0 and variance $1$, and the hypothesis that ${\mathbf{X}}$ is the sum of such a matrix and an independent rank-one perturbation. This setup applies to the situation where, under the alternative, there is a planted principal submatrix ${\mathbf{B}}$ of size $L$ for which all upper triangular variables are i.i.d. Gaussians with mean $1$ and variance $1$, whereas all other upper triangular elements of ${\mathbf{X}}$ not in ${\mathbf{B}}$ are i.i.d. Gaussian variables with mean 0 and variance $1$. We refer to this as the `Gaussian hidden clique problem.' When $L=(1+\epsilon)\sqrt{n}$ ($\epsilon>0$), it is possible to solve this detection problem with probability $1-o_n(1)$ by computing the spectrum of ${\mathbf{X}}$ and considering the largest eigenvalue of ${\mathbf{X}}$. We prove that this condition is tight in the following sense: when $L<(1-\epsilon)\sqrt{n}$, no algorithm that examines only the eigenvalues of ${\mathbf{X}}$ can detect the existence of a hidden Gaussian clique, with error probability vanishing as $n\to\infty$. We prove this result as an immediate consequence of a more general result on rank-one perturbations of $k$-dimensional Gaussian tensors. In this context we establish a lower bound on the critical signal-to-noise ratio below which a rank-one signal cannot be detected.
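A sketch of the spectral test referred to in this abstract (numpy assumed; the function names, the zero diagonal, and the slack term are our choices): under the null hypothesis the largest eigenvalue of the Wigner-type matrix concentrates near 2√n, and a planted block of size L ≥ (1+ε)√n pushes an eigenvalue above that edge.

```python
import numpy as np

def gaussian_hidden_clique(n, L, plant, rng=None):
    """Symmetric n x n matrix with i.i.d. N(0,1) entries above the diagonal;
    if `plant` is True, the entries inside a random L x L principal
    submatrix have their mean shifted from 0 to 1."""
    rng = np.random.default_rng(rng)
    W = rng.standard_normal((n, n))
    X = np.triu(W, 1)
    X = X + X.T                      # symmetric; diagonal set to 0 for simplicity
    if plant:
        S = rng.choice(n, size=L, replace=False)
        X[np.ix_(S, S)] += 1.0       # planted block now has mean-1 entries
    return X

def spectral_detect(X, n):
    """Declare 'planted' when the top eigenvalue exceeds the null-model bulk
    edge, which concentrates around 2*sqrt(n); the n**0.25 term is a heuristic
    slack above the edge."""
    lam_max = np.linalg.eigvalsh(X)[-1]
    return lam_max > 2.0 * np.sqrt(n) + n ** 0.25
```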
ABSTRACT: This lecture continues last week's theme of somewhat specific data models to inform the analysis of clustering and graph partitioning heuristics. We begin with the classical notion of a mixture model.
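As a pointer to the data model the lecture starts from, here is a minimal sketch of sampling from a mixture of Gaussians (spherical components, numpy, and the function name are our assumptions, not the lecture's): each point is drawn by choosing a component according to its weight and adding Gaussian noise around that component's mean.

```python
import numpy as np

def sample_gaussian_mixture(n, means, weights, sigma=1.0, rng=None):
    """Draw n points from a mixture of spherical Gaussians in R^d:
    pick a component label according to `weights`, then add N(0, sigma^2 I)
    noise around that component's mean."""
    rng = np.random.default_rng(rng)
    means = np.asarray(means, dtype=float)            # shape (k, d)
    labels = rng.choice(len(weights), size=n, p=weights)
    points = means[labels] + sigma * rng.standard_normal((n, means.shape[1]))
    return points, labels
```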