The binary perceptron model, a simple single-layer neural network, has a rich history in theoretical physics and machine learning. The model asks for a sign vector that satisfies a set of random halfspace constraints. The two central questions are: for what constraint densities do solutions exist with high probability, and can we efficiently find a solution when one exists?
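For concreteness (the abstract fixes no notation, so the Gaussian disorder and the parameters $\kappa$ and $\alpha = M/n$ below are assumptions of this sketch), one standard formalization is: given i.i.d. standard Gaussian vectors $g_1, \dots, g_M \in \mathbb{R}^n$, find $\sigma \in \{-1,+1\}^n$ such that
\[
  \frac{\langle g_\mu, \sigma\rangle}{\sqrt{n}} \;\ge\; \kappa \quad \text{for all } \mu \quad \text{(halfspace version)},
  \qquad
  \left|\frac{\langle g_\mu, \sigma\rangle}{\sqrt{n}}\right| \;\le\; \kappa \quad \text{for all } \mu \quad \text{(symmetric version)}.
\]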
In this talk, I will discuss my work addressing both questions, guided by long-standing conjectures from physics. These conjectures predict a sharp satisfiability threshold for the existence of solutions, as well as a strong freezing property: almost all solutions are isolated, suggesting that finding solutions with polynomial-time algorithms is typically hard. For the symmetric binary perceptron, we rigorously establish both predictions. The strong freezing property is particularly intriguing because, empirically, polynomial-time algorithms often do succeed in finding a solution, challenging the typical-hardness prediction. This suggests that such algorithms find atypical solutions. We formally establish this phenomenon, showing that at low constraint density there exists a rare but well-connected cluster of solutions, and that an efficient multiscale majority algorithm finds solutions in this cluster with high probability. Additionally, we analyze the performance of canonical discrepancy minimization algorithms for the binary perceptron, yielding new algorithmic results.
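To make the search problem concrete, here is a minimal sketch in Python. It implements only a naive greedy local search for the symmetric binary perceptron; it is not the multiscale majority algorithm or the discrepancy minimization algorithms discussed in the talk, and all names and parameter values are illustrative assumptions.

```python
# Toy sketch only: naive greedy local search for the symmetric binary
# perceptron. Not the multiscale majority algorithm from the talk; all
# parameters (n, alpha, kappa, iteration budget) are illustrative.
import numpy as np

def satisfies(G, sigma, kappa):
    """Check |<g_mu, sigma>| / sqrt(n) <= kappa for every constraint row g_mu."""
    n = G.shape[1]
    return bool(np.all(np.abs(G @ sigma) / np.sqrt(n) <= kappa))

def greedy_flip(G, kappa, iters=2000, seed=0):
    """Start from a random sign vector; greedily flip single coordinates
    whenever the flip reduces the worst absolute constraint margin."""
    rng = np.random.default_rng(seed)
    m, n = G.shape
    sigma = rng.choice([-1.0, 1.0], size=n)
    margins = G @ sigma                      # m-vector of <g_mu, sigma>
    for _ in range(iters):
        if np.max(np.abs(margins)) / np.sqrt(n) <= kappa:
            return sigma                     # all constraints satisfied
        i = rng.integers(n)
        new_margins = margins - 2 * sigma[i] * G[:, i]  # effect of flipping coordinate i
        if np.max(np.abs(new_margins)) < np.max(np.abs(margins)):
            sigma[i] = -sigma[i]
            margins = new_margins
    return None                              # no solution found within budget

if __name__ == "__main__":
    n, alpha, kappa = 200, 0.2, 1.0          # illustrative, low constraint density
    rng = np.random.default_rng(1)
    G = rng.standard_normal((int(alpha * n), n))
    sol = greedy_flip(G, kappa)
    print("found a solution:", sol is not None and satisfies(G, sol, kappa))
```

Such single-flip heuristics only illustrate the algorithmic question; the point of the freezing prediction is precisely that, for typical solutions, local search of this kind should fail, which is why the solutions found by efficient algorithms are atypical.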