Coding for the additive Gaussian noise channel is a probabilistic counterpart to the combinatorial geometry problem of efficiently packing spheres in Rn. Shannon determined the optimal rates in 1948, initiating a long line of developments in the field, including empirically justified practical schemes in the 1990s that are an essential part of the cell phone revolution. Yet for the ubiquitous Gaussian noise channel, a proof that computationally feasible algorithms are reliable at all rates below the Shannon capacity has been lacking. The codes we develop here are sparse superposition codes based on a high-dimensional regression model, with fast encoders and decoders based on iterative regression fits. In this presentation a theoretical demonstration of the desired statistical properties is provided. The error probability is shown to scale favorably with the size of the code; namely, it is exponentially small for all communication rates below capacity. This work is joint with Yale students Antony Joseph and Sanghee Cho.
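To make the regression view concrete, the following is a minimal, hedged sketch (not the talk's actual algorithm) of a sparse superposition code: the codeword is X @ beta, where beta selects one column from each of L sections of a random Gaussian dictionary, and decoding proceeds by iterative regression fits against the residual. All sizes (`n`, `L`, `B`), the amplitude, and the greedy residual-fitting decoder are illustrative assumptions; the theoretically analyzed decoders use more refined adaptive successive or soft-decision steps.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the talk): L sections of B columns.
n, L, B = 256, 4, 8        # n channel uses, N = L*B dictionary columns
N = L * B

# Random Gaussian design matrix: its columns act as regression predictors
# in the high-dimensional model; the message picks one column per section.
X = rng.standard_normal((n, N))

def encode(indices, X, B, amp=1.0):
    """Codeword = superposition of one dictionary column per section."""
    beta = np.zeros(X.shape[1])
    for sec, j in enumerate(indices):
        beta[sec * B + j] = amp
    return X @ beta

def decode(y, X, L, B):
    """Greedy iterative regression fit (illustrative stand-in for the talk's
    decoder): repeatedly pick the undecided section/column pair most
    correlated with the residual, then subtract its least-squares fit."""
    residual = y.astype(float).copy()
    decoded = [None] * L
    for _ in range(L):
        best = None
        for sec in range(L):
            if decoded[sec] is not None:
                continue
            cols = X[:, sec * B:(sec + 1) * B]
            corr = cols.T @ residual
            j = int(np.argmax(corr))
            if best is None or corr[j] > best[0]:
                best = (corr[j], sec, j)
        _, sec, j = best
        decoded[sec] = j
        col = X[:, sec * B + j]
        residual -= (col @ residual) / (col @ col) * col
    return decoded

# Each section index carries log2(B) = 3 bits of the message.
msg = [3, 0, 7, 2]
y = encode(msg, X, B) + 0.1 * rng.standard_normal(n)  # AWGN channel
print(decode(y, X, L, B))
```

The communication rate here is L*log2(B)/n bits per channel use; the theory shows such schemes, with suitable power allocation across sections, achieve any rate below capacity with exponentially small error probability.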