The past decade has witnessed a surge in the development and adoption of machine learning algorithms to solve day-to-day computational tasks. Yet a solid theoretical understanding of even the most basic tools used in practice is still lacking, as traditional statistical learning methods are ill-suited to the modern regime in which the number of model parameters is of the same order as the quantity of data, a problem known as the curse of dimensionality. Curiously, this is precisely the regime studied by physicists since the mid-19th century in the context of interacting many-particle systems. This connection, first established in the seminal work of Elizabeth Gardner and Bernard Derrida in the 1980s, is the basis of a long and fruitful marriage between the two fields.
In this talk I will motivate and review the connections between Statistical Physics and problems in high-dimensional Statistics, such as those arising in Machine Learning and Signal Processing. Finally, I will illustrate this discussion with some recent, concrete applications of the Statistical Physics toolbox to different problems of interest.