Goals

Sparsity and convexity are ubiquitous notions in Machine Learning and Statistics. In this course, we study the mathematical foundations of powerful methods based on convex relaxation: L1-regularisation techniques in Statistics and Signal Processing. These approaches turn out to be representable as semidefinite programs (SDP) and are hence tractable in practice. The theoretical part of the course focuses on the guarantees of these algorithms under a sparsity assumption. The practical part presents the standard solvers for these learning problems.

Programme

I. Convex optimisation and acceleration methods
II. Algorithms for high-dimensional sparse regression
III. Theoretical guarantees in high dimension
IV. Compressed learning
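As an illustration of the sparse-regression algorithms covered in part II, the following is a minimal sketch of ISTA (iterative soft-thresholding) applied to the Lasso, the prototypical L1-regularised problem. The problem sizes, regularisation parameter, and iteration count are illustrative assumptions, not course material.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1: shrinks each entry toward zero.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, n_iter=1000):
    # Minimise 0.5 * ||A x - b||_2^2 + lam * ||x||_1 by iterating a gradient
    # step on the smooth term followed by the L1 proximal step.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)           # gradient of the least-squares term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Toy sparse recovery: a 3-sparse signal in dimension 100 observed through
# 40 Gaussian measurements (an assumed setup for illustration only).
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100)) / np.sqrt(40)
x_true = np.zeros(100)
x_true[[5, 17, 60]] = [2.0, -1.5, 1.0]
b = A @ x_true
x_hat = ista(A, b, lam=0.01)
```

In this noiseless regime the three largest entries of `x_hat` sit on the true support, a small instance of the high-dimensional recovery guarantees studied in part III.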

Study: 4h
Course: 22h
TC: 4h

Instructors

  • Yohann DE CASTRO
  • Alexandre SAIDI
  • Céline HARTWEG-HELBERT

Language

French

Keywords

L1-regularization; Sparse Models; Optimization