Guest lecture: Information geometry and fast supervised learning
by Professor Richard Nock (Université des Antilles et de la Guyane)
on Thursday March 27 at 12.15 in C222.
Abstract: Supervised learning algorithms like AdaBoost have demonstrated
the superiority of a strategy that minimizes a surrogate of the
empirical risk rather than the risk itself.
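For concreteness (an illustration added here, not part of the original
abstract): on a sample (x_1, y_1), ..., (x_m, y_m) with labels
y_i in {-1, +1}, the empirical risk of a classifier H and the surrogate
risk built from a loss \phi read

\[
  \hat{R}(H) = \frac{1}{m} \sum_{i=1}^{m} \mathbf{1}[y_i H(x_i) \le 0],
  \qquad
  \hat{R}_\phi(H) = \frac{1}{m} \sum_{i=1}^{m} \phi\big(y_i H(x_i)\big),
\]

where AdaBoost corresponds to the convex exponential surrogate
\phi(z) = \exp(-z).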
There are so many candidate surrogates that recent works have
stressed the need to identify classes of surrogates whose members
share the most desirable statistical or computational properties.
In this talk, I will show that a large class of surrogates admits a fast (boosting-type) optimization algorithm that provably converges to the optimum for every member of the class. A particular subset of these surrogates carries an important rationale for classification. The proofs involve tools from convex analysis and the geometric structure of a non-metric space.
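To make "boosting-type optimization" concrete, below is a minimal sketch of classical AdaBoost with decision stumps, which greedily decreases the exponential surrogate above. This is the textbook algorithm, not the new algorithm of the talk; the function names and toy data are illustrative only.

# Minimal AdaBoost with decision stumps: each round greedily reduces
# the exponential surrogate sum_i exp(-y_i * H(x_i)).
import numpy as np

def best_stump(X, y, w):
    """Return the axis-aligned threshold stump with smallest weighted error."""
    best = (np.inf, 0, 0.0, 1)          # (error, feature, threshold, polarity)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - thr) > 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[0]:
                    best = (err, j, thr, pol)
    return best

def adaboost(X, y, rounds=20):
    m = len(y)
    w = np.full(m, 1.0 / m)             # distribution over the examples
    ensemble = []
    for _ in range(rounds):
        err, j, thr, pol = best_stump(X, y, w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)    # leveraging coefficient
        pred = np.where(pol * (X[:, j] - thr) > 0, 1, -1)
        w *= np.exp(-alpha * y * pred)           # upweight the mistakes
        w /= w.sum()
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def predict(ensemble, X):
    score = sum(a * np.where(p * (X[:, j] - t) > 0, 1, -1)
                for a, j, t, p in ensemble)
    return np.sign(score)

# Toy usage: two Gaussian blobs with labels in {-1, +1}.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])
model = adaboost(X, y, rounds=30)
print("training accuracy:", (predict(model, X) == y).mean())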
This is joint work with Frank Nielsen (LIX - Ecole Polytechnique, Palaiseau, France).