Parisian Master of Research in Computer Science
Master Parisien de Recherche en Informatique (MPRI)

Machine Learning (48h, 6 ECTS)

In charge: Michèle Sebag (CNRS, LRI - U. Paris-Sud).

Lecturers

  1. Michèle Sebag, CNRS, U. Paris-Sud (lectures)
  2. Benoît Barbot, ENS-Cachan (TD/TP)

Language

The lectures are given in French unless non-French-speaking students attend. Slides are in English.

Motivations and main objectives

We do not know how to write the specification of a competent algorithm for tasks such as face recognition or machine translation. In such domains, an alternative is to provide examples (this is a face, this is not a face) and to let the machine build, from these examples, the algorithm that recognizes faces (called a classifier): this is machine learning.

Short description

The lecture is structured as follows:

  1. An introductory course (history; goals; what is achieved and what is harder than expected; why)
  2. Supervised Machine Learning
  3. Unsupervised Machine Learning
  4. Reinforcement Learning

Lecture 1: Building intelligent machines

Since Turing, machine learning has been at the core of intelligent machines ("one could carry through the organization of an intelligent machine with only two interfering inputs, one for pleasure or reward, and the other for pain or punishment"). But Artificial Intelligence took another direction. This lecture discusses why, and what followed.

Lecture 2: Introduction to Supervised Machine Learning

Notations and formal background. Oldies but goodies: Decision trees.
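As an illustration of the splitting criterion behind classical decision-tree learners such as ID3 and C4.5, here is a minimal sketch of entropy and information gain (the function names and toy labels are ours, not from the lecture):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, groups):
    """Entropy reduction obtained by splitting `labels` into `groups`."""
    n = len(labels)
    return entropy(labels) - sum(len(g) / n * entropy(g) for g in groups)

# A pure split of a balanced binary sample gains a full bit.
labels = ["face", "face", "not", "not"]
gain = information_gain(labels, [["face", "face"], ["not", "not"]])
```

A tree learner greedily chooses, at each node, the test whose split maximizes this gain.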

Lecture 3: How to assess a machine learning algorithm

  1. Which results?
  2. Should I be happy with the results?
  3. Are my algorithm's results better than yours?
  4. How can the algorithm best be tuned?

Lecture 4: Neural nets

Neural networks (NNs) were at the origin of ML, with excellent empirical results.
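The historical starting point is Rosenblatt's perceptron, the single-neuron ancestor of today's networks. A minimal sketch (toy data and function names are ours), learning the logical AND, which is linearly separable:

```python
def perceptron_train(samples, epochs=20, lr=1.0):
    """Rosenblatt perceptron: learn weights w and bias b for +/-1 labels,
    updating only on misclassified examples."""
    dim = len(samples[0][0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        for x, label in samples:
            if label * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:
                w = [wi + lr * label * xi for wi, xi in zip(w, x)]
                b += lr * label
    return w, b

def perceptron_predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Linearly separable toy data: the logical AND of two inputs.
data = [((0, 0), -1), ((0, 1), -1), ((1, 0), -1), ((1, 1), 1)]
w, b = perceptron_train(data)
```

On linearly separable data the update rule provably converges; on XOR it never does, the limitation that famously stalled NN research until multi-layer networks.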

Lecture 5: Support Vector Machines

SVMs overtook NNs, backed by a strong statistical theory and efficient algorithms (nothing is as practical as a good theory).
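One of those efficient algorithms is stochastic subgradient descent on the regularized hinge loss, as in Pegasos. A minimal sketch for a linear SVM without bias term, on toy data separable through the origin (data and names are ours):

```python
import random

def linear_svm(samples, lam=0.01, epochs=200, seed=0):
    """Pegasos-style subgradient descent on the regularized hinge loss.
    Learns a homogeneous separator w (no bias, as in plain Pegasos)."""
    rng = random.Random(seed)
    dim = len(samples[0][0])
    w, t = [0.0] * dim, 0
    for _ in range(epochs):
        for x, y in rng.sample(samples, len(samples)):
            t += 1
            eta = 1.0 / (lam * t)                # decreasing step size
            w = [(1 - 1.0 / t) * wi for wi in w]  # shrink: eta * lam == 1/t
            if y * sum(wi * xi for wi, xi in zip(w, x)) < 1:
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
    return w

data = [((2, 2), 1), ((3, 3), 1), ((-2, -2), -1), ((-3, -3), -1)]
w = linear_svm(data)
```

The regularization term lambda controls the margin/error trade-off; the full SVM story (dual, kernels) is in the lecture.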

Lecture 6: The comeback of neural nets

Deep learning and applications.

Lecture 7: Ensemble learning

From finding the best classifier, to finding an ensemble of them.
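A minimal sketch of one such ensemble method, bagging: train each learner on a bootstrap replicate of the data and aggregate by majority vote (the weak "stump" learner and all names here are our own toy choices):

```python
import random
from collections import Counter

def bootstrap(data, rng):
    """Sample len(data) items with replacement (a bootstrap replicate)."""
    return [rng.choice(data) for _ in data]

def bagged_predict(train, learn, x, n_models=11, seed=0):
    """Bagging: fit n_models learners on bootstrap replicates of `train`
    and return the majority vote of their predictions at x."""
    rng = random.Random(seed)
    votes = [learn(bootstrap(train, rng))(x) for _ in range(n_models)]
    return Counter(votes).most_common(1)[0][0]

# Deliberately weak learner: a one-threshold "stump" at the mean.
def learn_stump(data):
    thr = sum(x for x, _ in data) / len(data)
    return lambda x: 1 if x > thr else 0

train = [(0.1, 0), (0.2, 0), (0.3, 0), (0.7, 1), (0.8, 1), (0.9, 1)]
pred = bagged_predict(train, learn_stump, 0.85)
```

Voting averages out the variance of the individual unstable learners, which is why ensembles often beat their best single member.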

Lecture 8: Clustering

Finding clusters in the data; exploratory analysis. Application: you are given the catalog of a music publisher. How do you put some order in it?
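The workhorse clustering algorithm is k-means (Lloyd's algorithm): alternate between assigning points to the nearest center and recomputing each center as its cluster mean. A minimal 1-D sketch, with hypothetical track tempos standing in for the music catalog:

```python
import random

def kmeans_1d(points, k, iters=20, seed=0):
    """Lloyd's algorithm on scalar data: alternate the assignment step
    and the centroid-update step for a fixed number of passes."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: abs(p - centers[j]))
            clusters[nearest].append(p)
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return sorted(centers)

# Two well-separated groups of track tempos (hypothetical numbers).
tempos = [60, 62, 64, 120, 122, 124]
centers = kmeans_1d(tempos, 2)
```

Each pass can only decrease the within-cluster squared error, so the procedure converges, though only to a local optimum, which is why the initialization matters.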

Lecture 9: Data streaming

Modelling online data, e.g. electric consumption or job queries in a computational grid. The data are huge, and their distribution is (usually) non-stationary.
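These two constraints, unbounded volume and drift, rule out storing the stream and favour one-pass statistics. A minimal sketch of one such statistic, an exponentially weighted running mean (class name and toy stream are ours):

```python
class OnlineMean:
    """Exponentially weighted running mean: O(1) memory per stream, and
    recent observations dominate, which suits drifting distributions."""
    def __init__(self, alpha=0.1):
        self.alpha, self.value = alpha, None

    def update(self, x):
        if self.value is None:
            self.value = x                       # first observation
        else:
            self.value = (1 - self.alpha) * self.value + self.alpha * x
        return self.value

m = OnlineMean(alpha=0.5)
for x in [10, 10, 10, 50, 50, 50]:   # a level shift mid-stream
    m.update(x)
```

Unlike the plain running average, the estimate tracks the level shift within a few observations; alpha trades tracking speed against noise sensitivity.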

Lecture 10: Metric learning

As usual, the problem is more than half solved once the representation and the metric are right.
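A tiny illustration of why the metric matters, with entirely made-up features: under a naive Euclidean distance, a feature with large numeric range (grams) drowns out a more informative one (metres), and reweighting the metric flips the nearest neighbour:

```python
def nearest(query, points, weights):
    """Index of the point nearest to query under a weighted squared
    Euclidean metric with per-feature weights."""
    def dist(p):
        return sum(w * (a - b) ** 2 for w, a, b in zip(weights, query, p))
    return min(range(len(points)), key=lambda i: dist(points[i]))

# Feature 0 in metres, feature 1 in grams: on the raw scale, grams dominate.
points = [(1.0, 1000.0), (5.0, 1010.0)]
query = (1.2, 1009.0)
raw = nearest(query, points, (1, 1))          # grams decide the answer
rescaled = nearest(query, points, (1, 1e-4))  # downweight grams
```

Metric learning automates the choice of such weights (more generally, of a full linear map) from labelled data, instead of fixing them by hand as above.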

Lecture 11: Reinforcement learning

If time permits. The goal is to learn a good policy: how to (learn to) play games, or how to program a robotic controller.
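The canonical entry point is tabular Q-learning, which learns the value of each state-action pair from trial and error via the temporal-difference update. A minimal sketch on a hypothetical 4-state corridor of our own invention:

```python
import random

def q_learning(n_states, actions, step, episodes=500,
               alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning with an epsilon-greedy behaviour policy:
    Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    rng = random.Random(seed)
    Q = [[0.0] * len(actions) for _ in range(n_states)]
    for _ in range(episodes):
        s = 0
        while s is not None:                      # None marks episode end
            a = rng.randrange(len(actions)) if rng.random() < eps \
                else max(range(len(actions)), key=lambda i: Q[s][i])
            s2, r = step(s, actions[a])
            target = r + (gamma * max(Q[s2]) if s2 is not None else 0.0)
            Q[s][a] += alpha * (target - Q[s][a])
            s = s2
    return Q

# Hypothetical 4-state corridor: move right to reach the reward at the end.
def step(s, a):
    s2 = s + 1 if a == "right" else max(s - 1, 0)
    return (None, 1.0) if s2 == 3 else (s2, 0.0)

Q = q_learning(4, ["left", "right"], step)
policy = [max(range(2), key=lambda i: Q[s][i]) for s in range(3)]
```

The learned greedy policy moves right in every state; the discount gamma makes earlier rewards worth more, which is what pushes the agent toward the shortest path.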

Exams

The exam involves two written tests (a mid-term and a final exam).

The questions will be in French (or in English if there are non-French speaking students). Students may answer in French or in English.

Additionally, some focused topics (with an available tutorial) will be proposed: students can volunteer to present a topic to the other students for 30 minutes during lecture time.

Prerequisites

  1. basic statistics
  2. computing skills

Related courses

Bibliography

  1. Bishop, Christopher (2006). Pattern Recognition and Machine Learning. Springer. ISBN 0-387-31073-8
 