Tentative Syllabus (September 8th, 2016)

1. Introduction.
Definition of learning systems. Goals and applications of machine learning
(classification and regression). Basics of statistical learning theory (the
Vapnik-Chervonenkis bound). Underfitting and overfitting. Use of data: training
set, test set, validation set. (4 classes)
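As a minimal sketch of the three-way use of data mentioned above (assuming Python with NumPy; the function name and split fractions are illustrative choices, not prescribed by the course):

```python
import numpy as np

def train_val_test_split(X, y, val_frac=0.2, test_frac=0.2, seed=0):
    """Shuffle once, then carve the index set into three disjoint parts:
    test set (final assessment), validation set (model selection),
    and training set (parameter fitting)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_test = int(len(X) * test_frac)
    n_val = int(len(X) * val_frac)
    test, val, train = (idx[:n_test],
                        idx[n_test:n_test + n_val],
                        idx[n_test + n_val:])
    return (X[train], y[train]), (X[val], y[val]), (X[test], y[test])

# Tiny demo: 10 points, one feature
X = np.arange(10).reshape(-1, 1)
y = (X.ravel() > 4).astype(int)
(X_tr, y_tr), (X_val, y_val), (X_te, y_te) = train_val_test_split(X, y)
```

The key point is that the three subsets are disjoint: the validation set guides choices such as hyperparameters, while the test set is touched only once, for the final error estimate.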

2. Artificial Neural Networks.
Neurons and biological motivation. The Perceptron
and its learning algorithm. (2 classes)
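A compact sketch of the Perceptron learning algorithm (Rosenblatt's mistake-driven update rule; the helper names below are illustrative, not from the course material):

```python
import numpy as np

def perceptron_train(X, y, epochs=100):
    """Perceptron rule: on each misclassified example, add (or subtract)
    that example to the weight vector. Labels y must be in {-1, +1};
    a constant-1 column absorbs the bias into the weights."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        mistakes = 0
        for xi, yi in zip(Xb, y):
            if yi * (w @ xi) <= 0:   # misclassified (or on the boundary)
                w += yi * xi         # perceptron update
                mistakes += 1
        if mistakes == 0:            # converged: all examples separated
            break
    return w

# Linearly separable toy data
X = np.array([[0., 0.], [0., 2.], [2., 0.], [2., 2.]])
y = np.array([-1, 1, 1, 1])
w = perceptron_train(X, y)
preds = np.sign(np.hstack([X, np.ones((4, 1))]) @ w)
```

On linearly separable data the loop is guaranteed to terminate (the Perceptron convergence theorem); on non-separable data it runs until the epoch limit.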
Multi-Layer Feedforward Neural Networks. The back-propagation (BP) algorithm:
batch and on-line versions. The momentum updating rule. (4 classes)
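A sketch of batch back-propagation with a momentum term for a one-hidden-layer network, assuming squared-error loss and sigmoid units (all names and hyperparameter values here are illustrative assumptions, not the course's code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def add_bias(A):
    # append a constant-1 column so biases are absorbed into the weights
    return np.hstack([A, np.ones((len(A), 1))])

def train_mlp(X, y, hidden=4, lr=0.5, beta=0.9, epochs=5000, seed=0):
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1] + 1, hidden))
    W2 = rng.normal(scale=0.5, size=(hidden + 1, 1))
    V1 = np.zeros_like(W1)
    V2 = np.zeros_like(W2)
    Xb = add_bias(X)
    for _ in range(epochs):
        # forward pass (batch version: all patterns at once)
        H = sigmoid(Xb @ W1)
        Hb = add_bias(H)
        out = sigmoid(Hb @ W2)
        # backward pass: squared-error loss, sigmoid derivative s*(1-s)
        d_out = (out - y) * out * (1 - out)
        d_hid = (d_out @ W2[:-1].T) * H * (1 - H)
        # momentum updating rule: v <- beta*v - lr*grad, w <- w + v
        V2 = beta * V2 - lr * (Hb.T @ d_out)
        V1 = beta * V1 - lr * (Xb.T @ d_hid)
        W2 += V2
        W1 += V1
    return W1, W2

# XOR: the classic problem a single perceptron cannot solve
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])
W1, W2 = train_mlp(X, y)
out = sigmoid(add_bias(sigmoid(add_bias(X) @ W1)) @ W2)
mse = float(np.mean((out - y) ** 2))
```

The on-line version would instead apply the same update after every single pattern; the momentum term `beta * V` reuses the previous step direction to damp oscillations.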
Radial-basis function (RBF) networks: regularized and generalized RBF
networks. Learning strategies and error functions. Unsupervised selection of
centers. Supervised selection of weights and centers: decomposition methods
into two blocks and decomposition methods into more blocks. (4 classes)
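A sketch of the two-stage learning strategy for a generalized RBF network, assuming Gaussian basis functions: centers chosen without the labels (a random subsample here, where clustering such as k-means is also common), then output weights fitted by linear least squares. Function names and parameter values are illustrative assumptions:

```python
import numpy as np

def rbf_features(X, centers, sigma):
    # Gaussian activations phi_j(x) = exp(-||x - c_j||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * sigma ** 2))

def fit_rbf(X, y, n_centers=10, sigma=1.0, seed=0):
    # Stage 1 (unsupervised): pick centers from the data itself
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=n_centers, replace=False)]
    # Stage 2 (supervised): output weights by linear least squares
    Phi = rbf_features(X, centers, sigma)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return centers, w

# Regression demo: fit one period of a sine curve
X = np.linspace(0.0, 2.0 * np.pi, 40).reshape(-1, 1)
y = np.sin(X).ravel()
centers, w = fit_rbf(X, y)
pred = rbf_features(X, centers, 1.0) @ w
mse = float(np.mean((pred - y) ** 2))
```

Fully supervised selection of both weights and centers would instead optimize the (nonconvex) error over all parameters jointly, which is where the block-decomposition methods in the syllabus come in.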

3. Support Vector Machines (Kernel Methods).
Hard and soft maximum margin classifiers. Quadratic programming formulation
of the hard/soft maximum margin separators. Kernel methods. KKT conditions. (2 classes)
Dual formulation of the primal QP problem. Wolfe duality theory for QP.
Decomposition methods. (4 classes)
Implementation tricks: caching, shrinking. (2 classes)
Multiclass SVM problems: one-against-one and one-against-all. (2 classes)
Choosing parameters: k-fold cross-validation. (2 classes)
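The k-fold parameter-selection scheme can be sketched model-agnostically as follows. The estimator here is a simple nearest-centroid classifier standing in for the SVM (training a real SVM would need a QP solver); all names are illustrative assumptions:

```python
import numpy as np

def k_fold_indices(n, k, seed=0):
    """Shuffle the indices once and split them into k near-equal folds."""
    rng = np.random.default_rng(seed)
    return np.array_split(rng.permutation(n), k)

def cross_val_accuracy(fit, predict, X, y, k=5):
    """Average validation accuracy over k train/validation splits:
    each fold serves once as the held-out validation set."""
    folds = k_fold_indices(len(X), k)
    scores = []
    for i in range(k):
        val = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        model = fit(X[train], y[train])
        scores.append(np.mean(predict(model, X[val]) == y[val]))
    return float(np.mean(scores))

# Stand-in model (the course would plug in an SVM with candidate
# parameters here): classify by the nearest class centroid.
def fit_centroids(X, y):
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict_centroids(model, X):
    classes = sorted(model)
    D = np.stack([np.linalg.norm(X - model[c], axis=1) for c in classes])
    return np.array(classes)[D.argmin(axis=0)]

# Two well-separated Gaussian blobs as toy data
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (30, 2)), rng.normal(3, 0.3, (30, 2))])
y = np.array([0] * 30 + [1] * 30)
acc = cross_val_accuracy(fit_centroids, predict_centroids, X, y, k=5)
```

To choose SVM parameters (e.g. the penalty C or a kernel width), one would run this loop for each candidate value and keep the setting with the best average validation score.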

4. Practical Use of Learning Algorithms.
Comparing learning algorithms from the optimization point of view. Use of
standard software. (4 classes)