Master course in Communication Engineering


Machine Learning for Pattern Recognition (MLPR) -- Profs. A. Bononi / S. Cagnoni
Prof. Alberto Bononi -- Tel. 0521 905760 -- alberto.bononi@unipr.it -- http://www.tlc.unipr.it/bononi/didattica/ML/ML.html
 Course Objectives

The objective of the course is to provide the student with the ability to understand and apply the basic methods of machine learning for pattern recognition. The objective of Part 1 is to provide the student with the statistical foundations of machine learning.


 Classes (Academic Year 2023-2024)

Part 1 (Prof. Bononi): Monday 16:30-18:30, in person (room B/4, scientific complex) and online on Teams (team course registration code: ng2crh3). Video lectures from a previous year are available, along with class notes. The ID and password to access the videos/slides will be communicated in class at the first lecture.

Part 2 (Prof. Cagnoni): Wednesday 14:30-16:30, in person, Room 2 (teaching complex); Thursday 13:30-15:30, in person, Room 2 (class videos will be available on Elly: https://elly2023.dia.unipr.it/course/view.php?id=424 ).


 Office Hours

Bononi: Monday 13:30-15:30 (Scientific Complex, Building 2, floor 2, Room 2/19T).


 Credits and Organization

This course is worth 6 credits (CFU). It is split into Part I, "Statistical Foundations of ML" (Prof. Bononi, 3 CFU, twelve 2-hour classes), and Part II, "Applications of ML" (Prof. Cagnoni, 3 CFU, twelve 2-hour classes). Prof. Cagnoni's official 3 CFU are borrowed from the Information Engineering "Machine Learning" (ML) course.


 Suggested Reading for Part I (Bononi)

[1] C. W. Therrien, "Decision, Estimation and Classification", Wiley, 1989.
[2] R. O. Duda, P. E. Hart, D. G. Stork, "Pattern Classification", 2nd ed., Wiley, 2001.
[3] D. Barber, "Bayesian Reasoning and Machine Learning", Cambridge University Press, 2012.
[4] C. M. Bishop, "Pattern Recognition and Machine Learning", Springer, 2006.
[5] T. Hastie, R. Tibshirani, J. Friedman, "The Elements of Statistical Learning: Data Mining, Inference, and Prediction", Springer, 2008.


 Exams

Part 1, Bononi:
Oral only, to be scheduled on an individual basis. When ready, please contact the instructor by email at alberto.bononi[AT]unipr.it, specifying the requested date. The exam consists of solving some exercises and explaining the theoretical details connected with them, for a total time of about 1 hour. You may bring a summary of important formulas on an A4 sheet to consult if you wish.
Part 2, Cagnoni:
A practical project will be assigned in agreement with the student; the student will present and discuss its results both in a written report and in an oral presentation.



Syllabus (2 hours per class)


PART I: Statistical Foundations of Machine Learning (Bononi)



Lec. 1. Introduction to ML
- Problem statement and definitions
- Examples of machine learning problems
- Glossary of equivalent terms in radar detection theory, hypothesis testing, and machine learning

Lec. 2. Probability refresher
- Axioms, conditional probability, total probability law, Bayes law, double conditioning, chain rule, independence and conditional independence of events.
- Discrete random variables (RV): expectation, conditional expectation. Pairs of RVs. Sum rule. Iterated expectation. Vectors of RVs. An extended example.
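
A quick numerical illustration of Bayes' law and the total probability law covered in this lecture; the prior and likelihood values below are invented for the example (a minimal Python sketch):

    # Hypothetical binary setting: prior P(H1)=0.3, likelihoods P(D|H1)=0.9, P(D|H0)=0.2.
    p_h1 = 0.3
    p_h0 = 1.0 - p_h1
    p_d_given_h1 = 0.9
    p_d_given_h0 = 0.2
    p_d = p_d_given_h1 * p_h1 + p_d_given_h0 * p_h0  # total probability law
    p_h1_given_d = p_d_given_h1 * p_h1 / p_d         # Bayes' law
    print(p_h1_given_d)                              # posterior ~ 0.659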

Lec. 3. Probability refresher
- Random vectors: expectation, covariance and its properties, spectral decomposition of the covariance matrix, whitening.
- Continuous RVs: parallels with discrete RVs. Functions of RVs. Mixed RVs. Continuous random vectors.
- Appendix: differentiation rules for vectors and matrices.
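
A minimal numpy sketch of whitening through the spectral decomposition of the covariance matrix, as discussed in this lecture; the 2-D Gaussian data and the covariance values are illustrative choices:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.multivariate_normal([0, 0], [[4.0, 1.5], [1.5, 1.0]], size=5000)

    C = np.cov(X, rowvar=False)          # sample covariance matrix
    lam, U = np.linalg.eigh(C)           # spectral decomposition: C = U diag(lam) U^T
    W = U @ np.diag(lam ** -0.5) @ U.T   # whitening matrix C^(-1/2)
    Z = X @ W.T                          # whitened data
    print(np.cov(Z, rowvar=False).round(2))  # ~ identity matrix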

Lec. 4. Gaussian RVs and introduction to classification
- Gaussian RVs and their linear transformations. Mahalanobis distance.
Classification:
- Bayesian prediction: introduction, loss function, conditional risk, argmin/argmax rules
- Bayes classification: introduction

Lec. 5. Classification
- 0/1 loss -> maximum a posteriori (MAP) classifier. Binary MAP. Decision regions.
- Classifier performance.
- Likelihood ratio tests and the receiver operating characteristic (ROC) (see the sketch after this list)
- Minimax rule
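
A minimal Python sketch tying together the MAP rule, the likelihood ratio test, and the empirical ROC for two unit-variance Gaussian hypotheses; the means, priors, and thresholds are hypothetical:

    import numpy as np

    # H0: x ~ N(0,1) vs H1: x ~ N(1,1), with equal (hypothetical) priors.
    p0 = p1 = 0.5
    pdf = lambda x, m: np.exp(-0.5 * (x - m) ** 2) / np.sqrt(2 * np.pi)
    lr = lambda x: pdf(x, 1.0) / pdf(x, 0.0)   # likelihood ratio

    rng = np.random.default_rng(0)
    x0 = rng.normal(0.0, 1.0, 10000)   # observations under H0
    x1 = rng.normal(1.0, 1.0, 10000)   # observations under H1

    # MAP rule: decide H1 when the likelihood ratio exceeds p0/p1.
    print("Pfa =", np.mean(lr(x0) > p0 / p1), " Pd =", np.mean(lr(x1) > p0 / p1))

    # Empirical ROC: sweep the test threshold eta.
    for eta in [0.25, 0.5, 1.0, 2.0, 4.0]:
        print(f"eta={eta}: Pfa={np.mean(lr(x0) > eta):.3f}, Pd={np.mean(lr(x1) > eta):.3f}")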

Lec. 6. Classification
- Binary Gaussian classification
- Homoscedastic case: linear discriminant analysis
- Heteroscedastic case: Bhattacharyya bound
- Bayes classification with discrete features
- Classification with missing data (composite hypothesis testing)
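
For the homoscedastic case, a minimal numpy sketch of the resulting linear discriminant; the class means, common covariance, and test point are invented for the example:

    import numpy as np

    # Binary Gaussian model with a common covariance: x ~ N(mu_k, C) under class k.
    mu0, mu1 = np.array([0.0, 0.0]), np.array([2.0, 1.0])
    C = np.array([[1.0, 0.3], [0.3, 1.0]])
    p0 = p1 = 0.5

    # The MAP rule reduces to the linear test w^T x + b > 0 -> decide class 1.
    Cinv = np.linalg.inv(C)
    w = Cinv @ (mu1 - mu0)
    b = -0.5 * (mu1 @ Cinv @ mu1 - mu0 @ Cinv @ mu0) + np.log(p1 / p0)

    x = np.array([1.2, 0.4])   # a test point
    print("decide class 1" if w @ x + b > 0 else "decide class 0")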

Lec. 7. Estimation
- Bayesian estimation: introduction
- Quadratic loss: minimum mean square error (MMSE) estimator = regression curve
- L1 loss: minimum mean absolute error (MMAE) estimator
- 0/1 loss: MAP estimator, and maximum likelihood (ML) under a uniform prior.
- Regression for vector Gaussian case
- ML estimation for Gaussian observations
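
A small Python sketch contrasting the three Bayesian estimators of this lecture on samples from a skewed stand-in posterior; the Gamma(2,1) choice is purely illustrative:

    import numpy as np

    rng = np.random.default_rng(0)
    theta = rng.gamma(shape=2.0, scale=1.0, size=100000)  # stand-in posterior samples

    mmse = theta.mean()                # quadratic loss -> posterior mean (~2.0)
    mmae = np.median(theta)            # L1 loss -> posterior median (~1.68)
    hist, edges = np.histogram(theta, bins=200)
    map_est = edges[np.argmax(hist)]   # 0/1 loss -> posterior mode (~1.0, approx.)
    print(mmse, mmae, map_est)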

Lec. 8. Estimation
- ML for multinomial
- Conjugate priors in MAP estimation
- Estimation accuracy and ML properties, Cramér-Rao bounds.
Suboptimal (non-Bayesian) estimation:
- LMMSE estimation (linear regression)
- LMMSE derivation with LDU decomposition
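
A minimal sketch of ML vs MAP estimation for a multinomial with a conjugate Dirichlet prior, as covered at the start of this lecture; the counts and prior parameters are hypothetical:

    import numpy as np

    n = np.array([12, 5, 3])            # category counts from hypothetical data
    p_ml = n / n.sum()                  # ML estimate: relative frequencies

    # MAP with a conjugate Dirichlet(alpha) prior: mode of the Dirichlet posterior.
    alpha = np.array([2.0, 2.0, 2.0])
    p_map = (n + alpha - 1) / (n.sum() + alpha.sum() - len(n))
    print(p_ml.round(3), p_map.round(3))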


Lec. 9. Estimation
- LMMSE examples
- Generalized linear regression
- Example: polynomial regression
- Sample LMMSE
- Generalized sample LMMSE.
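
A minimal numpy sketch of generalized linear (here polynomial) regression by least squares, in the spirit of the sample LMMSE of this lecture; the synthetic data and true coefficients are invented:

    import numpy as np

    # Fit y = w0 + w1*x + w2*x^2 by least squares on synthetic data.
    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, 200)
    y = 1.0 - 2.0 * x + 0.5 * x**2 + 0.1 * rng.normal(size=200)

    Phi = np.vander(x, N=3, increasing=True)     # feature matrix: columns 1, x, x^2
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # solves min_w ||Phi w - y||^2
    print(w.round(2))                            # ~ [1.0, -2.0, 0.5]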

Lec. 10. Learning
- Supervised learning: introduction
- Generative vs discriminative approaches
- Example: logistic model
- Plug-in learning
- ML fitting of the logistic model: logistic regression (see the sketch after this list)
- Example: handwritten digit recognition.
- Bayesian Learning
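
A minimal Python sketch of ML fitting of the logistic model by gradient ascent on the log-likelihood; the synthetic data, true weights, and learning rate are illustrative choices:

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.c_[np.ones(400), rng.normal(size=(400, 2))]  # bias column + 2 features
    w_true = np.array([-0.5, 2.0, -1.0])                # hypothetical true weights
    y = (rng.uniform(size=400) < 1 / (1 + np.exp(-X @ w_true))).astype(float)

    w = np.zeros(3)
    for _ in range(5000):
        p = 1 / (1 + np.exp(-X @ w))       # logistic model: P(y=1 | x)
        w += 0.5 * X.T @ (y - p) / len(y)  # averaged log-likelihood gradient step
    print(w.round(2))                      # should approach w_true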

Lec. 11. Learning
- Empirical risk minimization
Nonparametric density estimation:
- Parzen window estimator
- kNN estimator
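
A minimal Python sketch of the two nonparametric density estimators of this lecture, evaluated at a single point; the data, bandwidth h, and k are illustrative:

    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(0.0, 1.0, 500)   # samples from an "unknown" density (here N(0,1))
    x = 0.5                            # point at which to estimate the density

    # Parzen window with a Gaussian kernel of bandwidth h.
    h = 0.3
    p_parzen = np.mean(np.exp(-0.5 * ((x - data) / h) ** 2)) / (h * np.sqrt(2 * np.pi))

    # kNN estimate: k / (n * length of the smallest interval holding k neighbors).
    k = 20
    r = np.sort(np.abs(data - x))[k - 1]   # distance to the k-th nearest neighbor
    p_knn = k / (len(data) * 2 * r)
    print(p_parzen, p_knn)                 # both near the true pdf value ~0.352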

Lec. 12. Linear data reduction
- Principal component analysis (PCA)
- Fisher linear classifier
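
A minimal numpy sketch of PCA via the eigendecomposition of the sample covariance; the 3-D Gaussian data are illustrative:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.multivariate_normal(
        [0, 0, 0],
        [[3.0, 1.0, 0.5], [1.0, 2.0, 0.3], [0.5, 0.3, 0.5]],
        size=1000)

    Xc = X - X.mean(axis=0)        # center the data
    C = Xc.T @ Xc / (len(X) - 1)   # sample covariance
    lam, U = np.linalg.eigh(C)     # eigenvalues in ascending order
    order = np.argsort(lam)[::-1]  # sort components by explained variance
    Y = Xc @ U[:, order[:2]]       # project onto the top-2 principal components
    print((lam[order] / lam.sum()).round(3))  # variance fraction per component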

PART II: Applications of ML (Cagnoni)


Syllabus and schedule can be downloaded here.