SylabUZ

Machine learning - course description

General information
Course name Machine learning
Course ID 11.9-WE-INFD-MachLear-Er
Faculty Faculty of Computer Science, Electrical Engineering and Automatics
Field of study Computer Science
Education profile academic
Level of studies Second-cycle Erasmus programme
Beginning semester winter term 2022/2023
Course information
Semester 1
ECTS credits 5
Course type obligatory
Teaching language English
Author of syllabus
  • prof. dr hab. inż. Dariusz Uciński
Class forms
Class form | Hours per semester (full-time) | Hours per week (full-time) | Hours per semester (part-time) | Hours per week (part-time) | Form of assessment
Lecture | 30 | 2 | - | - | Exam
Laboratory | 30 | 2 | - | - | Credit with grade

Aim of the course

  • Familiarize students with the concept of machine learning and its applications to the analysis of large data sets included in social media, ERP systems and modern e-business applications.
  • Teach students how to select the appropriate data analysis techniques depending on the scale of the problem under consideration and the type of analysis (real-time, batch mode, data stream processing).
  • Teach students to work with modern programming languages and platforms directed towards machine learning, such as Python, R and JMP.

Prerequisites

  • Fundamentals of probability and engineering statistics
  • Basics of numerical methods
  • Basic programming skills in Python

Scope

Linear classification methods: supervised classification; linear discriminant analysis; discrimination based on linear regression and logistic regression; model diagnostics.
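
The logistic-regression approach to linear discrimination can be sketched in a few lines of NumPy; the toy data, learning rate and iteration count below are illustrative assumptions, not course material:

```python
import numpy as np

rng = np.random.default_rng(0)
# One-feature toy data: two classes centred at -2 and +2
X = np.concatenate([rng.normal(-2, 1, 50), rng.normal(2, 1, 50)]).reshape(-1, 1)
y = np.array([0] * 50 + [1] * 50)

# Logistic regression fitted by gradient ascent on the mean log-likelihood
Xb = np.hstack([np.ones((len(X), 1)), X])     # add an intercept column
w = np.zeros(2)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-Xb @ w))         # P(y = 1 | x)
    w += 0.1 * Xb.T @ (y - p) / len(y)        # log-likelihood gradient step

# Classify by thresholding the posterior probability at 0.5
pred = (1.0 / (1.0 + np.exp(-Xb @ w)) >= 0.5).astype(int)
accuracy = (pred == y).mean()
```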

Classification based on probability distributions: Bayesian classifier and maximum likelihood; optimality of the Bayes rule; practical synthesis of classifiers.
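
A minimal sketch of the Bayes rule with maximum-likelihood density estimates, assuming univariate Gaussian class-conditional densities; the data and the uniform prior are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
# Samples from two classes with Gaussian class-conditional densities
x0 = rng.normal(0.0, 1.0, 200)     # class 0
x1 = rng.normal(3.0, 1.0, 200)     # class 1

# Maximum-likelihood parameter estimates for each class density
mu0, s0 = x0.mean(), x0.std()
mu1, s1 = x1.mean(), x1.std()

def log_gauss(x, mu, s):
    """Log-density of a univariate normal distribution."""
    return -0.5 * np.log(2 * np.pi * s ** 2) - (x - mu) ** 2 / (2 * s ** 2)

def bayes_classify(x, prior1=0.5):
    """Bayes rule: assign x to the class with the larger posterior."""
    post0 = np.log(1.0 - prior1) + log_gauss(x, mu0, s0)
    post1 = np.log(prior1) + log_gauss(x, mu1, s1)
    return int(post1 > post0)
```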

Classification based on nonparametric estimation of probability distributions: estimation of distributions within classes; nearest neighbor rule.
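
The nearest neighbor rule amounts to a majority vote among the closest training points; a minimal NumPy sketch, with toy data assumed for illustration:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """k-nearest-neighbor rule: majority vote among the k closest points."""
    d = np.linalg.norm(X_train - x, axis=1)   # Euclidean distances to x
    nearest = np.argsort(d)[:k]               # indices of the k nearest
    votes = np.bincount(y_train[nearest])
    return int(votes.argmax())

# Four labelled training points in two classes
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])
```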

Decision trees and families of classifiers: partition rules; pruning rules; bagging and boosting algorithms; random forests.
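
Bagging can be illustrated with depth-1 threshold classifiers (decision stumps) standing in for full trees; the toy data and the number of bootstrap rounds are assumptions for demonstration:

```python
import numpy as np

rng = np.random.default_rng(6)
# 1-D toy data: class 1 tends to have larger x
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(2, 1, 100)])
y = np.array([0] * 100 + [1] * 100)

def fit_stump(x, y):
    """Best single-threshold classifier (a depth-1 decision tree)."""
    best_t, best_acc = None, -1.0
    for t in np.unique(x):
        acc = ((x > t) == y).mean()
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

# Bagging: fit stumps on bootstrap resamples, predict by majority vote
thresholds = []
for _ in range(25):
    idx = rng.choice(len(x), len(x), replace=True)   # bootstrap sample
    thresholds.append(fit_stump(x[idx], y[idx]))

def bagged_predict(xq):
    votes = np.array([xq > t for t in thresholds]).mean(axis=0)
    return (votes >= 0.5).astype(int)

accuracy = (bagged_predict(x) == y).mean()
```

Random forests extend this idea by also randomizing the features considered at each split.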

Regression analysis: global parametric models; nonparametric regression; random effects and mixed linear models.
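
A global parametric model in its simplest form is ordinary least squares, which has a closed-form solution; a short sketch on assumed noisy linear toy data:

```python
import numpy as np

rng = np.random.default_rng(2)
# Noisy linear data: true slope 2, true intercept 1
x = np.linspace(0, 10, 100)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, 100)

# Ordinary least squares via the design matrix with an intercept column
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, slope = beta
```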

Generalizations of linear methods: elastic discrimination; support vector machines.
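
A linear support vector machine can be sketched as subgradient descent on the regularized hinge loss; the kernel machinery and dedicated solvers used in practice are omitted, and the data and step sizes are assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)
# Two linearly separable blobs, labels in {-1, +1}
X = np.vstack([rng.normal(-2, 0.5, (50, 2)), rng.normal(2, 0.5, (50, 2))])
y = np.array([-1] * 50 + [1] * 50)

# Subgradient descent on the L2-regularized hinge loss
w, b = np.zeros(2), 0.0
lam, lr = 0.01, 0.1
for _ in range(500):
    mask = y * (X @ w + b) < 1                               # margin violators
    grad_w = lam * w - (y[mask, None] * X[mask]).sum(axis=0) / len(X)
    grad_b = -y[mask].sum() / len(X)
    w -= lr * grad_w
    b -= lr * grad_b

accuracy = (np.sign(X @ w + b) == y).mean()
```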

Projection methods and detection of hidden variables: unsupervised learning systems; principal component analysis; factor analysis; multidimensional scaling.
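
Principal component analysis reduces to an eigendecomposition of the sample covariance matrix; a minimal sketch on assumed correlated toy data:

```python
import numpy as np

rng = np.random.default_rng(3)
# Correlated 2-D data: most variance lies along a single hidden direction
z = rng.normal(0, 1, (200, 1))
X = np.hstack([z, 0.9 * z]) + rng.normal(0, 0.1, (200, 2))

# PCA: eigendecomposition of the sample covariance of centered data
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(Xc) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)        # eigh returns ascending order
order = eigvals.argsort()[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals[0] / eigvals.sum()        # variance share of PC1
scores = Xc @ eigvecs[:, :1]                  # projection onto PC1
```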

Cluster analysis: combinatorial methods; hierarchical methods.
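
A classic combinatorial method is k-means, which alternates nearest-centroid assignment and centroid updates; a minimal sketch, with the toy data and initialization scheme assumed for illustration:

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means: alternate nearest-centroid assignment and mean update."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)                 # assign to nearest centroid
        centers = np.array([X[labels == j].mean(axis=0)
                            if (labels == j).any() else centers[j]
                            for j in range(k)])   # keep old center if empty
    return labels, centers

rng = np.random.default_rng(4)
# Two well-separated blobs
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
labels, centers = kmeans(X, 2)
```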

Deep learning: feedforward deep networks; regularization; convolutional networks; recurrent networks.
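
A feedforward network with one hidden layer, trained by backpropagation, can be sketched directly in NumPy; the XOR data, architecture and hyperparameters below are illustrative assumptions:

```python
import numpy as np

# One-hidden-layer feedforward network trained by backpropagation on XOR
rng = np.random.default_rng(5)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0.0], [1.0], [1.0], [0.0]])

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)                  # hidden layer
    out = sigmoid(h @ W2 + b2)                # output probability
    d_out = out - y                           # cross-entropy gradient at output
    d_h = (d_out @ W2.T) * (1.0 - h ** 2)     # backpropagate through tanh
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0)

out = np.clip(out, 1e-9, 1 - 1e-9)
loss = -np.mean(y * np.log(out) + (1 - y) * np.log(1 - out))
```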

Teaching methods

conventional lecture, discussion, laboratory classes

Learning outcomes and methods of their verification

Outcome description Outcome symbols Methods of verification The class form

Assignment conditions

  • Lecture - the passing criterion is a passing mark on the final exam.
  • Laboratory - the passing criterion is positive marks for all laboratory exercises.
  • Final mark components = lecture: 50% + laboratory: 50%

Recommended reading

  1. Gareth James, Daniela Witten, Trevor Hastie, Robert Tibshirani: An Introduction to Statistical Learning with Applications in R, Springer, 2013
  2. Trevor Hastie, Robert Tibshirani, Jerome Friedman: The Elements of Statistical Learning. Data Mining, Inference, and Prediction. Second Edition, Springer, 2009
  3. Ian Goodfellow, Yoshua Bengio, Aaron Courville: Deep Learning, MIT Press, 2016
  4. Brian Steele, John Chandler, Swarna Reddy: Algorithms for Data Science, Springer, 2016

Further reading


Notes


Modified by prof. dr hab. inż. Dariusz Uciński (last modification: 20-04-2022 16:31)