# FOUNDATIONS OF MACHINE LEARNING

# Introduction

The models that we have covered to this point are *discriminative* models. These models learn to infer the target $y$ from the input $\mathbf{x}$, without knowing anything about the distribution over the input. From a probabilistic perspective, a discriminative model infers a conditional distribution $p(y\mid \mathbf{x})$, which is used to discriminate between classes given the input. In contrast, the models that we will cover in this section infer the joint distribution $p(\mathbf{x}, y)$. Given such a model it is (at least conceptually) possible to generate new data instances by sampling from the joint distribution, and these models are therefore referred to as *generative*. This section will introduce both supervised and *unsupervised* learning algorithms. In the latter case, we do not provide the model with observed target values during training. Instead, patterns and structures are inferred from the input data itself.
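As a minimal illustration of the generative perspective, the sketch below (using hypothetical one-dimensional data, not an example from the course book) fits a Gaussian class-conditional $p(\mathbf{x}\mid y)$ and a prior $p(y)$ for each of two classes. The joint $p(\mathbf{x}, y) = p(\mathbf{x}\mid y)\,p(y)$ can then be used both to classify, via Bayes' rule, and to generate new data by sampling $y$ first and then $\mathbf{x}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D data: two classes with Gaussian class-conditionals.
x0 = rng.normal(-2.0, 1.0, size=500)   # class 0
x1 = rng.normal(+2.0, 1.0, size=500)   # class 1

# Generative modelling: estimate the joint p(x, y) = p(x | y) p(y).
mu = np.array([x0.mean(), x1.mean()])
sigma = np.array([x0.std(), x1.std()])
prior = np.array([0.5, 0.5])

def gaussian_pdf(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

def predict(x):
    # Bayes' rule: p(y | x) is proportional to p(x | y) p(y),
    # so the most probable class maximizes the joint density.
    joint = prior * gaussian_pdf(x, mu, sigma)
    return int(np.argmax(joint))

# Generation: sample y from p(y), then x from p(x | y).
y_new = rng.choice(2, p=prior)
x_new = rng.normal(mu[y_new], sigma[y_new])

print(predict(-3.0), predict(3.0))  # prints: 0 1
```

A discriminative model such as logistic regression would instead fit $p(y\mid \mathbf{x})$ directly and could classify, but not generate, new inputs.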

Before moving on to the examples, read chapter 10 in the course book, which covers generative models. Chapter 10.1 introduces Gaussian Mixture Models and classification with discriminant analysis. Chapter 10.2 describes some commonly used unsupervised algorithms for grouping data, also referred to as *clustering* algorithms. In chapter 10.3, you will learn how deep neural networks can be used as a basis for so-called deep generative models. Finally, chapter 10.4 is about representation learning and how you can reduce the dimensionality of your data while keeping it useful for your machine learning algorithm.

The first example in this section is about classifying handwritten digits using Gaussian Mixture Models and Principal Component Analysis. You then have a choice between sections diving deeper into deep generative models (alternative A) or generative language models (alternative B). Only one of these sections has to be completed, but you are of course welcome to work through both.

The section about deep generative models will guide you through the fundamental ideas underlying this model family and discuss some applications and recent innovations. Generative language models is a topic that has gained much attention lately through the launch of systems like ChatGPT. If you go through this section you will gain a better understanding of how these systems work, and how they relate to other machine learning techniques in this course.

This webpage contains the course materials for the course ETE370 Foundations of Machine Learning.

The content is licensed under Creative Commons Attribution 4.0 International.

Copyright © 2021, Joel Oskarsson, Amanda Olmin & Fredrik Lindsten