Discriminative Model
Introduction
Definition: Learn a probability distribution p(y|x) — the probability of label y given input x
Example: Given a cat image x, we want to know the probability that this image maps to each label y
p(y|x) is the probability density function (pdf) over labels
Functions
- Assign labels to data
- Feature Learning: create feature vector (with labels)
Problem: Can’t reject unreasonable input
For any input image, the model must output a pdf over its known labels. So even when we feed an unreasonable input, such as a monkey image, into a model trained without a monkey label, the model still has to give an output
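A minimal sketch of this failure mode, using a softmax over hypothetical raw scores (the label set and score values below are assumptions for illustration): no matter what the input is, the output is always a valid distribution over the known labels, with no "none of the above" option.

```python
import numpy as np

def softmax(scores):
    """Turn raw scores into a probability distribution over labels."""
    e = np.exp(scores - scores.max())
    return e / e.sum()

labels = ["cat", "dog", "car"]  # no "monkey" label in this model

# Hypothetical raw scores the model might produce for a monkey image.
scores_for_monkey_image = np.array([1.2, 0.9, -0.3])

probs = softmax(scores_for_monkey_image)
# The probabilities always sum to 1, even for nonsense input,
# so the model is forced to "pick" among cat/dog/car.
assert np.isclose(probs.sum(), 1.0)
```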
(Unconditional) Generative Model
Introduction
Definition: Learn a probability distribution p(x) over inputs x
Example: We input a cat image x, and the model tells us how likely this pixel pattern is among all the images it was trained on
Functions
- Detect Outliers: In manufacturing, we don’t know what causes the defects, so we train a generative model to find outliers
- Feature Learning (without labels)
- Generate New Data: feed in a random noise image to create realistic output
Generative models can reject unreasonable input
When the user inputs a photo that is random or never appears in the training data, the model can simply assign it a very low probability, telling us it has never seen this kind of pixel pattern before
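One way to sketch this rejection behavior, assuming a trivial stand-in density model (an independent Gaussian fit per feature dimension; the threshold value is an assumption and would be tuned on validation data in practice): inputs far from the training distribution get very low log p(x) and are flagged as outliers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "training images": feature vectors drawn from the data distribution.
train = rng.normal(loc=0.0, scale=1.0, size=(1000, 2))

# A trivial density model: fit an independent Gaussian per dimension.
mu, sigma = train.mean(axis=0), train.std(axis=0)

def log_px(x):
    """Log-density of x under the fitted Gaussian model."""
    z = (x - mu) / sigma
    return float(-0.5 * np.sum(z**2 + np.log(2 * np.pi * sigma**2)))

threshold = -10.0  # assumed cutoff for "too unlikely"

normal_input = np.zeros(2)        # looks like the training data
outlier_input = np.full(2, 8.0)   # far from anything seen in training

assert log_px(normal_input) > threshold   # accepted
assert log_px(outlier_input) < threshold  # rejected: never seen this pattern
```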
Conditional Generative Model
Introduction
Definition: Learn a probability distribution p(x|y)
Example: Given that we want to generate a “cat” (y = cat), what’s the probability of producing a specific pixel configuration x during the generation process
Functions
- Assign labels while rejecting outliers:
- For input x, compute p(x|y) for every label y
- If every p(x|y) is too small, reject the input
- else, assign the label y with the highest probability
- Generate New Data: Given random noise and a desired label, create realistic output
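The labeling-with-rejection procedure above can be sketched as follows, using hypothetical 1-D Gaussian densities as stand-ins for the learned p(x|y) (the class parameters and rejection threshold are assumptions for illustration):

```python
import numpy as np

# Hypothetical per-label densities p(x|y): 1-D Gaussians, (mean, std) per class.
class_params = {"cat": (0.0, 1.0), "dog": (5.0, 1.0)}

def p_x_given_y(x, label):
    """Gaussian density of input x under the model for `label`."""
    mu, sigma = class_params[label]
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def classify(x, reject_threshold=1e-4):
    """Return the most likely label, or None if every p(x|y) is too small."""
    probs = {y: p_x_given_y(x, y) for y in class_params}
    best = max(probs, key=probs.get)
    if probs[best] < reject_threshold:  # every p(x|y) too small: reject
        return None
    return best

assert classify(0.1) == "cat"
assert classify(5.2) == "dog"
assert classify(50.0) is None  # unreasonable input is rejected
```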
It can also reject unreasonable pixel patterns
We can assign a low probability to an unreasonable pixel pattern under every label
Relation
By Bayes’ Rule: p(x|y) = p(y|x) p(x) / p(y)
- p(x|y): Conditional Generative Model
- p(y|x): Discriminative Model
- p(x): (Unconditional) Generative Model
- p(y): The prior probability of each label in our dataset
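A toy numeric check of this relation, with all probability values assumed purely for illustration:

```python
# Bayes' rule: p(x|y) = p(y|x) * p(x) / p(y), with assumed toy values.
p_y_given_x = 0.9   # discriminative: P(label = cat | this image)
p_x = 0.001         # unconditional generative: P(this pixel pattern)
p_y = 0.3           # prior: fraction of "cat" labels in the dataset

p_x_given_y = p_y_given_x * p_x / p_y  # conditional generative model
assert abs(p_x_given_y - 0.003) < 1e-9
```

So a conditional generative model can, in principle, be assembled from a discriminative model, an unconditional generative model, and the label prior.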