Essentially, a Gaussian mixture model is a way to combine several Gaussian PDFs (Probability Density Functions) of different shapes and sizes into a single distribution. This is done by forming a linear combination of the individual Gaussian PDFs.
Expectation-Maximization (EM) is a procedure that allows us to learn the parameters of the Gaussian mixture model, refining them over several iterations. The Expectation step (E-step) holds fixed the mean μc, covariance Σc, and mixing weight πc of each Gaussian component c. The assignment probability, ric, that each data point belongs to cluster c is then calculated.
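The E-step above can be sketched in a few lines of NumPy. This is an illustrative 1-D toy example with assumed means, variances, and mixing weights; with the parameters held fixed, it computes the responsibility ric of each component c for each data point i:

```python
import numpy as np

def gaussian_pdf(x, mu, var):
    # 1-D Gaussian density evaluated at each point in x
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

def e_step(x, mus, variances, pis):
    # weighted likelihood of each point under each component, shape (n, k)
    weighted = np.array([pi * gaussian_pdf(x, mu, var)
                         for pi, mu, var in zip(pis, mus, variances)]).T
    # normalize across components so each row of responsibilities sums to 1
    return weighted / weighted.sum(axis=1, keepdims=True)

x = np.array([0.0, 1.0, 5.0])
r = e_step(x, mus=[0.0, 5.0], variances=[1.0, 1.0], pis=[0.5, 0.5])
print(r.round(3))
```

Each row of `r` sums to 1: a point sitting right on one component's mean is assigned to it with near certainty, while a point between the two means gets split responsibilities.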
In this example, the data point x is more likely to belong to distribution 2 (a 66% chance) than to distribution 1 (a 33% chance).
Using Expectation-Maximization, the model learns the parameters over time, refining the distributions even when the data points are drawn from overlapping distributions.
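A minimal sketch of the full EM loop, alternating E- and M-steps on assumed synthetic 1-D data from two overlapping Gaussians (the true means of -2 and 3, the sample sizes, and the iteration count are all illustrative choices, not from the original post):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2.0, 1.0, 300),   # component 1
                    rng.normal(3.0, 1.0, 300)])   # component 2, overlapping tails

# crude initial guesses for the means, variances, and mixing weights
mus = np.array([-1.0, 1.0])
variances = np.array([1.0, 1.0])
pis = np.array([0.5, 0.5])

for _ in range(50):
    # E-step: responsibilities r_ic, holding the parameters fixed
    dens = (np.exp(-(x[:, None] - mus) ** 2 / (2 * variances))
            / np.sqrt(2 * np.pi * variances))
    r = pis * dens
    r /= r.sum(axis=1, keepdims=True)

    # M-step: re-estimate each parameter from the responsibilities
    n_c = r.sum(axis=0)                                   # effective cluster sizes
    mus = (r * x[:, None]).sum(axis=0) / n_c              # weighted means
    variances = (r * (x[:, None] - mus) ** 2).sum(axis=0) / n_c
    pis = n_c / len(x)                                    # mixing weights

print(mus.round(2), pis.round(2))
```

After the loop, the estimated means land near the true values of -2 and 3, even though the two distributions overlap, which is exactly the refinement described above.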
-----------------------------------
Links to videos:
https://www.youtube.com/watch?v=Rkl30Fr2S38
https://www.youtube.com/watch?v=qMTuMa86NzU