EM Algorithm for Gaussian Mixture
A Detailed Example of the EM Algorithm for a Gaussian Mixture Model
Given a mixture model in which every component is Gaussian, the observation density is defined as:

$$p(x \mid \theta) = \sum_{j=1}^{N} \pi_j \, \phi(x; \mu_j, \sigma_j), \qquad \phi(x; \mu_j, \sigma_j) = \frac{1}{\sqrt{2\pi}\,\sigma_j} \exp\!\left( -\frac{(x - \mu_j)^2}{2\sigma_j^2} \right)$$
The parameters $\theta_{\mathrm{pr}} = (\pi_1, \ldots, \pi_N)$ are the mixing probabilities and $\theta_{\mathrm{obs}} = (\mu_1, \sigma_1, \ldots, \mu_N, \sigma_N)$ are the parameters of the normal density functions.
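To make the definition concrete, here is a minimal sketch in Python (assuming univariate components; the names `gaussian_pdf` and `mixture_density` are ours, not from the text):

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Univariate normal density phi(x; mu, sigma)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def mixture_density(x, pis, mus, sigmas):
    """Mixture density p(x | theta) = sum_j pi_j * phi(x; mu_j, sigma_j)."""
    return sum(pi * gaussian_pdf(x, mu, sigma)
               for pi, mu, sigma in zip(pis, mus, sigmas))
```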
Writing $\gamma_{ij} = P(z_i = j \mid x_i, \theta^{(t)})$ for the posterior probability (responsibility) that observation $x_i$ came from component $j$,

$$\gamma_{ij} = \frac{\pi_j \, \phi(x_i; \mu_j, \sigma_j)}{\sum_{k=1}^{N} \pi_k \, \phi(x_i; \mu_k, \sigma_k)},$$

the expected complete-data log-likelihood can now be defined as:

$$Q(\theta; \theta^{(t)}) = \sum_{i=1}^{n} \sum_{j=1}^{N} \gamma_{ij} \left[ \log \pi_j + \log \phi(x_i; \mu_j, \sigma_j) \right]$$
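A sketch of the corresponding E-step, reusing `gaussian_pdf` from above (the function names `e_step` and `expected_loglik` are hypothetical; `gamma` holds the responsibilities $\gamma_{ij}$):

```python
def e_step(x, pis, mus, sigmas):
    """E-step: responsibilities gamma[i, j] = P(z_i = j | x_i, theta)."""
    x = np.asarray(x, dtype=float)
    # Weighted component densities pi_j * phi(x_i; mu_j, sigma_j), shape (n, N).
    weighted = np.stack([pi * gaussian_pdf(x, mu, sigma)
                         for pi, mu, sigma in zip(pis, mus, sigmas)], axis=1)
    return weighted / weighted.sum(axis=1, keepdims=True)

def expected_loglik(x, gamma, pis, mus, sigmas):
    """Q(theta; theta_t) = sum_i sum_j gamma_ij [log pi_j + log phi(x_i; mu_j, sigma_j)]."""
    x = np.asarray(x, dtype=float)
    log_terms = np.stack([np.log(pi) + np.log(gaussian_pdf(x, mu, sigma))
                          for pi, mu, sigma in zip(pis, mus, sigmas)], axis=1)
    return float((gamma * log_terms).sum())
```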
Recall that during the maximization step we need to find the parameters that maximize the expected complete-data log-likelihood, so we take partial derivatives and set them to zero. For the component means and standard deviations:

$$\frac{\partial Q}{\partial \mu_j} = \sum_{i=1}^{n} \gamma_{ij} \, \frac{x_i - \mu_j}{\sigma_j^2} = 0, \qquad \frac{\partial Q}{\partial \sigma_j} = \sum_{i=1}^{n} \gamma_{ij} \left( \frac{(x_i - \mu_j)^2}{\sigma_j^3} - \frac{1}{\sigma_j} \right) = 0$$
For the mixing probabilities the process is a bit more involved: they must sum to 1, so we cannot differentiate with respect to each $\pi_j$ independently. We can re-express the component probabilities in multinomial logit form, $\pi_j = e^{\eta_j} / \sum_{k=1}^{N} e^{\eta_k}$, which builds the constraint $\sum_j \pi_j = 1$ into the parameterization. Differentiating with respect to the unconstrained $\eta_j$ and using $\sum_j \gamma_{ij} = 1$ gives

$$\frac{\partial Q}{\partial \eta_j} = \sum_{i=1}^{n} \left( \gamma_{ij} - \pi_j \right) = 0$$
Finally we can solve for all of them and obtain the following estimators:

$$\pi_j^{(t+1)} = \frac{1}{n} \sum_{i=1}^{n} \gamma_{ij}, \qquad \mu_j^{(t+1)} = \frac{\sum_{i=1}^{n} \gamma_{ij} \, x_i}{\sum_{i=1}^{n} \gamma_{ij}}, \qquad \left( \sigma_j^{(t+1)} \right)^2 = \frac{\sum_{i=1}^{n} \gamma_{ij} \left( x_i - \mu_j^{(t+1)} \right)^2}{\sum_{i=1}^{n} \gamma_{ij}}$$
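These closed-form updates translate directly into an M-step, and alternating the two steps gives the full algorithm. A sketch continuing the functions above (`m_step`, `fit_gmm`, and the synthetic-data demo are ours, not from the text):

```python
def m_step(x, gamma):
    """M-step: the closed-form estimators derived above."""
    x = np.asarray(x, dtype=float)
    nk = gamma.sum(axis=0)                          # effective count per component
    pis = nk / len(x)                               # pi_j = (1/n) sum_i gamma_ij
    mus = (gamma * x[:, None]).sum(axis=0) / nk     # responsibility-weighted means
    variances = (gamma * (x[:, None] - mus) ** 2).sum(axis=0) / nk
    return pis, mus, np.sqrt(variances)

def fit_gmm(x, pis, mus, sigmas, n_iter=100):
    """Alternate E- and M-steps from the given initial parameters."""
    for _ in range(n_iter):
        gamma = e_step(x, pis, mus, sigmas)
        pis, mus, sigmas = m_step(x, gamma)
    return pis, mus, sigmas

# Example: recover two well-separated components from synthetic data.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2.0, 0.5, 300), rng.normal(3.0, 1.0, 700)])
print(fit_gmm(x, pis=[0.5, 0.5], mus=[-1.0, 1.0], sigmas=[1.0, 1.0]))
```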