Population Level Analysis of Convergence of the EM Algorithm for Overspecified Mixtures

Date

2025

Publisher

Nazarbayev University School of Sciences and Humanities

Abstract

This thesis analyzes the convergence properties of the Expectation-Maximization (EM) algorithm when applied to an overspecified Gaussian Mixture Model (GMM). Specifically, it examines the case where a two-component balanced GMM is fitted to data generated from a single Gaussian distribution. A population-level analysis establishes an upper bound of Õ(1/t^2) on the Kullback-Leibler (KL) divergence between the learned and true distributions, where t is the number of EM iterations. These theoretical findings are further validated through empirical experiments. This thesis contributes to a broader collaborative study (see Acknowledgments) titled Convergence of the EM Algorithm in KL Distance for Overspecified Gaussian Mixtures.
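
For illustration, the following is a minimal numerical sketch of the setting described in the abstract, not code from the thesis. It assumes the standard symmetric parametrization in which the balanced mixture 0.5*N(theta, 1) + 0.5*N(-theta, 1) is fitted to data from N(0, 1), so the true parameter is theta = 0; the population EM update theta_{t+1} = E[X * tanh(theta_t * X)] is approximated with Gauss-Hermite quadrature, and the KL divergence between the fitted mixture and the true Gaussian is evaluated on a grid.

```python
# Illustrative sketch of population-level EM for an overspecified mixture.
# Assumed setting: data from N(0, 1), fitted model 0.5*N(theta, 1) + 0.5*N(-theta, 1).
import numpy as np

# Gauss-Hermite quadrature for expectations under X ~ N(0, 1):
# E[f(X)] ≈ sum_i w_i * f(sqrt(2) * x_i) / sqrt(pi)
nodes, weights = np.polynomial.hermite.hermgauss(100)
x_gh = np.sqrt(2.0) * nodes
w_gh = weights / np.sqrt(np.pi)


def population_em_step(theta):
    """One population EM update: theta <- E_{X~N(0,1)}[X * tanh(theta * X)]."""
    return float(np.sum(w_gh * x_gh * np.tanh(theta * x_gh)))


def kl_to_truth(theta, half_width=10.0, n=20001):
    """KL(N(0,1) || 0.5*N(theta,1) + 0.5*N(-theta,1)) by grid integration."""
    x = np.linspace(-half_width, half_width, n)
    dx = x[1] - x[0]
    phi = lambda z: np.exp(-0.5 * z ** 2) / np.sqrt(2.0 * np.pi)
    p = phi(x)                                          # true density
    q = 0.5 * phi(x - theta) + 0.5 * phi(x + theta)     # fitted mixture
    return float(np.sum(p * np.log(p / q)) * dx)


theta = 1.0  # arbitrary initialization away from the true value 0
for t in range(1, 1001):
    theta = population_em_step(theta)
    if t in (10, 100, 1000):
        print(f"t={t:4d}  theta={theta:.4f}  KL={kl_to_truth(theta):.3e}")
```

On this toy instance the printed KL values should shrink roughly in line with the Õ(1/t^2) rate stated in the abstract, while theta itself decays much more slowly, which is why the analysis is phrased in terms of KL divergence rather than parameter error.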

Citation

Pak, Artur. (2025). Population Level Analysis of Convergence of the EM Algorithm for Overspecified Mixtures. Nazarbayev University School of Sciences and Humanities.

Creative Commons license

Except where otherwise noted, this item's license is described as Attribution 3.0 United States.