Population Level Analysis of Convergence of the EM Algorithm for Overspecified Mixtures
| dc.contributor.author | Pak, Artur | |
| dc.date.accessioned | 2025-05-20T05:20:23Z | |
| dc.date.available | 2025-05-20T05:20:23Z | |
| dc.date.issued | 2025 | |
| dc.description.abstract | This thesis analyzes the convergence properties of the Expectation-Maximization (EM) algorithm when applied to an overspecified Gaussian Mixture Model (GMM). Specifically, it examines the case where a two-component balanced GMM is fitted to data generated from a single Gaussian distribution. A population-level analysis establishes an upper bound of Õ(1/t^2) on the Kullback-Leibler (KL) divergence between the learned and true distributions, where t is the number of steps of the EM algorithm. These theoretical findings are further validated through empirical experiments. This thesis contributes to a broader collaborative study (see Acknowledgments) titled Convergence of the EM Algorithm in KL Distance for Overspecified Gaussian Mixtures. | |
| dc.identifier.citation | Pak, Artur. (2025). Population Level Analysis of Convergence of the EM Algorithm for Overspecified Mixtures. Nazarbayev University School of Sciences and Humanities. | |
| dc.identifier.uri | https://nur.nu.edu.kz/handle/123456789/8542 | |
| dc.language.iso | en | |
| dc.publisher | Nazarbayev University School of Sciences and Humanities | |
| dc.rights | Attribution 3.0 United States | en |
| dc.rights.uri | http://creativecommons.org/licenses/by/3.0/us/ | |
| dc.subject | Type of access: Open access | |
| dc.subject | EM algorithm | |
| dc.subject | KL divergence | |
| dc.subject | Expectation-Maximization | |
| dc.subject | Kullback-Leibler | |
| dc.subject | GMM | |
| dc.subject | Gaussian Mixture Model | |
| dc.subject | Overspecified models | |
| dc.title | Population Level Analysis of Convergence of the EM Algorithm for Overspecified Mixtures | |
| dc.type | Master's thesis | |
Files
Original bundle
- Name: Population Level Analysis of Convergence of the EM Algorithm for Overspecified Mixtures.pdf
- Size: 3.25 MB
- Format: Adobe Portable Document Format
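For readers who want to get a feel for the setting described in the abstract, below is a minimal Python sketch (not taken from the thesis) of population EM in the symmetric case: fitting the balanced mixture 0.5·N(θ, 1) + 0.5·N(−θ, 1) to data drawn from a single standard Gaussian N(0, 1). Under this standard symmetric parameterization with unit variances (the thesis's exact setup may differ), the population EM recursion reduces to θ_{t+1} = E_{X∼N(0,1)}[X·tanh(θ_t·X)], and both the update and the KL divergence to the true distribution can be evaluated by numerical quadrature. All function names here are illustrative.

```python
import numpy as np
from scipy.integrate import quad

# Population EM for fitting the balanced two-component GMM
#   0.5 * N(theta, 1) + 0.5 * N(-theta, 1)
# to data from a single standard Gaussian N(0, 1).
# In this symmetric setting the population EM update reduces to
#   theta_{t+1} = E_{X ~ N(0,1)}[ X * tanh(theta_t * X) ].

def std_normal_pdf(x):
    return np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)

def em_update(theta):
    # One population EM step, evaluated by numerical quadrature
    # against the true distribution N(0, 1).
    integrand = lambda x: x * np.tanh(theta * x) * std_normal_pdf(x)
    val, _ = quad(integrand, -10.0, 10.0)
    return val

def mixture_pdf(x, theta):
    return 0.5 * (std_normal_pdf(x - theta) + std_normal_pdf(x + theta))

def kl_to_truth(theta):
    # KL( N(0,1) || 0.5 N(theta,1) + 0.5 N(-theta,1) ) by quadrature.
    integrand = lambda x: std_normal_pdf(x) * (
        np.log(std_normal_pdf(x)) - np.log(mixture_pdf(x, theta))
    )
    val, _ = quad(integrand, -10.0, 10.0)
    return val

theta = 1.0  # arbitrary nonzero initialization (illustrative choice)
for t in range(1, 201):
    theta = em_update(theta)
    if t in (1, 10, 50, 100, 200):
        kl = kl_to_truth(theta)
        # If KL decays like 1/t^2 up to log factors, as the thesis's
        # bound suggests, then t^2 * KL should stay roughly bounded.
        print(f"t={t:4d}  theta={theta:.4f}  KL={kl:.3e}  t^2*KL={t**2 * kl:.3f}")
```

As a sanity check on the rate: the fitted mixture has mean 0 and variance 1 + θ², so for small θ the KL divergence to N(0, 1) scales like θ⁴; since θ_t is known to decay on the order of t^{−1/2} in this overspecified regime, this is consistent with the Õ(1/t²) bound stated in the abstract.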