OPTIMIZATION OF SMALL-SIGNAL SCALABLE MODELS OF GAN HEMTS USING ML TECHNIQUES


Publisher

Nazarbayev University School of Engineering and Digital Sciences

Abstract

The optimization of small-signal scalable models for GaN High Electron Mobility Transistors (HEMTs) is crucial for their efficient application in high-power, high-frequency electronic systems. This project applies Machine Learning (ML) techniques, specifically AdaBoost, Random Forest, and Artificial Neural Network algorithms, to enhance the accuracy and reliability of transistor models. Sixteen models are developed to capture transistor behavior from measurement data. Performance metrics such as Mean Absolute Error (MAE), Mean Squared Error (MSE), and the R² score demonstrate the models' effectiveness, showing high predictive accuracy and strong generalization across varying operating conditions. The results highlight the potential of ML to surpass conventional modeling methods by overcoming their limitations and providing scalable, robust solutions for the optimization of GaN HEMTs. In addition, this work can serve as a foundation for the further integration of advanced data-driven techniques in semiconductor device modeling.
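The workflow the abstract describes — fitting ensemble regressors to measurement data and scoring them with MAE, MSE, and R² — can be sketched as follows. This is a minimal illustration, not the thesis implementation: the synthetic features (standing in for normalized bias and geometry quantities) and the target function are assumptions, and only two of the three named algorithms are shown.

```python
# Hedged sketch: train the ensemble regressors named in the abstract on
# synthetic stand-in data and report the same metrics (MAE, MSE, R^2).
# Feature meanings are illustrative assumptions, not the actual thesis data.
import numpy as np
from sklearn.ensemble import AdaBoostRegressor, RandomForestRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic stand-in for measured inputs, e.g. normalized Vgs, Vds, gate width.
X = rng.uniform(0.0, 1.0, size=(500, 3))
# Smooth target plus small noise, standing in for a small-signal parameter.
y = 2.0 * X[:, 0] - X[:, 1] ** 2 + 0.5 * X[:, 2] + rng.normal(0.0, 0.05, 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "RandomForest": RandomForestRegressor(n_estimators=200, random_state=0),
    "AdaBoost": AdaBoostRegressor(n_estimators=200, random_state=0),
}

scores = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    scores[name] = {
        "MAE": mean_absolute_error(y_te, pred),
        "MSE": mean_squared_error(y_te, pred),
        "R2": r2_score(y_te, pred),
    }
```

In a setup like this, each of the sixteen models would be fitted and scored the same way, with the metric dictionary allowing a direct comparison of predictive accuracy across algorithms and operating conditions.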

Citation

Kurmangali, A. (2025). "Optimization of small-signal scalable models of GaN HEMTs using ML techniques". Nazarbayev University School of Engineering and Digital Sciences.


Creative Commons license

Except where otherwise noted, this item's license is described as Attribution-NonCommercial 3.0 United States.