MICROWAVE TRANSISTOR MODELING USING MACHINE LEARNING TECHNIQUES

Publisher

Nazarbayev University School of Engineering and Digital Sciences

Abstract

This research assesses machine learning methods for modeling GaN HEMTs, focusing on interpolation and extrapolation of device behavior in high-frequency circuits. S-parameter prediction from GaN HEMT data is evaluated by comparing Artificial Neural Networks (ANN) with tree-based models (Extra Trees, ET; Random Forests, RF). The methodology involves splitting the data by VDS bias and comparing validation strategies (an 80/20 hold-out split versus 5-fold cross-validation) and several hyperparameter optimizers. Results show that the 80/20 split is significantly faster (∼9×) while matching or exceeding the performance of 5-fold cross-validation. ET models generally outperformed RF. ET tuned with a genetic algorithm (GA) performed best for interpolation, whereas ANN tuned with Optuna showed superior stability and robustness in extrapolation. The tree-based models are also simpler and much faster to train and tune (ET GA tuning was ∼14× faster than ANN Optuna). The study confirms the trade-offs between model types and validation strategies for accurate and efficient microwave transistor modeling.
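
A minimal, hypothetical sketch of the comparison described in the abstract, assuming scikit-learn's ExtraTreesRegressor and RandomForestRegressor and using synthetic placeholder inputs (gate bias, drain bias, frequency) and a placeholder single real-valued S-parameter target in place of the thesis's GaN HEMT measurements; the actual implementation, features, and tuning procedure are not part of this record:

# Sketch only: compares an 80/20 hold-out split with 5-fold cross-validation
# for two tree-based regressors, mirroring the validation comparison described
# in the abstract. All data below is synthetic and stands in for measured
# GaN HEMT S-parameters.
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor, RandomForestRegressor
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.uniform(-4.0, 0.0, n),   # placeholder gate bias VGS (V)
    rng.uniform(5.0, 28.0, n),   # placeholder drain bias VDS (V)
    rng.uniform(0.5, 40.0, n),   # placeholder frequency (GHz)
])
# Placeholder target imitating one real-valued S-parameter component.
y = np.sin(X[:, 2] / 5.0) * np.exp(-X[:, 0]) + 0.05 * rng.standard_normal(n)

models = {
    "ExtraTrees": ExtraTreesRegressor(n_estimators=300, random_state=0),
    "RandomForest": RandomForestRegressor(n_estimators=300, random_state=0),
}

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
for name, model in models.items():
    # 80/20 hold-out evaluation (the faster option reported in the abstract).
    model.fit(X_tr, y_tr)
    holdout_r2 = r2_score(y_te, model.predict(X_te))
    # 5-fold cross-validation on the full data for comparison.
    cv_r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: hold-out R2={holdout_r2:.3f}, 5-fold CV R2={cv_r2:.3f}")

The hyperparameter optimizers mentioned in the abstract (a genetic algorithm for ET, Optuna for the ANN) and the ANN baseline itself would sit on top of this evaluation loop; they are omitted here because the thesis code is not included in this record.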

Citation

Ibragim, R. (2025). Microwave Transistor Modeling Using Machine Learning Techniques: An Analysis of Cross-Validation, Optimizer Performance, and Extra Randomness in Tree-Based Models for GaN HEMT Modeling in Comparison with Artificial Neural Networks. Nazarbayev University School of Engineering and Digital Sciences

Creative Commons license

Except where otherwise noted, this item's license is described as Attribution-NonCommercial 3.0 United States