APPROXIMATION ERROR OF FOURIER NEURAL NETWORKS


Date

2021-03-23

Authors

Zhumekenov, Abylay
Takhanov, Rustem
Castro, Alejandro J.
Assylbekov, Zhenisbek

Journal Title

Journal ISSN

Volume Title

Publisher

John Wiley and Sons Inc

Abstract

The paper investigates the approximation error of two-layer feedforward Fourier Neural Networks (FNNs). Such networks are motivated by the approximation properties of Fourier series. Several implementations of FNNs have been proposed since the 1980s: by Gallant and White, Silvescu, Tan, Zuo and Cai, and Liu. The main focus of our work is Silvescu's FNN, because its activation function does not fit into the category of networks in which an activation is applied to a linearly transformed input; networks of the latter type were extensively described by Hornik. For the non-trivial case of Silvescu's FNN, the convergence rate is proven to be of order O(1/n). The paper further investigates the classes of functions approximated by Silvescu's FNN, which turn out to lie in the Schwartz space and the space of positive definite functions.
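As a minimal illustrative sketch (not the paper's implementation), a Silvescu-style FNN can be read as a two-layer network whose hidden units compute a product of cosines of per-coordinate affine maps, rather than an activation of a single linear transform; the function names and parameter shapes below are assumptions for illustration:

```python
import numpy as np

def silvescu_forward(x, W, B, c):
    """Forward pass of a sketched Silvescu-style FNN.

    x : (d,)   input vector
    W : (n, d) per-coordinate frequencies for each of n hidden units
    B : (n, d) per-coordinate phase shifts
    c : (n,)   output-layer weights

    Each hidden unit k computes prod_j cos(W[k, j] * x[j] + B[k, j]),
    so the activation is applied coordinate-wise and then multiplied,
    not applied to a single linear combination of the input.
    """
    h = np.prod(np.cos(W * x + B), axis=1)  # (n,) hidden activations
    return c @ h                            # scalar network output
```

With all frequencies and phases set to zero, every hidden unit outputs cos(0)^d = 1, so the network reduces to the sum of the output weights, which makes the product-of-cosines structure easy to sanity-check.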

Description

Keywords

Type of access: Open Access

approximation error, convergence, Fourier, neural networks

Citation

Zhumekenov, A., Takhanov, R., Castro, A. J., & Assylbekov, Z. (2021). Approximation error of Fourier neural networks. Statistical Analysis and Data Mining: The ASA Data Science Journal, 14(3), 258–270. https://doi.org/10.1002/sam.11506

Collections