Number of records: 1
Translation-Invariant Kernels for Multivariable Approximation
- 1.0532708 - ÚI 2022 RIV US eng J - Journal article
Kůrková, Věra - Coufal, David
Translation-Invariant Kernels for Multivariable Approximation.
IEEE Transactions on Neural Networks and Learning Systems. Vol. 32, No. 11 (2021), pp. 5072-5081. ISSN 2162-237X. E-ISSN 2162-2388
CEP grant: GA ČR(CZ) GA18-23827S
Institutional support: RVO:67985807
Keywords: Classification * Fourier and Hankel transforms * function approximation * radial kernels * translation-invariant kernels
OECD field: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Impact factor: 14.255, year: 2021; AIS: 3, year: 2021
Method of publishing: Restricted access
Result website:
http://dx.doi.org/10.1109/TNNLS.2020.3026720
DOI: https://doi.org/10.1109/TNNLS.2020.3026720
The suitability of shallow (one-hidden-layer) networks with translation-invariant kernel units for function approximation and classification tasks is investigated. It is shown that a critical property influencing the capabilities of kernel networks is how the Fourier transforms of kernels converge to zero. The Fourier transforms of kernels suitable for multivariable approximation can have negative values but must be almost everywhere nonzero. In contrast, the Fourier transforms of kernels suitable for maximal margin classification must be everywhere nonnegative but can have large sets where they are equal to zero (e.g., they can be compactly supported). The behavior of the Fourier transforms of multivariable kernels is analyzed using the Hankel transform. The general results are illustrated by examples of both univariable and multivariable kernels (such as Gaussian, Laplace, rectangle, sinc, and cut power kernels).
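To make the dichotomy in the abstract concrete, the following minimal sketch (not code from the article) evaluates closed-form one-dimensional Fourier transforms of three of the kernels mentioned above (Gaussian, rectangle, and sinc) and checks whether each transform takes negative values and how much of a frequency grid it approximately vanishes on. The convention k_hat(w) = integral of k(x)·exp(-i·w·x) dx, the kernel normalizations, the frequency window, and the tolerance are illustrative choices, not taken from the article.

    # A minimal sketch (not from the article): closed-form 1-D Fourier transforms
    # of three translation-invariant kernels, using the convention
    #   k_hat(w) = integral k(x) * exp(-i*w*x) dx.
    import numpy as np

    w = np.linspace(-7.0, 7.0, 70001)   # frequency grid (window chosen for illustration)

    # Gaussian kernel k(x) = exp(-x^2/2):  k_hat(w) = sqrt(2*pi) * exp(-w^2/2)
    gauss_hat = np.sqrt(2.0 * np.pi) * np.exp(-w**2 / 2.0)

    # Rectangle kernel k(x) = 1 on [-1/2, 1/2]:  k_hat(w) = 2*sin(w/2)/w = sinc(w/(2*pi))
    rect_hat = np.sinc(w / (2.0 * np.pi))        # np.sinc(t) = sin(pi*t)/(pi*t)

    # Sinc kernel k(x) = sin(x)/x:  k_hat(w) = pi on [-1, 1], zero elsewhere
    sinc_hat = np.pi * (np.abs(w) <= 1.0).astype(float)

    def summarize(name, k_hat, tol=1e-12):
        has_negative = bool((k_hat < -tol).any())              # does the transform change sign?
        zero_fraction = float(np.mean(np.abs(k_hat) <= tol))   # share of grid where it vanishes
        print(f"{name:9s}  negative values: {has_negative!s:5s}  "
              f"fraction of grid ~0: {zero_fraction:.3f}")

    summarize("Gaussian", gauss_hat)    # strictly positive on the grid
    summarize("rectangle", rect_hat)    # sign changes, only isolated zeros
    summarize("sinc", sinc_hat)         # nonnegative, but zero on most of the grid

By the criteria stated in the abstract, the Gaussian transform is strictly positive (suitable for both tasks), the rectangle kernel's transform changes sign but vanishes only at isolated points (approximation, but not maximal margin classification), and the sinc kernel's transform is nonnegative yet zero outside a compact set (classification, but not approximation of general multivariable functions).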
Permanent link: http://hdl.handle.net/11104/0311119