Deep Neural Networks with ReLU-Sine-Exponential Activations Break Curse of Dimensionality in Approximation on Hölder Class
Year: 2023
Type: article
Abstract: In this paper, we construct neural networks with ReLU, sine, and $2^x$ as activation functions. For a general continuous function $f$ defined on $[0,1]^d$ with continuity modulus $\omega_f(\cdot)$, we construct ReLU-sine-$2^x$ networks that enjoy an approximation…
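The abstract names three activation functions used side by side. As a minimal sketch of what a hidden block mixing these activations might look like, assuming a plain NumPy setup: the function `relu_sine_exp_block`, the even three-way split of units, and all sizes and weights below are hypothetical illustrations, not the paper's actual construction.

```python
import numpy as np

def relu_sine_exp_block(x, W, b):
    """Apply an affine map, then split the hidden units evenly across
    the three activations named in the abstract: ReLU, sine, and 2^z."""
    z = W @ x + b
    k = len(z) // 3
    return np.concatenate([
        np.maximum(z[:k], 0.0),   # ReLU units
        np.sin(z[k:2 * k]),       # sine units
        np.exp2(z[2 * k:]),       # 2^z units (np.exp2 computes 2**z)
    ])

# Hypothetical usage on a point in [0, 1]^d with d = 4 and 6 hidden units.
rng = np.random.default_rng(0)
d, width = 4, 6
W = rng.standard_normal((width, d))
b = rng.standard_normal(width)
x = rng.uniform(0.0, 1.0, size=d)
print(relu_sine_exp_block(x, W, b))
```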
Institution: Wuhan University
Cites: 29
Cited by: 1
Related to: 10
FWCI: 0.285
Citation percentile (by year/subfield): 49.75
Subfield: Artificial Intelligence
Field: Computer Science
Domain: Physical Sciences
Open Access status: green
Funders: National Natural Science Foundation of China
Grant IDs: 12125103, 12071362, 11871385, 11871474