ARTICLE
Applied Sciences, Vol. 11, Issue 1 (2021)

ReLU Network with Bounded Width Is a Universal Approximator in View of an Approximate Identity

Sunghwan Moon    

Abstract

Deep neural networks have shown very successful performance in a wide range of tasks, but the theory of why they work so well is still in its early stages. Recently, the expressive power of neural networks, which is important for understanding deep learning, has received considerable attention. Classic results by Cybenko, Barron, and others state that a network with a single hidden layer and a suitable activation function is a universal approximator. A few years ago, researchers began to study how width affects the expressiveness of neural networks, i.e., a universal approximation theorem for deep neural networks with the Rectified Linear Unit (ReLU) activation function and bounded width. Here, we show how any continuous function on a compact subset of $\mathbb{R}^{n_{in}}$, $n_{in} \in \mathbb{N}$, can be approximated by a ReLU network whose hidden layers have at most $n_{in} + 5$ nodes, in view of an approximate identity.
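To make the width bound concrete, the following minimal sketch (in PyTorch, an assumption on my part; the article itself contains no code) trains a deep ReLU network whose hidden layers all have width $n_{in} + 5 = 6$ for a one-dimensional input to approximate a continuous function on the compact set [0, 1]. The target function, depth, and gradient-based training are illustrative choices only; the theorem is an existence result and says nothing about how such a network is found.

# Illustrative sketch only: the paper proves the existence of a bounded-width
# approximator; this script merely trains a network of that width.
# Target function, depth, and optimizer settings are assumptions.
import math

import torch
import torch.nn as nn

torch.manual_seed(0)

n_in = 1                  # input dimension
width = n_in + 5          # hidden-layer width bound from the theorem
depth = 8                 # number of hidden layers (arbitrary choice)

layers = [nn.Linear(n_in, width), nn.ReLU()]
for _ in range(depth - 1):
    layers += [nn.Linear(width, width), nn.ReLU()]
layers.append(nn.Linear(width, 1))
net = nn.Sequential(*layers)

# Compact domain K = [0, 1]; continuous target f(x) = sin(2*pi*x).
x = torch.linspace(0.0, 1.0, 512).unsqueeze(1)
y = torch.sin(2 * math.pi * x)

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(5000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(x), y)
    loss.backward()
    opt.step()

# Report the sup-norm error over the training grid.
print(f"max |f - net| on grid: {(net(x) - y).abs().max().item():.4f}")

Trained this way, the narrow network typically reaches a small uniform error on the grid; the theorem guarantees that, for any tolerance, weights achieving it exist.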

 Similar articles

       
 
Saurabh Agarwal and Ki-Hyun Jung    
Digital images are very popular and commonly used for hiding crucial data. In some instances, image steganography is misused to communicate illicit data. In this paper, a robust deep neural network is proposed for the identification of content-...
Journal: Applied Sciences

 
Sultan Ahmed Almalki, Ahmed Abdel-Rahim and Frederick T. Sheldon    
The adoption of cooperative intelligent transportation systems (cITSs) improves road safety and traffic efficiency. Vehicles connected to a cITS form vehicular ad hoc networks (VANETs) to exchange messages. Like other networks and systems, cITSs are targete...
Journal: Algorithms

 
Nalinda Kulathunga, Nishath Rajiv Ranasinghe, Daniel Vrinceanu, Zackary Kinsman, Lei Huang and Yunjiao Wang    
The nonlinearity of the activation functions used in deep learning models is crucial for the success of predictive models. Several simple nonlinear functions, including the Rectified Linear Unit (ReLU) and Leaky-ReLU (L-ReLU), are commonly used in neural networks...
Journal: Algorithms

 
Manuel Domínguez-Rodrigo, Ander Fernández-Jaúregui, Gabriel Cifuentes-Alcobendas and Enrique Baquedano    
Deep learning models are based on a combination of neural network architectures, optimization parameters, and activation functions. Together these yield exponentially many combinations whose computational fitness is difficult to pinpoint. The intricate resemblanc...
Journal: Applied Sciences

 
Lilian Asimwe Leonidas and Yang Jie    
In recent years, deep learning has been used in various applications, including the classification of ship targets in inland waterways to enhance intelligent transport systems. Various researchers have introduced different classification algorithms, but the...
Journal: Information