Neural Netw ; 137: 85-96, 2021 May.
Article in English | MEDLINE | ID: mdl-33571864

ABSTRACT

We examine the closedness of sets of realized neural networks of a fixed architecture in Sobolev spaces. For an exactly m-times differentiable activation function ρ, we construct a sequence of neural networks [Formula: see text] whose realizations converge in order-(m-1) Sobolev norm to a function that cannot be realized exactly by a neural network. Thus, sets of realized neural networks are not closed in the order-(m-1) Sobolev spaces W^{m-1,p} for p ∈ [1, ∞). We further show that these sets are not closed in W^{m,p} under slightly stronger conditions on the m-th derivative of ρ. For a real analytic activation function, we show that sets of realized neural networks are not closed in W^{k,p} for any k ∈ N. The nonclosedness allows for approximation of non-network target functions with unbounded parameter growth. We partially characterize the rate of parameter growth for most activation functions by showing that a specific sequence of realized neural networks can approximate the activation function's derivative with weights increasing inversely proportionally to the L^p approximation error. Finally, we present experimental results showing that networks are capable of closely approximating non-network target functions with increasing parameters via training.
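The weight-growth phenomenon described above can be illustrated numerically with a standard difference-quotient construction (a hedged sketch under our own assumptions, not necessarily the paper's exact sequence): the two-neuron network N_h(x) = (ρ(x + h) − ρ(x))/h, whose output weights are ±1/h, approximates ρ' as h → 0, so the approximation error shrinks while the weight magnitude 1/h blows up.

```python
import numpy as np

# Hedged sketch, not the paper's exact construction: N_h(x) below is
# realizable as a two-neuron network with output weights +1/h and -1/h.
# As h -> 0 it converges to rho', and the weight magnitude 1/h grows
# as the empirical L^2 approximation error shrinks.
rho = np.tanh                        # smooth activation; rho'(x) = 1 - tanh(x)^2
xs = np.linspace(-2.0, 2.0, 2001)    # grid for an empirical (discrete) L^2 error
target = 1.0 - np.tanh(xs) ** 2      # rho', the non-network target here

errs = []
for h in (1e-1, 1e-2, 1e-3):
    net = (rho(xs + h) - rho(xs)) / h                 # weights scale like 1/h
    err = float(np.sqrt(np.mean((net - target) ** 2)))  # discrete L^2 error
    errs.append(err)
    print(f"h={h:g}  output-weight magnitude={1/h:g}  L2 error={err:.2e}")
```

Running the loop shows the error decreasing roughly linearly in h while the output weights grow like 1/h, matching the inverse-proportionality the abstract describes for this kind of sequence.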


Subjects
Neural Networks, Computer