Neural Netw ; 141: 404-419, 2021 Sep.
Article in English | MEDLINE | ID: mdl-34146968

ABSTRACT

We study the approximation of two-layer compositions f(x)=g(ϕ(x)) via deep networks with ReLU activation, where ϕ is a geometrically intuitive, dimensionality reducing feature map. We focus on two intuitive and practically relevant choices for ϕ: the projection onto a low-dimensional embedded submanifold and a distance to a collection of low-dimensional sets. We achieve near optimal approximation rates, which depend only on the complexity of the dimensionality reducing map ϕ rather than the ambient dimension. Since ϕ encapsulates all nonlinear features that are material to the function f, this suggests that deep nets are faithful to an intrinsic dimension governed by f rather than the complexity of the domain of f. In particular, the prevalent assumption of approximating functions on low-dimensional manifolds can be significantly relaxed using functions of type f(x)=g(ϕ(x)) with ϕ representing an orthogonal projection onto the same manifold.
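The composition structure described above can be illustrated with a minimal sketch (an illustrative construction, not code from the paper): φ is taken to be the orthogonal projection onto a randomly chosen d-dimensional subspace of R^D, g is an arbitrary outer function, and f(x)=g(φ(x)) is then invariant to the components of x orthogonal to that subspace, which is why only the intrinsic dimension d should matter for approximation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ambient dimension D and intrinsic dimension d (illustrative choices).
D, d = 10, 2

# Orthogonal projection P onto a random d-dimensional subspace of R^D.
A = rng.standard_normal((D, d))
Q, _ = np.linalg.qr(A)   # Q: D x d with orthonormal columns
P = Q @ Q.T              # projection matrix: P = P @ P = P.T

def phi(x):
    """Dimensionality-reducing feature map: orthogonal projection."""
    return P @ x

def g(z):
    """An arbitrary outer function applied to the projected point."""
    return np.sin(z).sum()

def f(x):
    """Two-layer composition f(x) = g(phi(x))."""
    return g(phi(x))

# f ignores any component of x orthogonal to the range of P:
x = rng.standard_normal(D)
v = rng.standard_normal(D)
v_perp = v - P @ v       # component of v orthogonal to range(P)
assert np.isclose(f(x), f(x + v_perp))
```

Since P is idempotent, P(x + v_perp) = Px, so f effectively depends on only d coordinates even though its input lives in R^D; this is the sense in which the approximation rates can depend on φ rather than the ambient dimension.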


Subject(s)
Neural Networks, Computer