Finite-Time Lyapunov Exponents of Deep Neural Networks.
Storm, L; Linander, H; Bec, J; Gustavsson, K; Mehlig, B.
Affiliation
  • Storm L; Department of Physics, University of Gothenburg, 41296 Gothenburg, Sweden.
  • Linander H; Department of Mathematical Sciences, Chalmers University of Technology and University of Gothenburg, Gothenburg, Sweden.
  • Bec J; MINES Paris, PSL Research University, CNRS, Cemef, Valbonne, F-06900, France.
  • Gustavsson K; Université Côte d'Azur, Inria, CNRS, Cemef, Valbonne, F-06900, France.
  • Mehlig B; Department of Physics, University of Gothenburg, 41296 Gothenburg, Sweden.
Phys Rev Lett; 132(5): 057301, 2024 Feb 02.
Article in En | MEDLINE | ID: mdl-38364126
ABSTRACT
We compute how small input perturbations affect the output of deep neural networks, exploring an analogy between deep feed-forward networks and dynamical systems, where the growth or decay of local perturbations is characterized by finite-time Lyapunov exponents. We show that the maximal exponent forms geometrical structures in input space, akin to coherent structures in dynamical systems. Ridges of large positive exponents divide input space into different regions that the network associates with different classes. These ridges visualize the geometry that deep networks construct in input space, shedding light on the fundamental mechanisms underlying their learning capabilities.
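The abstract's central quantity can be illustrated with a minimal sketch: for a feed-forward network, the growth of a small input perturbation over the layers is governed by the input-output Jacobian, and the maximal finite-time Lyapunov exponent is the layer-averaged log of its largest singular value. The toy network below (random tanh layers, with depth, width, and weight scale chosen for illustration, not taken from the paper) is an assumption made for the example, not the authors' setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy feed-forward network x_{l+1} = tanh(W_l x_l + b_l).
# Depth L, width n, and weight variance are illustrative choices.
L, n = 10, 20
weights = [rng.normal(0.0, 1.0 / np.sqrt(n), (n, n)) for _ in range(L)]
biases = [rng.normal(0.0, 0.1, n) for _ in range(L)]

def max_ftle(x0):
    """Maximal finite-time Lyapunov exponent at input x0.

    Propagates x0 through the layers while accumulating the
    input-output Jacobian J = D_L W_L ... D_1 W_1, where D_l is the
    diagonal matrix of tanh derivatives at layer l. The exponent is
    (1/L) * log of the largest singular value of J.
    """
    x, J = x0, np.eye(n)
    for W, b in zip(weights, biases):
        pre = W @ x + b
        x = np.tanh(pre)
        D = np.diag(1.0 - np.tanh(pre) ** 2)  # derivative of tanh
        J = D @ W @ J
    sigma_max = np.linalg.svd(J, compute_uv=False)[0]
    return np.log(sigma_max) / L

print(max_ftle(rng.normal(0.0, 1.0, n)))
```

A positive value means the perturbation grows through the network at that input; scanning `max_ftle` over a grid of inputs is one way to visualize the ridge structures the abstract describes.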

Full text: 1 Database: MEDLINE Language: En Year of publication: 2024 Document type: Article