Data-driven learning of chaotic dynamical systems using Discrete-Temporal Sobolev Networks.
Kennedy, Connor; Crowdis, Trace; Hu, Haoran; Vaidyanathan, Sankaran; Zhang, Hong-Kun.
Affiliation
  • Kennedy C; Department of Mathematics & Statistics, University of Massachusetts, Amherst, MA 01003, USA. Electronic address: conmkennedy@umass.edu.
  • Crowdis T; Department of Mathematics & Statistics, University of Massachusetts, Amherst, MA 01003, USA. Electronic address: tcrowdis@umass.edu.
  • Hu H; Department of Mathematics & Statistics, University of Massachusetts, Amherst, MA 01003, USA. Electronic address: haoranhu@umass.edu.
  • Vaidyanathan S; Department of Mathematics & Statistics, University of Massachusetts, Amherst, MA 01003, USA. Electronic address: sankaranv@cs.umass.edu.
  • Zhang HK; Department of Mathematics & Statistics, University of Massachusetts, Amherst, MA 01003, USA. Electronic address: hongkunz@umass.edu.
Neural Netw; 173: 106152, 2024 May.
Article in En | MEDLINE | ID: mdl-38359640
ABSTRACT
We introduce the Discrete-Temporal Sobolev Network (DTSN), a neural network loss function that assists dynamical system forecasting by minimizing variational differences between the network output and the training data via a temporal Sobolev norm. This approach is entirely data-driven, architecture-agnostic, and does not require derivative information from the estimated system. The DTSN is particularly well suited to chaotic dynamical systems, as it minimizes noise in the network output, which is crucial for such sensitive systems. For our test cases, we consider discrete approximations of the Lorenz-63 system and the Chua circuit. For the network architectures, we use the Long Short-Term Memory (LSTM) and the Transformer. The performance of the DTSN is compared with the standard MSE loss for both architectures, as well as with the Physics-Informed Neural Network (PINN) loss for the LSTM. The DTSN loss is shown to substantially improve accuracy for both architectures, while requiring less information than the PINN and without noticeably increasing computational time, thereby demonstrating its potential to improve neural network forecasting of dynamical systems.
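The temporal Sobolev idea described in the abstract can be illustrated with a minimal sketch: assuming a first-order discrete Sobolev term, the loss below combines an ordinary MSE on predicted trajectory values with an MSE on their finite differences in time. The function name dtsn_loss, the tensor shapes, and the weight lambda_sobolev are illustrative assumptions, not details taken from the paper.

```python
import torch

def dtsn_loss(pred, target, lambda_sobolev=1.0):
    """Illustrative discrete-temporal Sobolev-style loss (sketch).

    pred, target: tensors of shape (batch, time, state_dim).
    Combines a standard MSE term with an MSE on first-order
    finite differences along the time axis, so the predicted
    trajectory is penalized for mismatching both the values and
    the local temporal variation of the training data.
    """
    # Ordinary value-matching term (standard MSE loss).
    mse = torch.mean((pred - target) ** 2)

    # First-order finite differences along the time dimension,
    # a discrete stand-in for time derivatives.
    dpred = pred[:, 1:, :] - pred[:, :-1, :]
    dtarget = target[:, 1:, :] - target[:, :-1, :]
    sobolev = torch.mean((dpred - dtarget) ** 2)

    return mse + lambda_sobolev * sobolev
```

In a training loop, such a loss would be used in place of torch.nn.MSELoss; the finite-difference term penalizes high-frequency noise in the predicted trajectory, which is what makes a Sobolev-type loss attractive for sensitive chaotic systems.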
Subjects
Keywords

Full text: 1 Collections: 01-international Database: MEDLINE Main subject: Algorithms / Neural Networks, Computer Language: En Journal: Neural Netw Journal subject: NEUROLOGY Year of publication: 2024 Document type: Article