The Discriminative Kalman Filter for Bayesian Filtering with Nonlinear and Nongaussian Observation Models.
Burkhart, Michael C; Brandman, David M; Franco, Brian; Hochberg, Leigh R; Harrison, Matthew T.
Affiliations
  • Burkhart MC; Division of Applied Mathematics, Brown University, Providence, RI 02912, U.S.A. michael_burkhart@alumni.brown.edu.
  • Brandman DM; Department of Neuroscience, Brown University, Providence, RI 02912, U.S.A., and Department of Surgery (Neurosurgery), Dalhousie University, Halifax, NS, B3H 4R2, Canada. david_brandman@alumni.brown.edu.
  • Franco B; Center for Neurotechnology and Neurorecovery, Neurology, Massachusetts General Hospital, Boston, MA 02114, U.S.A. brfranco34@gmail.com.
  • Hochberg LR; Center for Neurotechnology and Neurorecovery, Neurology, Massachusetts General Hospital, Boston, MA 02114, U.S.A.; School of Engineering and Carney Institute for Brain Science, Brown University, Providence, RI 02912, U.S.A.; Neurology, Harvard Medical School, Boston, MA 02115, U.S.A.; and VA RR&
  • Harrison MT; Division of Applied Mathematics, Brown University, Providence, RI 02912, U.S.A. matthew_harrison@brown.edu.
Neural Comput; 32(5): 969-1017, May 2020.
Article in En | MEDLINE | ID: mdl-32187000
ABSTRACT
The Kalman filter provides a simple and efficient algorithm to compute the posterior distribution for state-space models where both the latent state and measurement models are linear and gaussian. Extensions to the Kalman filter, including the extended and unscented Kalman filters, incorporate linearizations for models where the observation model p(observation|state) is nonlinear. We argue that in many cases, a model for p(state|observation) proves both easier to learn and more accurate for latent state estimation. Approximating p(state|observation) as gaussian leads to a new filtering algorithm, the discriminative Kalman filter (DKF), which can perform well even when p(observation|state) is highly nonlinear and/or nongaussian. The approximation, motivated by the Bernstein-von Mises theorem, improves as the dimensionality of the observations increases. The DKF has computational complexity similar to the Kalman filter, allowing it in some cases to perform much faster than particle filters with similar precision, while better accounting for nonlinear and nongaussian observation models than Kalman-based extensions. When the observation model must be learned from training data prior to filtering, off-the-shelf nonlinear and nonparametric regression techniques can provide a gaussian model for p(observation|state) that cleanly integrates with the DKF. As part of the BrainGate2 clinical trial, we successfully implemented gaussian process regression with the DKF framework in a brain-computer interface to provide real-time, closed-loop cursor control to a person with a complete spinal cord injury. In this letter, we explore the theory underlying the DKF, exhibit some illustrative examples, and outline potential extensions.
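To make the idea concrete, the following is a minimal sketch (Python/NumPy) of one filtering step under the gaussian approximation p(state|observation) ≈ N(f(x), Q(x)) described in the abstract. The names dkf_step, f, Q, A, Gamma, and S are illustrative conventions chosen for this sketch rather than identifiers from the letter; the learned functions f and Q would in practice come from an off-the-shelf regression method such as gaussian process regression.

# Hedged sketch, not the authors' reference implementation: one predict/update
# step of a discriminative-Kalman-filter-style recursion.
import numpy as np

def dkf_step(mu_prev, Sigma_prev, x_t, A, Gamma, S, f, Q):
    # mu_prev, Sigma_prev : posterior mean/covariance for the previous state
    # x_t                 : current observation vector
    # A, Gamma            : assumed linear-gaussian dynamics z_t = A z_{t-1} + noise, Cov = Gamma
    # S                   : assumed stationary state covariance, p(z) ~ N(0, S)
    # f(x), Q(x)          : learned mean/covariance of p(state | observation),
    #                       e.g. from gaussian process regression

    # Predict step: identical to the standard Kalman filter.
    m = A @ mu_prev
    P = A @ Sigma_prev @ A.T + Gamma

    # Update step: fuse the prediction with the gaussian approximation
    # p(z_t | x_t) ~ N(f(x_t), Q(x_t)), correcting for the marginal p(z_t) ~ N(0, S).
    P_inv = np.linalg.inv(P)
    Q_inv = np.linalg.inv(Q(x_t))
    S_inv = np.linalg.inv(S)

    # Note: P_inv + Q_inv - S_inv can fail to be positive definite for some
    # observations; a practical implementation would detect and handle that
    # case (the letter discusses this issue; the remedy here is left out).
    Sigma_t = np.linalg.inv(P_inv + Q_inv - S_inv)
    mu_t = Sigma_t @ (P_inv @ m + Q_inv @ f(x_t))
    return mu_t, Sigma_t

As in the standard Kalman filter, each step costs only a few small matrix operations in the state dimension, which is why the recursion can run much faster than a particle filter of comparable accuracy.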
Subjects

Full text: 1 Collections: 01-international Database: MEDLINE Main subject: Algorithms / Bayes Theorem / Nonlinear Dynamics / Brain-Computer Interfaces Study type: Clinical_trials / Prognostic_studies Limits: Humans Language: En Publication year: 2020 Document type: Article