Results 1 - 2 of 2
1.
Environ Sci Technol; 50(5): 2487-97, 2016 Mar 01.
Article in English | MEDLINE | ID: mdl-26807713

ABSTRACT

This paper addresses the need for surveillance of fugitive methane emissions over broad geographical regions. Most existing techniques are either extensive but qualitative, or quantitative but labor-intensive and poorly scalable. Two novel advances are made here. First, a recursive Bayesian method is presented for probabilistically characterizing fugitive point sources from mobile sensor data. This approach is made possible by a new cross-plume integrated dispersion formulation that largely removes the need for time-averaged concentration data. The method is tested against a limited data set from controlled methane releases and is shown to perform well. Second, an information-theoretic approach is presented for planning the path of the sensor-equipped vehicle: the path is chosen to maximize the expected reduction in integrated source-rate uncertainty across the region, subject to given start and end positions and prevailing meteorological conditions. The information-driven sensor path planning algorithm is tested and shown to provide robust results across a wide range of conditions. An overall system concept is presented in which these techniques can optionally piggyback on routine industry maintenance operations using sensor-equipped work trucks.
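The paper's cross-plume integrated formulation is not reproduced in the abstract, but the recursive Bayesian idea it builds on can be illustrated with a toy sketch: maintain a discretized posterior over a point-source emission rate and update it after each mobile measurement. The Gaussian-plume forward model, the sensor positions, and the noise level below are all illustrative assumptions, not the paper's method.

```python
import numpy as np

def plume_concentration(q, x, y, u=2.0, sigma_y=5.0, sigma_z=3.0, h=1.0):
    """Ground-level Gaussian plume concentration at downwind x, crosswind y
    for source rate q (illustrative forward model, not the paper's)."""
    if x <= 0:
        return 0.0
    return (q / (np.pi * u * sigma_y * sigma_z)
            * np.exp(-y ** 2 / (2 * sigma_y ** 2))
            * np.exp(-h ** 2 / (2 * sigma_z ** 2)))

def bayes_update(prior, q_grid, z, x, y, noise_sd=0.002):
    """One recursive step: posterior over source rate, proportional to
    Gaussian measurement likelihood times the running prior."""
    predicted = np.array([plume_concentration(q, x, y) for q in q_grid])
    likelihood = np.exp(-(z - predicted) ** 2 / (2 * noise_sd ** 2))
    posterior = likelihood * prior
    return posterior / posterior.sum()

q_grid = np.linspace(0.0, 10.0, 201)          # candidate emission rates (g/s)
belief = np.ones_like(q_grid) / len(q_grid)   # flat prior

true_q = 4.0
rng = np.random.default_rng(0)
for x, y in [(20.0, 0.0), (30.0, 5.0), (40.0, -3.0), (50.0, 2.0)]:
    z = plume_concentration(true_q, x, y) + rng.normal(0.0, 0.002)
    belief = bayes_update(belief, q_grid, z, x, y)

estimate = q_grid[np.argmax(belief)]          # posterior-mode source rate
```

The same posterior also supports the path-planning step described in the abstract: a candidate route can be scored by the expected reduction in the posterior's uncertainty given the measurements it would collect.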


Subject(s)
Environmental Monitoring/methods; Methane/analysis; Oil and Gas Industry/methods; Remote Sensing Technology/methods; Air Pollutants/analysis; Bayes Theorem; Models, Theoretical; Motor Vehicles; North Carolina; Oil and Gas Fields
2.
Int J Neural Syst; 28(2): 1750015, 2018 Mar.
Article in English | MEDLINE | ID: mdl-28270025

ABSTRACT

Recent developments in neural stimulation and recording technologies are giving scientists the ability to record and control the activity of individual neurons in vitro or in vivo with very high spatial and temporal resolution. Tools such as optogenetics are having a significant impact on neuroscience by delivering optical firing control with the precision and spatiotemporal resolution required for investigating information processing and plasticity in biological brains. While a number of training algorithms have been developed to date for spiking neural network (SNN) models of biological neuronal circuits, existing methods rely on learning rules that adjust the synaptic strengths (weights) directly in order to obtain the desired network-level (or functional-level) performance. As such, they are not applicable to modifying plasticity in biological neuronal circuits, in which synaptic strengths change only as a result of pre- and postsynaptic neuron firings or biological mechanisms beyond our control. This paper presents a weight-free training algorithm that relies solely on adjusting the spatiotemporal delivery of neuron firings to optimize network performance. The proposed weight-free algorithm requires no knowledge of the SNN model or its plasticity mechanisms. As a result, this training approach is potentially realizable in vitro or in vivo via neural stimulation and recording technologies, such as optogenetics and multielectrode arrays, and could be used to control plasticity at multiple scales of biological neuronal circuits. The approach is demonstrated by training SNNs with hundreds of units to control a virtual insect navigating an unknown environment.
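The core premise — that weights can be steered without ever being set directly, purely by choosing *when* stimulated neurons fire — can be sketched with a minimal toy. In the sketch below, a pair-based STDP rule stands in for the biological plasticity mechanism we cannot touch; the only control knob is the relative timing of the pre- and post-synaptic stimulation pulses. All constants (STDP amplitudes, time constant, pairing count) are illustrative, not from the paper.

```python
import numpy as np

def stdp_dw(dt, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Pair-based STDP weight change for spike-time difference
    dt = t_post - t_pre (ms). This rule models the biology: it is
    observed, not directly adjustable, by the experimenter."""
    if dt > 0:
        return a_plus * np.exp(-dt / tau)    # pre before post -> potentiation
    return -a_minus * np.exp(dt / tau)       # post before pre -> depression

def run_protocol(dt, pairings=50, w0=0.5, w_min=0.0, w_max=5.0):
    """Deliver `pairings` stimulation pairs at a chosen relative timing.
    The synaptic weight moves only through the STDP rule above."""
    w = w0
    for _ in range(pairings):
        w = min(w_max, max(w_min, w + stdp_dw(dt)))
    return w

w_potentiated = run_protocol(+10.0)   # stimulate pre 10 ms before post
w_depressed = run_protocol(-10.0)     # stimulate post 10 ms before pre
```

A weight-free training loop in this spirit would search over such stimulation timings (rather than over weights) to drive the network toward the desired functional behavior.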


Subject(s)
Action Potentials/physiology; Algorithms; Learning/physiology; Models, Neurological; Nerve Net/physiology; Neurons/physiology; Animals; Computer Simulation; Insects; Locomotion/physiology; Neuronal Plasticity/physiology; User-Computer Interface