Results 1 - 3 of 3
1.
Sci Rep ; 9(1): 3744, 2019 03 06.
Article in English | MEDLINE | ID: mdl-30842458

ABSTRACT

Depth from defocus is an important mechanism that enables vision systems to perceive depth. While machine vision has developed several algorithms to estimate depth from the amount of defocus present at the focal plane, existing techniques are slow, energy demanding and mainly rely on numerous acquisitions and massive amounts of filtering operations on the pixels' absolute luminance values. Recent advances in neuromorphic engineering offer an alternative, using event-based silicon retinas and neural processing devices inspired by the organizing principles of the brain. In this paper, we present a low-power, compact and computationally inexpensive setup that estimates depth in a 3D scene in real time at high rates and that can be directly implemented with massively parallel, compact, low-latency and low-power neuromorphic engineering devices. Exploiting the high temporal resolution of the event-based silicon retina, we are able to extract depth at 100 Hz for a power budget lower than 200 mW (10 mW for the camera, 90 mW for the liquid lens and ~100 mW for the computation). We validate the model with experimental results, highlighting features that are consistent with both computational neuroscience and recent findings in retinal physiology. We demonstrate its efficiency with a prototype of a neuromorphic hardware system and provide testable predictions on the role of spike-based representations and temporal dynamics in biological depth-from-defocus experiments reported in the literature.


Subjects
Depth Perception/physiology , Image Processing, Computer-Assisted/methods , Vision, Ocular/physiology , Action Potentials/physiology , Algorithms , Brain/physiology , Computers , Models, Neurological , Neural Networks, Computer , Neurons/physiology , Retina/physiology
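The paper above pairs an event-based sensor with a liquid lens, and depth is recovered from when, during the focal sweep, a scene point drives the strongest event activity. The following is a minimal sketch of that idea, assuming a linear focal sweep and a plain (x, y, t) event stream; the function name, parameters and binning strategy are illustrative, not the authors' implementation.

```python
# Hedged sketch: depth from the timing of events during a periodic focal sweep.
# Assumptions (not from the paper): events arrive as integer pixel coordinates
# with timestamps, the liquid lens sweeps the focal plane linearly from z_near
# to z_far every `period` seconds, and an edge fires most densely when in focus.
import numpy as np

def depth_from_event_timing(events, period, z_near, z_far, shape, n_bins=64):
    """Estimate per-pixel depth from the phase of peak event activity.

    events : structured array with integer fields 'x', 'y' and float field 't' (s)
    period : duration of one focal sweep of the liquid lens (s)
    z_near, z_far : focal distances at the start / end of the sweep (m)
    shape  : (height, width) of the sensor
    """
    phase = (events['t'] % period) / period              # position in the sweep, 0..1
    bins = np.minimum((phase * n_bins).astype(int), n_bins - 1)

    # Histogram of event counts per pixel and per phase bin.
    hist = np.zeros(shape + (n_bins,), dtype=np.int32)
    np.add.at(hist, (events['y'], events['x'], bins), 1)

    # The bin with the most events approximates the instant the pixel was in focus.
    peak_bin = hist.argmax(axis=-1)
    in_focus_phase = (peak_bin + 0.5) / n_bins
    depth = z_near + in_focus_phase * (z_far - z_near)   # linear sweep assumed

    # Mask pixels that never fired.
    depth[hist.sum(axis=-1) == 0] = np.nan
    return depth
```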
2.
IEEE Trans Biomed Circuits Syst ; 12(6): 1467-1474, 2018 12.
Article in English | MEDLINE | ID: mdl-30334806

ABSTRACT

Johnson-Nyquist noise is the electronic noise generated by the thermal agitation of charge carriers, and it increases as the sensor heats up. High-speed cameras used in low-light conditions are therefore often cooled to reduce thermal noise and increase their signal-to-noise ratio. These sensors, however, record hundreds of frames per second, which takes time and requires energy and heavy computing power because of the substantial data load. Event-based sensors benefit from a high temporal resolution and record information sparsely. Starting from an asynchronous time-based image sensor, we developed a version of this event-based camera whose pixels are designed for low-light applications, and we added a Peltier-effect cooling system at the back of the sensor to reduce thermal noise. We show the benefits of thermal noise reduction and study the improvement in signal-to-noise ratio for the estimation of event-based normal flow (norm and angle) and for particle tracking in microscopy.


Subjects
Algorithms , Cold Temperature , Image Processing, Computer-Assisted/instrumentation , Image Processing, Computer-Assisted/methods , Signal-To-Noise Ratio , Equipment Design , Microscopy
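As background to the abstract above, the magnitude of Johnson-Nyquist noise follows v_rms = sqrt(4 k_B T R Δf), so cooling the sensor lowers the noise floor roughly as the square root of absolute temperature. The small sketch below only evaluates that textbook formula; the resistance, bandwidth and temperatures are placeholder values, not figures from the paper.

```python
# Hedged illustration of why cooling helps: Johnson-Nyquist (thermal) noise scales
# with the square root of absolute temperature, v_rms = sqrt(4 * k_B * T * R * df).
# The resistance, bandwidth and temperatures below are assumed, not from the paper.
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def thermal_noise_vrms(temperature_k, resistance_ohm, bandwidth_hz):
    """RMS Johnson-Nyquist noise voltage across a resistance."""
    return math.sqrt(4.0 * K_B * temperature_k * resistance_ohm * bandwidth_hz)

if __name__ == "__main__":
    r, df = 1e6, 1e4           # 1 MOhm source resistance, 10 kHz bandwidth (assumed)
    for t in (300.0, 260.0):   # uncooled vs. Peltier-cooled sensor (illustrative)
        print(f"T = {t:.0f} K -> v_rms = {thermal_noise_vrms(t, r, df) * 1e6:.2f} uV")
```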
3.
Opt Express ; 25(11): 12611-12621, 2017 May 29.
Article in English | MEDLINE | ID: mdl-28786616

ABSTRACT

This article introduces a method to extract the speed and density of microparticles in real time at several kHz using an asynchronous event-based camera mounted on a full-field optical coherence tomography (FF-OCT) setup. These cameras are composed of an array of autonomously operating pixels that detect significant amplitude changes, allowing scene-driven acquisition: an event is triggered, with 1 µs temporal precision, whenever the illuminance change at a pixel is significant. The event-driven FF-OCT algorithm relies on a time-based optical flow computation to operate directly on incoming events, updating the estimates of velocity, direction and density while reducing both computation and data load. We show that for fast-moving microparticles in the range of 0.4 - 6.5 mm/s, the method performs faster and more efficiently in real time than existing techniques. The target application of this work is to evaluate erythrocyte dynamics at the microvascular level in vivo with high temporal resolution.
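The "time-based optical flow computation" mentioned above is, in the event-based literature, commonly a local plane fit to the surface of most recent event timestamps, whose spatial gradient encodes the inverse velocity. Below is a minimal sketch of that standard technique under an assumed data layout and assumed thresholds; it is not the authors' event-driven FF-OCT implementation.

```python
# Hedged sketch of time-based (plane-fitting) optical flow on events, in the spirit
# of the event-driven approach described above; data layout, neighbourhood size and
# thresholds are assumptions, not taken from the paper.
import numpy as np

def local_flow(timestamps, x0, y0, radius=3):
    """Estimate the flow at pixel (x0, y0) from a map of latest event timestamps.

    timestamps : 2D array where timestamps[y, x] holds the time (s) of the most
                 recent event at that pixel, or np.nan if none has occurred.
    (x0, y0) is assumed to lie at least `radius` pixels from the image border.
    Returns (vx, vy) in pixels per second, or None if the fit is degenerate.
    """
    ys, xs = np.mgrid[y0 - radius:y0 + radius + 1, x0 - radius:x0 + radius + 1]
    ts = timestamps[ys, xs]
    valid = ~np.isnan(ts)
    if valid.sum() < 5:
        return None

    # Fit the plane t = a*x + b*y + c to the local surface of active events.
    A = np.column_stack([xs[valid], ys[valid], np.ones(valid.sum())])
    (a, b, c), *_ = np.linalg.lstsq(A, ts[valid], rcond=None)

    g2 = a * a + b * b          # squared spatial gradient of the timestamp surface
    if g2 < 1e-12:
        return None
    # Velocity points along the gradient, with magnitude 1 / |gradient| (px/s).
    return a / g2, b / g2
```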
